Developing people developing drugs
pharmafile | August 9, 2004 | Feature
What distinguishes pharma research and development from other high-tech industries? It is an oddity of pharma R&D that, despite being very high-tech, it is extremely labour-intensive.
This has gradually dawned on me over 30 years, but only in the last decade have the implications become clear. Maybe I'm a bit slow and you are all ahead of me, but just in case we are running neck and neck, it's worth analysing why this is the case.
Over that time I have seen a shift in the balance of labour in drug development. In 1974 I was involved in the launch of a new antihypertensive and was given a well-argued story describing the mechanism of action.
Two things unfolded fairly quickly: we found that the pivotal clinical trials were flawed and had not detected a key adverse event, and we were later told that the mechanism-of-action story was wrong and the drug was not unique at all.
Tackling the second problem first, it should be pointed out that in those days drug candidates were generated on a largely serendipitous basis (I know, I used to do it in my lab days!). This meant that drugs with interesting effects usually emerged without a clear idea of how they worked. This has changed dramatically – nowadays drugs are designed with an understanding of the molecular processes involved.
I'm not suggesting that the labour involved in drug discovery has dipped, but I do think it's more efficiently used. Admittedly, for most of my career I have been more involved in development than discovery, and therein lies the problem. Five years after this particular drug launch (ie, 25 years ago) there I was setting up and managing clinical trials, without any quality control or assurance. There has since been a huge increase in the work required to meet the regulators' demands for quality, traceable data.
The interesting thing is, the nearer a drug gets to market, the more we delegate control to external bodies (especially investigator sites), and the higher the costs and the more work we have to do. This is unusual among technology-based industries.
The development of pharma structure and culture
I hope my reminiscences don't bore you too much, but when I started in clinical research there were two kinds of people involved: those with medical degrees, who were the managers, and those with other qualifications, who did all the work. Think of it in terms of 'officers' and 'other ranks' – this is probably a bit disparaging (as well as over-simplistic), but anyone without a medical degree could forget about career development, unless they moved away from clinical research (and a lot of them did) into marketing.
Over the years burgeoning regulatory demands have driven the huge growth in the number of people managing clinical trials, which has meant improved organisation is crucial to maintain control. Hence the emergence 15 to 20 years ago of something resembling a career structure for non-medical people. It was during that period that I became interested in applying project management skills to clinical research, but it has sometimes been an uphill struggle.
In most pharma companies line management still dominates and project managers lack empowerment. This is the cultural challenge these companies face. They actively recruit strong scientific and medical personalities as line managers, often for kudos reasons, and put them in powerful positions. These people were accustomed to being dominant in medicine or academia, and they usually don't change when they become line managers.
Project team options for drug development
This dominance is actively encouraged by top management, because these individuals are highly influential in their dealings with key opinion leaders.
The situation is compounded by budgetary matters. How many project managers actually have authority to spend their budgets? Most of the time they still have to get payments signed off by senior line managers, causing delay to projects. Indeed, financial responsibility is probably the easiest way to define overall authority. If someone hasn't the authority to spend their budget, what authority do they have?
Line managers will not let go of budgetary control without a fight, so project managers get blamed when things go wrong but little credit when things go well. Senior line managers then resume their around-the-world business-class flights with a clear conscience.
One approach to this problem is to define the purpose of budgets a bit more clearly. There is nothing wrong with line managers retaining control of non-project budgets, but they should not have life-and-death control over project budgets. This raises the question of what line managers actually do, and several big pharma companies have been facing up to it in recent years.
A model which is gaining more acceptance is to create a pool of expertise, managed by line managers, from which individuals are selected to serve on project teams. The role of the line manager is to ensure high-quality, highly motivated people are available. So the budget gets spent on training, current awareness, and so on. Salaries will probably be included too, and these could be based on the demand for the line manager's people by projects.
Project and line management interfaces
Could a line manager and a project manager be the same person? There is no reason why not, and this happens in many companies. The problem is that the boundaries are not defined clearly enough.
A few years ago, for example, I was chief executive of an admittedly small company, but for part of the time I was also a team member on a project, subordinate to the project leader.
This can be very difficult for people to deal with, but one action we took was to abolish the permanent title of project manager. Leadership of a project only lasts as long as the project, which is temporary – it is something that people have to win, and if their project is successful they usually secure the next leadership position.
Of course I am describing matrix management, but this is not a solution to a problem, it is just a name for it. The term is much abused because many people don't understand what a team is. Recently, I consulted for a company which had monthly 'team meetings'.
I quickly realised that the group which met was not a team at all – it was a department; I think (correct me if I'm wrong) that a team consists of people with shared goals. The meeting agenda consisted of people telling the head of department what they had been doing for the last month. All their projects were different – the only common purpose was the sharing of resources. You will probably have worked out that the real purpose of the meeting had nothing to do with the 'team'; it was purely so the head of department could find out what was going on, with the result that 90% of the time was wasted as people listened to irrelevant matters.
There is nothing wrong with regular departmental meetings, but the agenda should cover line management issues, not project issues; such a meeting would be an ideal forum for personal development actions – for example, building what are now called 'soft' skills.
What are 'soft' skills?
To grasp what is meant by 'soft' skills it is worth defining 'hard' skills. Over the last 15 years of training people in project management, I have identified a general perception that it is all about systems.
A common scenario is to send someone on a project software course and then call them a project manager. I think this is what people mean by 'hard' skills – the nuts and bolts of constructing project plans. They get sent on other courses as well, to develop interpersonal, negotiation, and leadership skills – these tend to be labelled 'soft' skills.
They are extremely important in drug development, because of the need to deal with external people such as CROs and investigators, but I really don't see them as a separate discipline from project planning. This is because a project plan is just a set of agreements – various people have agreed to deliver various things at specific times and to defined costs. So the people skills cannot be divorced from the planning skills. Yet as we have seen, training tends to focus on one or the other.
Training – what kind, how much and when?
Any guesses what budget gets cut first when times are hard? No prizes if you said training. The reason is two-fold – not just the cost of the training, but the lost working time as well. In reality, when the business environment is tough, people need to be even better equipped to deal with it – they need more training not less. But this typifies what several industry observers are reporting now.
Louise Oram, now an information modelling consultant after a long career in medical learning and development at Glaxo, has identified a lack of systems thinking and long-term career development. She believes more strategic planning is needed – indeed, in some companies, not necessarily healthcare ones, we are seeing the appointment of board-level directors with the title of chief learning officer. It is time to move away from the gap-filling approach to training (ie, from reactive to proactive). Learning and development strategies must align with both the vertical (corporate) and the horizontal (business) objectives if they are to have a systemic impact on the organisation.
Consider, for example, that a new project is coming up, so people need to be trained up in the therapeutic area. This is encouraged by good clinical practice, which requires us to document that people are competent. We need this tactical approach, but we also need a vision of where the organisation wants to be in, say, five years' time.
Most training budgets are probably set by guesswork, often as a standard percentage of something or other, rather than being linked to requirements. Very little training happens as the result of a formal needs analysis – it happens because a course becomes available, or there's a short-term knowledge void to fill (eg, after a failed audit).
Neil Sharpe, training manager at AstraZeneca Charnwood, points out how infrequently an effective training needs analysis is carried out; often it is not done at all, and even when it is, it may be inaccurate or incomplete. Time spent on this will save a lot of time and money when it comes to running the course, because you will identify the right training for the right people.
Managing the 'outer team' and best practice
What about those CROs, investigators, and other outside people? I call them the 'outer team' because they spend a smaller proportion of their time on the project than the 'inner team' people do. Even the CRO people, who will assure you of 100% commitment, have other clients – don't forget that! Getting what you want from the outer team can be difficult, and the key is to increase their sense of involvement.
Take study sites, for example. Five years ago I tried out our home-grown electronic data capture and project management system, when many of the pundits were saying that investigators would resist the idea. There were no data queries at all, but even more importantly the study site people felt far closer to the inner team because they were in constant contact. This is a marriage of hard skills – the technology of project planning and control – with the soft skills of getting people to do what you want.
It grieves me to say this but I don't think we find best practice in drug development. You could say that the risky nature of drug development moves the goalposts, but try saying that to the oil and gas industry – their geophysical risks are at least as worrying as our drug safety ones.
They address this by making projects dominant and project managers powerful. In drug development, we are happy to empower project managers when problems arise, but not to prevent them happening in the first place. We will only get the best out of our people when we release the shackles.