Training time and consultant practice

J. David Greaves

Newcastle upon Tyne, UK. E-mail: david.greaves@ncl.ac.uk

When I was an Anaesthetic Senior House Officer in the early 1970s, a respected consultant colleague remarked that, as a trainee, I could expect to be as good as the number of cases I had done. If he was even partly right, we need to be concerned about the paper from Underwood and McIndoe in this issue, which records the falling caseload of trainees in a typical School of Anaesthesia in the United Kingdom since 1996.1 The authors discuss the reasons for this reduction, and indeed there are several, though the greatest are the recent changes in doctors' hours and conditions of service. These, and their impact on training, have recently been discussed by Spargo.2 Concerns are not exclusive to the UK, and recent changes in working times were greeted with mixed feelings by those responsible for residency programmes in the USA.3

That trainees in the UK now anaesthetize fewer cases will be no surprise to those consultants who are responsible for training schemes. Logbook data, presented by trainees at their annual review, have shown a downward trend in caseload for over a decade, and many with experience of reviewing trainees will feel that the reductions reported by Underwood and McIndoe1 are smaller than those they have been seeing in their own trainees.

Does it matter? This is a complex question. The Royal College of Anaesthetists has set few precise figures for experience in any phase of training.4 When the formal CCST for anaesthesia, and its associated structured training, was being planned in 1994, informal inquiry to Regional Advisors revealed that trainees anaesthetized 700 or more cases annually. With this wealth of experience there was little incentive to set ‘minimum experience’ standards in terms of cases. It was argued that doing so would invite hospitals to restrict training to those levels and, similarly, that there was no need for formal ‘blocks’ of training that might constrain anaesthetic departments and interfere with the trainees' contribution to service. As caseload has apparently halved, it may be necessary for the Royal College of Anaesthetists to reconsider its position. Indeed, in paediatric anaesthesia, one of the few areas where an indicative caseload is suggested, it is almost inconceivable that a trainee could reach the proposed target of 500 cases in the course of a 6-month attachment.5

How many cases are needed for competence? The relationship between numbers and competence is complex, and there are many reasons (e.g. their aptitude, the teaching they receive and the amount of experience they gain) why trainees progress at different rates. The evidence is patchy but suggests that, to progress well in developing clinical practice, it is important that learners are properly instructed from the outset,6 have a number of similar repeated experiences during the early stages of training7 and undertake a reasonable number of cases.8 A number of investigations of the acquisition of skills in anaesthesia6 9–11 and in other practical specialties12 13 have demonstrated a biphasic curve, with about 80% of competence being achieved over about 30 cases, but with continued improvement in performance over 100 or more cases. Although intuition tells us that complex procedures should be more difficult to learn, this is not borne out by the data, presumably because how well one learns depends on what one already knows. In this connection, it is interesting to note the prolonged learning curves and high morbidities encountered by experienced surgeons learning procedures that were significantly different from their existing skill base, such as laparoscopic abdominal procedures14 and cataract extraction by phaco-emulsification.15 The practical importance of these observations is that, with annual caseloads in anaesthesia falling towards 300, a training block of 3 months will bring experience of about 80 cases. Subspecialty areas will have even smaller caseloads. It is immediately clear that the level of experience of most trainees is less than the threshold at which expert performance can be expected and close to the minimum required for competent practice. This includes crucial areas such as general anaesthesia for Caesarean section, endobronchial intubation and emergency laparotomy for ruptured aortic aneurysm—capabilities that have been assumed to be in the armamentarium of all consultant anaesthetists.
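A rough check of that arithmetic, assuming the approximate figures quoted above (about 300 cases per year and a 3-month block), runs as follows:

\[
300\ \text{cases year}^{-1} \times \tfrac{3}{12}\ \text{year} \approx 75\ \text{cases}
\]

which is close to the 80 cases mentioned: well above the roughly 30 cases over which the cited studies find about 80% of competence is achieved, but well short of the 100 or more cases over which performance continues to improve.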

Where does this lead us? In a competency-based system of curriculum and assessment it is presumed that trainees will progress at different rates. Outcome measures should determine when competence has been achieved. A decade of progress has left us far short of this ideal. Suitable outcome measures and assessments have been difficult to design. Systems that assess trainees by observing and scoring their performance during real work (or in simulation), however complex or simple, still depend upon the expert opinion of a trained observer. Where these have been implemented, departments of anaesthesia are struggling to cope with the plethora of assessments required and with the need to train the assessors. It is probable that, in the past, trainees did so many cases that by the time they took up a consultant post they could almost be relied upon to cope with its clinical demands. Here then is the nub of the matter: with a reduced caseload, trainees may have deficiencies in their competences and, because of the same lack of cases, those deficiencies may not become evident during training as poor outcomes.

What must be our response to this situation? There are three approaches, which may be summarized as, ‘Teach them better’, ‘Test them harder’ and ‘Look after them when they are new consultants’. All three are needed.

First, some attention must be paid to the way that trainees are taught and accumulate their experience. It is common sense to say that if training time is at a premium, then the best possible use must be made of it. The important elements in practical learning are well recognized. Learners must engage in relevant work, they must be supervised, they must be well taught in the basics at the outset of a phase of practice, they must be coached as their level of proficiency increases and they must have access to regular feedback on their progress. Feedback should include both the constructive criticism of their supervisors and an analysis of their outcomes. The Royal College of Anaesthetists could set minimum numbers for key procedures. This would result in chaos for hospital departments unless the introduction of targets was accompanied by a change to formal blocks of experience in all the targeted areas. It would be unacceptable to have a free-for-all in which trainees vie with one another to log the more uncommon types of case. If care is taken to place trainees primarily for the benefit of their relevant clinical experience, there will be significant consequences for the delivery of routine patient services. Should trainees become supernumerary, so that anaesthetic services can be run without their contribution? It is my belief that this would further damage their training. British doctors have always learned in situations that give them responsibility for real work, and both this clinical responsibility and the real duty of delivering care are an important part of the context of their development. Trainees should continue to treat patients and deliver service, with appropriate supervision and monitored outcomes. We must be cautious of using educational research conducted with undergraduates to plan developments in specialist education, because the two have crucial differences. Undergraduates do not have these real responsibilities for care, and one of the preoccupations of their teachers is manufacturing or simulating patient encounters. Postgraduates have the real thing. Much of the published work on learning clinical skills deals with undergraduate learners, and those who work with them may not adequately recognize the role of context in shaping postgraduate learning.

Secondly, attention must be paid to the ways in which we assure competency. The possibility that new consultants are inexperienced, and therefore less competent than formerly, inevitably leads to demands to tighten up assessment. Workplace assessment in the course of real work is difficult. Checking the outcomes of competences becomes more and more difficult as their complexity increases, and many of the situations with which we deal are staggeringly complex. The simple elements of practice may be observed and assessed by instruments such as directly observed procedural skills (DOPS)16 and the mini-clinical evaluation exercise (mini-CEX)17 that are currently being introduced into the routine assessment of more junior doctors. Such approaches are useful, and we must adopt them as a means of monitoring progress during the acquisition of straightforward competences. What of the more complex aspects of practice? At present, at senior levels of training, useful workplace assessments are not widely available, and the best way to assure competence is to ask the opinion of experienced trainers who have supervised the learner through sufficient relevant cases. One purpose of assessment is to check that the learner has learned what the curriculum intended. If trainees are, for whatever reason, denied the opportunity to refine and develop their skills, they will fail their assessments and this will lead to a prolongation of their training. A system of training, supervision and clinical work must be fit for its purpose, and it is fundamentally unfair to attempt to redress its deficiencies by instituting a more stringent assessment system. Hard examinations do promote hard work in candidates, but here we have a situation in which what they need is clinical experience, and that is what they cannot get.

Finally, we must consider the role of the newly appointed consultant. Traditionally, UK consultants work independently and have clinical autonomy. Only in exceptional circumstances do consultants supervise one another, and further learning is generally self-directed and unsupervised. In the halcyon days of the 1970s, many Senior Registrars were grey-haired middle-aged men. They were worldly wise and had a huge amount of experience. They were ready, or overdue, for the responsibilities of a consultant post and could cope with the majority of what it could throw at them. The new consultants of today are younger, and as a bonus have a better work–life balance! However, they are both less experienced and less confident. If my consultant trainer's remark in 1971 was correct, there will be situations in which some current new consultants do not measure up to the clinical task, simply because of this inexperience. Department managers must recognize this and, within the constraints of a unitary consultant grade, make provision to help their new colleagues as they gain the additional experience that they missed as trainees. After 3 or 4 years they will be excellent consultants with all the necessary experience. What can be done to help them? It is beyond the scope of this editorial to discuss the many ways in which new consultants can be supported clinically and pastorally, but supported they must be. It is probable that if new consultants begin to run into clinical problems because of their relative inexperience, demands to appoint them to a junior or subordinate grade may ensue. Whilst there may be legitimate reasons to go down that road, the profession should not be forced into it as a consequence of its failure to deal with a predictable and manageable change in the experience of applicants for consultant posts.

Until recently, the anaesthetic trainee has been subject to a surfeit of experience. If he or she was judged to fall short of the mark for competence, then lack of clinical experience was unlikely to be the cause. This situation has now changed and new approaches to training are needed.

  1. Training authorities should consider stipulating minimum levels of training experience in terms of either time spent in subspecialties or total caseload.
  2. Hospitals that train anaesthetists must allocate a high priority to ensuring that each individual undertakes placements that enable his or her needs for experience to be met.
  3. The best use must be made of available clinical experience by ensuring that trainees are well taught and supervised.
  4. The critical inexperience of new consultants must be recognized and departments must make special provisions for their support.
  5. Substantive consultant posts for new consultants must be designed to recognize the reality that they are less experienced and less confident than their predecessors, and this must be reflected in the amount of work that is demanded of them in their first few years.

References

1 Underwood SM, McIndoe AK. Influence of changing work patterns on training in anaesthesia: an analysis of activity in a UK teaching hospital from 1996 to 2004. Br J Anaesth 2005; 95: 616–21

2 Spargo PM. UK anaesthetic training and the law of unintended consequences. Cause for concern? Anaesthesia 2005; 60: 319–22

3 Barone JE, Ivy ME. Resident work hours: the five stages of grief. Acad Med 2004; 79: 379–80

4 CCST in Anaesthesia I: General Principles. A Manual for Trainees and Trainers, 2nd edn. London: Royal College of Anaesthetists, 2003

5 CCST in Anaesthesia IV: Competency Based Specialist Registrar Years 3, 4 and 5 Training and Assessment: A Manual for Trainees and Trainers, 1st edn. London: Royal College of Anaesthetists, 2003

6 Kestin IG. A statistical approach to measuring the competence of anaesthetic trainees at practical procedures. Br J Anaesth 1995; 75: 805–9

7 Titley OG, Bracka A. A 5-year audit of trainees' experience and outcomes with two-stage hypospadias surgery. Br J Plast Surg 1998; 51: 370–5

8 Marshall JB. Technical proficiency of trainees performing colonoscopy: a learning curve. Gastrointest Endosc 1995; 42: 287–91

9 Kopacz DJ, Neal JM, Pollock JE. The regional anesthesia ‘learning curve’. What is the minimum number of epidural and spinal blocks to reach consistency? Reg Anesth 1996; 21: 182–90

10 Harrison MJ. Tracking the early acquisition of skills by trainees. Anaesthesia 2001; 56: 995–8

11 Grau T, Bartusseck E, Conradi R, Martin E, Motsch J. Ultrasound imaging improves learning curves in obstetric epidural anesthesia: a preliminary study. Can J Anaesth 2003; 50: 1047–50

12 Sutton DN, Wayman J, Griffin SM. Learning curve for oesophageal cancer surgery. Br J Surg 1998; 85: 1399–1402

13 Davies BW, Campbell WB. Inguinal hernia repair: see one, do one, teach one? Ann R Coll Surg Engl 1995; 77 (Suppl): 299–301

14 Fullarton GM, Bell G. Prospective audit of the introduction of laparoscopic cholecystectomy in the west of Scotland. West of Scotland Laparoscopic Cholecystectomy Audit Group. Gut 1994; 35: 1121–26

15 Martin KR, Burton RL. The phacoemulsification learning curve: per-operative complications in the first 3000 cases of an experienced surgeon. Eye 2000; 14: 190–5

16 Wilkinson J, Benjamin A, Wade W. Assessing the performance of doctors in training. Br Med J 2003; 32: S91–2

17 Norcini JJ, Blank LL, Duffy FD, Fortna GS. The mini-CEX: a method for assessing clinical skills. Ann Intern Med 2003; 138: 476–81