Assessment of specialist registrars in rheumatology: experience of an objective structured clinical examination (OSCE)

A. B. Hassell on behalf of the members of the West Midlands Rheumatology Services and Training Committee

Staffordshire Rheumatology Centre and Keele University, Haywood Hospital, High Lane, Burslem, Stoke-on-Trent, Staffordshire ST6 7AG, UK

Abstract

Objectives. The assessment of higher medical trainees (specialist registrars) in rheumatology is an important challenge facing the rheumatology community, particularly with the implementation of the changes recommended by the Calman Report in the UK. So far there has been remarkably little work in this area. Our aim was to implement and evaluate an objective structured clinical examination (OSCE) for rheumatology specialist registrars (SpRs).

Methods. Twelve SpRs completed a 12-station OSCE designed to assess core rheumatological clinical skills. The OSCE was designed and manned by consultant members of the West Midlands Rheumatology Services and Training Committee. The OSCE was evaluated by the SpRs, the participating consultant supervisors and the patients, by means of questionnaires.

Results. We present the details of the OSCE stations and the scores for each station. In terms of evaluation, 11 out of 12 SpRs felt that it was a very worthwhile exercise. Participating patients found it interesting, if tiring. All would be happy to participate in such an examination again. All participating consultants found it interesting and useful in terms of establishing the level of competence among trainees.

Conclusion. The OSCE represents one practical approach to assessing clinical skills in rheumatology SpRs. It has potential in both formative and summative assessment. The broader issues around the assessment of rheumatology trainees are discussed.

KEY WORDS: Assessment, Rheumatology trainees, Training.

The Calman Report and the introduction of the training grade of specialist registrar (SpR) have brought major changes to the higher specialist training of doctors in the UK [1]. The period of training is shorter, educational goals must be explicit and agreed with the trainee in an educational contract, and trainees must meet annual assessment requirements before proceeding to the next stage of their training. In rheumatology, as in many other medical specialities, there is relatively little guidance on, or experience of, the nature of these annual assessments and the criteria by which trainees should be judged. Trainees now complete logbooks, scrutiny of which constitutes a significant component of their assessment [2], yet there is little guidance on the basis on which the logbook should be scored. Each regional training committee holds annual assessment meetings for its trainees, but the structure and nature of these assessments are not specified. In practice, the assessment can end up dwelling on aspects of process: the number of clinics and ward rounds attended, courses attended, protected study time and so on. While these are clearly relevant to a training post, there is no evidence that they reliably assess the attainment of educational goals, competence, attitudinal development or performance.

Objective structured clinical examinations (OSCEs) have been used as a means of assessing clinical skills since 1975 [3]. They have been used most extensively to assess undergraduates and seem well suited to that level of assessment. Their use in higher medical training, where more refined skills may need to be assessed, is more challenging; nevertheless, OSCEs have been used to assess general practitioners, surgeons and paediatricians [4–6]. The OSCE generally consists of a circuit of stations. At each station the candidate is asked to perform a specific task, such as taking a history from a patient or performing a regional examination. Each station is manned by an examiner, who scores the candidate on a pro forma specific to that station. After a fixed period of time (typically 5–15 min), a bell rings and the candidate moves on to the next station of the circuit [7].

The West Midlands Rheumatology Services and Training Committee (WMRSTC), in common with all regional training committees, is charged with overseeing the training of SpRs within the region on behalf of the Regional Postgraduate Dean. These registrars are all doctors who have passed the MRCP examinations or equivalent and are in a 4- or 5-yr rheumatology training post culminating in the Certificate of Completion of Specialist Training (CCST) in Rheumatology, which makes the holder eligible to apply for consultant rheumatologist posts in the UK. The WMRSTC has held annual assessment meetings since the inception of the rheumatology SpR training grade in 1997. However, it was felt that these meetings could not accurately assess the competence and quality of trainees. As an initial effort to assess clinical competence, one crucial aspect of rheumatology training, we conducted an OSCE for SpRs in rheumatology. We report the results of this exercise.

The overall aim of the exercise was to assess the practicability of using an OSCE to assess the clinical skills of rheumatology SpRs. The specific aim of the OSCE was to establish whether candidates were clinically competent in some of the essential skills of a rheumatology specialist.

Methods

Candidates and setting
Twelve SpRs, representing all but two of those available in the West Midlands, were assessed at the rheumatology out-patient department of the Staffordshire Rheumatology Centre. There were 13 stations, including one rest station, each in a different consultation room. Each station lasted 10 min.

The OSCE stations
All consultant rheumatologists who agreed to act as examiners were circulated with a paper describing the OSCE format [7]. Examiners met 1 month before the examination to discuss and decide on the stations. Stations were designed to assess a range of clinical skills relevant to a practising rheumatologist, with reference to the core curriculum produced by the Joint Committee on Higher Medical Training [8]. These skills included history-taking, physical examination, problem-solving and the interpretation of radiographs, synovial fluid microscopy and clinical slides.

Once the stations had been agreed, an individual examiner took responsibility for producing the scoring pro forma for each station, in conjunction with the examination coordinator (ABH) to ensure consistency in the marking approach. The coordinator arranged for appropriately consented patients to attend on the assessment morning.

OSCE evaluation
The assessment was evaluated in terms of the marks for individual stations (the total possible score at each station was 10), giving some indication of each station's difficulty and the candidates' abilities.

The SpR candidates evaluated the OSCE by means of a questionnaire, which was completed immediately after the OSCE, before they had received any feedback. The questionnaire consisted of four statements with which the trainee registered their agreement/disagreement on a Likert scale. They were also asked to score each station and to make any overall comments. Feedback from the patients was also by questionnaire, which was sent by post 1 month after the OSCE. This questionnaire consisted of eight statements. Patients were asked to tick any statement with which they agreed. They were then asked to score, on a Likert scale, their willingness to take part in similar examinations in future.

Feedback from examiners on the day was informal, at the examiners' meeting immediately after the assessment. One month later, the annual regional assessment meeting for rheumatology SpRs, conducted by four delegated members of the WMRSTC, took place. At this meeting the committee members (three of whom had examined in the OSCE) were asked to score the usefulness of the information arising from the OSCE.

Cost
We felt it important to estimate the costs of the exercise. Consultant time was costed at £150 per 3-h session (representing the cost to the NHS and employing trusts) and SpR time at £90 per session (representing the cost to the Regional Department of Postgraduate Medicine and employing trusts). An estimate of the cost of clinic space was made on the basis of previous charges for out-patient appointments to fund-holding general practices.
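
As a rough sketch of how such an estimate is assembled, the calculation below uses the session rates quoted above but entirely hypothetical session counts and clinic-space charge; it is illustrative only and is not a reconstruction of the figures in Table 5.

```python
# Illustrative sketch of the cost estimate. Only the session rates are taken
# from the Methods; the session counts and clinic-space charge are hypothetical.

CONSULTANT_SESSION_RATE = 150  # pounds per 3-h consultant session
SPR_SESSION_RATE = 90          # pounds per SpR session

def estimate_osce_cost(consultant_sessions, spr_sessions, clinic_space_charge):
    """Sum staff time and clinic-space costs for the assessment."""
    return (consultant_sessions * CONSULTANT_SESSION_RATE
            + spr_sessions * SPR_SESSION_RATE
            + clinic_space_charge)

# Hypothetical example: 13 examiners attending a planning meeting and the
# assessment morning, 12 candidates for one session each, and a nominal
# clinic-space charge. These figures are illustrative, not those in Table 5.
print(estimate_osce_cost(consultant_sessions=13 * 2,
                         spr_sessions=12,
                         clinic_space_charge=500))
```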

Results

The OSCE stations
Table 1 summarizes the stations. Four of the twelve stations did not involve a patient: one entailed the interpretation of radiographs on a viewing box; one was a synovial fluid microscopy station (four slides showing urate crystals, pyrophosphate crystals, cholesterol crystals and Gram-positive cocci); one was a viva station in which the examiner gave a clinical scenario on which the candidate was questioned; and one showed projected slides of clinical conditions. For one station the patient was a ‘mock’ patient: a musculoskeletal physiotherapist played the part of a young woman working in the potteries who had developed a regional pain syndrome involving the right upper limb. The role taken by the physiotherapist had been worked out closely in conjunction with the rheumatologist examining that station.


TABLE 1. Summary of the individual stations in the OSCE

 
For each station there was an objective checklist against which the examiner scored each candidate. Table 2 shows an example of a station scoring pro forma. For each station a final mark was awarded to the candidate on a scale of 0–10.


TABLE 2. Example of a scoring pro forma
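
A minimal sketch of how such a checklist might map to a final station mark is given below; the checklist items and weights are invented for illustration and are not those of the actual pro forma shown in Table 2.

```python
# Hypothetical checklist-based station pro forma. The items and weights are
# illustrative only; the point is the mapping from an objective checklist to
# a final mark on a 0-10 scale.

from dataclasses import dataclass

@dataclass
class ChecklistItem:
    description: str
    marks: int  # marks available for this item

STATION_CHECKLIST = [
    ChecklistItem("Introduces self and explains the task to the patient", 1),
    ChecklistItem("Inspects the relevant joints before palpation", 2),
    ChecklistItem("Assesses active and passive range of movement", 3),
    ChecklistItem("Identifies the key abnormal findings", 3),
    ChecklistItem("Summarizes the findings coherently to the examiner", 1),
]

def station_mark(items_achieved):
    """Convert the checklist items achieved into a final mark out of 10."""
    total = sum(item.marks for item in STATION_CHECKLIST)
    achieved = sum(STATION_CHECKLIST[i].marks for i in items_achieved)
    return round(10 * achieved / total)

# A candidate achieving items 0, 2 and 3 scores 7 out of 10.
print(station_mark([0, 2, 3]))
```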

 

Candidates' scores
Table 3 summarizes the candidates' scores for each station and their overall scores. Trainees generally scored well on the stations assessing core clinical skills: the assessment of patients with ankylosing spondylitis, hypermobility syndrome, shoulder impingement syndrome and scleroderma. Scores at the active joint count station were lower than might be expected (median 6, range 4–7), but this reflected a lack of familiarity with the joint scores used in research studies rather than inadequate joint examination technique. Candidates struggled more with the synovial fluid microscopy station, the clinical slides and the X-ray station. The one clinical assessment with which most candidates struggled was the assessment of the feet in a patient with rheumatoid arthritis.


TABLE 3. Candidates' scores for each OSCE station (all candidates were scored between 1 and 10 for each station)

 
One candidate's overall performance was felt to cause concern. Appropriate action was taken with respect to review of this candidate's duties, level of supervision and formal training.

Evaluation of the OSCE by participating trainees
All 12 participating SpRs completed feedback questionnaires. Eleven agreed (including four who strongly agreed) with the statement that ‘overall the OSCE was a very worthwhile exercise’. One disagreed with this statement.

While two agreed that ‘the OSCE was too anxiety provoking’, nine disagreed.

All 12 disagreed (three strongly) with the statement ‘it is unnecessary to attempt to assess the clinical skills of specialist registrars in rheumatology’.

Nine disagreed (two strongly) with the statement ‘Clinical assessment is useful for rheumatology SpRs, but there are better methods’. Two agreed with this statement.

In the space left for free comments, three candidates volunteered that the assessment was ‘well organised with a good balance of cases and serving to highlight areas requiring attention’. Two commented that they would have liked feedback during the assessment. One commented that it was ‘unclear why the assessment was done—for formative or summative purposes, to assess the trainee or their training?’ Finally, one trainee felt that such assessment could adequately be performed within the setting of each clinical post by the supervising consultant.

Evaluation of the OSCE by participating patients
Nine out of ten patients contacted returned completed feedback forms. Table 4 summarizes the patient feedback, which was generally very positive, all patients feeling that this was an interesting and worthwhile exercise. Seven found it ‘a little tiring’ and three ‘uncomfortable for my joints’.


TABLE 4. Patient responses to OSCE on feedback questionnaire (patients were asked to tick any statements with which they agreed; n=9)

 
All nine agreed (four strongly) that they would ‘be happy to participate in such an assessment again’.

Examiner feedback
Informal feedback by participating consultants immediately after the OSCE was extremely positive. All found the exercise worthwhile and interesting, if tiring.

Questionnaire feedback from the four consultants responsible for the SpR annual assessment 1 month later was also positive. All four felt that the OSCE contributed ‘a lot’ of useful information about the trainee, which would otherwise have been unavailable to them for the annual assessment. Areas identified as particularly valuable were information on regional examination skills and the identification of under-achievers. All four felt that an OSCE should form part of the formative and summative assessment of SpRs in the future.

Cost of the OSCE
Table 5 shows the estimated costs of the OSCE, calculated as outlined in the Methods section. The total estimated cost was £6580. This does not include any cost in time for patients or their carers.


TABLE 5. Estimated costs of the OSCE

 

Discussion

We have described the implementation of an OSCE for SpRs in rheumatology, designed to assess core clinical skills and to provide useful information for the annual regional assessment. At previous regional assessment meetings, consultants had expressed concern that we were not really assessing the quality of SpRs but rather the quality of training posts. Traditionally in rheumatology, as in other specialities, trainees have not been formally assessed; instead, their supervising consultants have formed impressions of the trainee during the period of attachment with that consultant. The success of this approach is hard to assess objectively. Moreover, with the changes in higher medical training, reliance on this as the sole method of assessment becomes harder to sustain: the combination of a shorter period of training and the restrictions on junior doctors' hours means that there may be less opportunity to assess a trainee informally in clinical practice. Furthermore, the advent of clinical governance places greater responsibility on postgraduate deans and training committees to ensure the competence of the trainee.

As an initial step in developing a framework for assessing rheumatology trainees, we have implemented an OSCE. The advantages of the OSCE lie in its objectivity and in the variety of clinical skills that can be assessed [8]. Moreover, such an assessment serves to emphasize the importance of the core clinical skills of the rheumatologist and provides a useful reference point for trainees and trainers alike. The disadvantage is that higher-level skills, involving more subtle aspects of patient communication, diagnosis, problem-solving and management, are more difficult to examine in this format. However, we found that there was scope to conduct a ‘mini viva’ at certain stations, an approach that worked quite successfully. Although we have not formally assessed the reliability of this OSCE, the OSCE as an examination has been validated fairly exhaustively in other fields [8]. The participating SpRs and consultants certainly had no doubts about its face and content validity. Our overall impression is that this assessment was a sound means of judging core clinical skills. It must be acknowledged that it assesses competence rather than actual performance in everyday clinical practice; however, this seems a reasonable compromise in the setting of trainee assessment.

If an OSCE were to be used regularly as part of the process of assessing rheumatology SpRs, a number of issues would require exploration. One is the frequency of such assessments. Those participating in the West Midlands felt it would be appropriate to include such an assessment every 2 yr, so that an individual trainee would be assessed twice in such an exercise. One approach would be to hold an OSCE at the end of the first year and again at the end of the third year, prior to the assessment in the penultimate year. The first assessment might then be mainly formative in its aims, i.e. providing feedback to the trainee and his or her supervisors regarding areas of strength and weakness. The penultimate-year OSCE would fulfil a summative function, i.e. it would aim to ensure competence. The implication of this is that there would be a criterion-referenced pass mark, i.e. a competency-based mark, rather than the norm-referenced approach previously adopted by competitive examinations, such as the examination for Membership of the Royal College of Physicians (MRCP), in which a given percentage of candidates pass. A second important question is whether such assessments should be performed on a regional or a national basis. It could be argued that a regional approach would suit the first-year formative assessment but that a national approach would be appropriate for the penultimate-year assessment, although the latter would present considerable logistic difficulties. Perhaps two neighbouring regions could collaborate in assessing trainees.
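
To make the distinction concrete, the sketch below contrasts the two pass rules on a set of hypothetical scores; the figures and thresholds are illustrative only and do not correspond to any marks awarded in this assessment.

```python
# Illustrative contrast between a criterion-referenced (competency-based)
# pass mark and a norm-referenced pass rule. Scores and thresholds are
# hypothetical.

def criterion_referenced_pass(scores, pass_mark=6.0):
    """Every candidate who meets a fixed standard passes."""
    return [s >= pass_mark for s in scores]

def norm_referenced_pass(scores, pass_fraction=0.75):
    """A fixed proportion of candidates passes, whatever the standard."""
    cutoff = sorted(scores)[int(len(scores) * (1 - pass_fraction))]
    return [s >= cutoff for s in scores]

overall_scores = [7.2, 6.8, 6.5, 6.1, 5.9, 5.4, 4.8, 4.2]
print(criterion_referenced_pass(overall_scores))  # passes depend only on the fixed standard
print(norm_referenced_pass(overall_scores))       # the top 75% pass, whatever the standard
```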

It remains to be seen how many meaningfully different stations can be designed in the OSCE format, and it is possible that we will have to develop alternative or supplementary methods of assessing clinical skills. We would contend that this exercise has served to highlight the importance of addressing questions of clinical competence. Clinical competence may also be partially monitored by means of the Training Record (JCHMT, Royal College of Physicians), which includes a type of logbook in which the SpR records their competence in defined procedures and clinical activities (e.g. history-taking and examination with reference to the musculoskeletal system, understanding the social and legal aspects of the rheumatic diseases, aspirating and injecting synovial joints and analysing synovial fluid). Skills and knowledge are scored 0–3 and the record is countersigned by the educational supervisor. This represents a significant development on the traditional apprenticeship model and should be valuable in highlighting gaps in an individual's training. Whether it is useful in assessing a trainee's competence is more controversial. There is currently no real standardization of perceived competencies between trainers, nor is it specified by what means a trainee should be judged. Should a supervisor witness the trainee performing all the activities described? Should the form be completed on the basis of an overall impression of the trainee together with the trainee's insights into his or her own competencies?

There are a number of domains of competence for a rheumatologist, including knowledge, clinical skills, skills of communication with patients and other health professionals, presentation skills, management skills, and teaching and learning skills. In this assessment we addressed clinical skills and, to an extent, patient communication skills. We recognize the need to assess the other domains and are exploring methods of doing so. Thus we have started assessing presentation skills by means of formal presentations by the SpR to their peers and a consultant audience, and we have used mock interviews for consultant posts both to evaluate trainees as potential consultants and to give the trainees the opportunity to assess themselves in this context.

In conjunction with this development of formal and semiformal annual assessments, we have been aware of the need to develop a strategy of trainee appraisal in which the emphasis is on the trainee's educational needs and on feedback rather than on competence. Thus, as well as regular meetings between the trainee and their educational supervisor we hold regular subregional appraisal meetings at which training needs are identified and attainment goals agreed upon.

The implementation of the OSCE in SpR assessment may represent a small but significant step in this difficult area of rheumatology training. It has the potential to be useful in both formative and summative assessment. There is a clear need for the rheumatology community to identify appropriate methods of ensuring our trainees are—and are seen to be—adequately trained.

Acknowledgments

Dr M. Allen, Professor P. A. Bacon, Dr S. Bowman, Dr T. E. Hothersall, Dr G. Kitas (Chairman), Dr M. Pugh, Dr I. Rowe (Secretary), Dr M. F. Shadforth, Dr D. Situnayake and Dr T. Sheeran were all members of the West Midlands Rheumatology Services and Training Committee who took part in the design and implementation of the OSCE.

Notes

Correspondence to: A. B. Hassell.

References

  1. Calman KC, Temple JG, Naysmith R, Cairncross RG, Bennett SJ. Reforming higher specialist training in the United Kingdom—a step along the continuum of medical education. Med Educ 1999;33:28–33.
  2. A guide to specialist registrar training. Department of Health, 1996.
  3. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured clinical examination. Br Med J 1975;1:447–51.
  4. Grand'Maison P, Brailovsky CA, Lescop J. Content validity of the Quebec licensing examination (OSCE). Can Fam Physician 1996;42:254–9.
  5. Sloan DA, Donnelly MB, Schwartz RW, Felts JL, Blue AV, Strodel WE. The use of objective structured clinical examination for evaluation and instruction in graduate medical education. J Surg Res 1996;63:1225–30.
  6. Matsell DG, Wolfish NM, Hsu E. Reliability and validity of the OSCE in paediatrics. Med Educ 1991;25:293–9.
  7. Selby C, Osman L, Davis M, Lee M. How to do it: Set up and run an objective structured clinical exam. Br Med J 1995;310:1187–90.
  8. Sloan DA, Donnelly MB, Schwartz RW, Strodel WE. The Objective Structured Clinical Examination. The new gold standard for evaluation of postgraduate clinical performance. Ann Surg 1995;222:735–42.
Submitted 16 November 2001; Accepted 5 April 2002