1 University of Toronto, Department of Anaesthesia, Sunnybrook and Women's College Health Sciences Centre, Women's College Campus, 76 Grenville Street, Toronto, Ontario, Canada M5S 1B2
2 Medical Education, Department of Anaesthesia, University of Toronto, Centre for Research in Education at The Toronto Hospital, 585 University Avenue, Bell Wing 6-600, Toronto, Ontario, Canada M5G 2C4
Presented in part at the International Anaesthesia Research Society, 2000.
Accepted for publication: June 5, 2000
Abstract
One hundred and forty-three students and 18 faculty at the University of Toronto participated in a study of the anaesthesia simulator as an evaluation tool. Student and faculty opinions of the experience were elicited using questionnaires with a five-point scale (1 = strongly disagree, 5 = strongly agree). Faculty and student opinions were similar and positive with respect to the simulator's match with educational objectives, its value as a learning experience, its use as an evaluation tool, and the need for familiarity with the simulator before it is used for assessment. Based on faculty and student opinions, this study supports the use of the simulator as an evaluation tool, provided that prior exposure to the environment is offered.
Br J Anaesth 2000; 85: 779-81
Keywords: education, medical students
At the University of Toronto, medical students spend 2 weeks of their final year in an anaesthesia rotation. Final clinical evaluation is based on faculty assessments of daily student performance in the operating room. A written examination at the end of a 6-week block, consisting of 10 short-answer questions, comprises 60% of the final mark. The educational literature documents many studies attesting to the complexity of evaluating medical competence.1-3 Undergraduate medical education has been fraught with arguments regarding the reliability and validity of assessments that are neither consistent nor standardized.4 The objective structured clinical examination (OSCE), introduced by Harden in 1975, is now widely used as an examination tool in Canada and many other parts of the world.5 However, because of the nature of anaesthetic practice, the OSCE does not lend itself to comprehensive examination of anaesthesia skills and knowledge. The purchase and availability of the CAE-link simulator at the University of Toronto provides the means to undertake standardized assessment of undergraduate performance during the anaesthesia rotation. The validity and reliability of any new assessment tool are important and were addressed in a previous pilot study.6 Using the same methodology as the pilot project, a subsequent study involving all final year medical students was undertaken. As part of this study, it was deemed important to gather participants' opinions on whether they endorsed the use of this innovative technology for evaluation and educational purposes.
Methods and results
The Canadian Simulation Centre for Human Performance and Crisis Management Training, housed at Sunnybrook Health Sciences Centre, incorporates a full-sized simulated operating room with adjacent control and storage areas and two debriefing rooms. The computer mannequin, complete with a drug recognition system, responds in an appropriate manner to pharmacological and physiological interventions. The mannequin itself emits vocal sounds supplied by a remote operator and has breath sounds, heart sounds and eyelid movements. Technical skills such as manual ventilation of the lungs, tracheal intubation and chest tube insertion can be performed. Emergency carts for difficult intubation and defibrillation are available for use when needed.
After ethics approval from the University of Toronto Research Ethics Board was obtained, all 177 final year medical students at the University of Toronto were invited to participate in this study during the 1998-1999 academic year. Eighteen faculty agreed to be either video evaluators or simulator facilitators. A workshop for faculty was held to review the purpose of the study and to ensure that protocols used for simulator sessions and evaluation were understood. Ten faculty were assigned as video viewers and eight as facilitators in the simulator.
Six scenarios were designed incorporating the learning objectives of the curriculum. Evaluation protocols for each scenario were developed with five sections: preoperative assessment, preparation, induction, and two intraoperative problems.
All students were given an information sheet regarding the purpose of the study. It was also made clear that their simulator performance evaluation would not be used towards their final grade, nor would they receive a mark related to the session. If they agreed to participate, a consent form was signed. On a scheduled educational day during the second week of their 2-week rotation, students worked through a 15-min faculty-facilitated simulator scenario. These sessions were videotaped. Two faculty reviewed and evaluated each student's videotaped performance. At the end of the simulator sessions, students met with the attending faculty for group feedback, and the learning objectives were addressed in detail. After the feedback session was completed, students were asked to complete an evaluation form summarizing their impression of the experience. Students were asked to respond to nine items using a five-point Likert scale (1 = strongly disagree, 5 = strongly agree). Comments were solicited. Opinions regarding the specifics of the simulator experience, the content of the scenarios, realism, and the value of the simulator as an educational and evaluation tool were addressed. Responses were anonymous.
At the completion of the study, faculty, including the study co-ordinator, were asked to provide feedback regarding the experience using a questionnaire and the same Likert scale (n=19). Responses to 15 items were solicited together with comments pertaining to the items. Comments on the audio-visual qualities, the educational content and the value of the simulator as an evaluation tool were requested. The final item addressed the willingness to participate in further simulator research.
There was a 100% return rate of student questionnaires (n=145). Not all students completed all items; medians and ranges reflect the number of students who completed each item.
Table 1 shows the median and range of students' responses. Students rated the simulator highly as a learning experience and felt strongly that prior exposure to the simulator was needed if it were to be used as an evaluation tool.
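For readers who wish to reproduce this kind of per-item summary, a minimal sketch is shown below; the response values are hypothetical illustrations, not the study data, and missing answers are excluded as described above.

```python
# Minimal sketch (not the study's analysis code): summarizing five-point
# Likert responses for one questionnaire item as median and range,
# ignoring items left blank. Values below are hypothetical.
from statistics import median

responses = [4, 5, 3, None, 5, 4, 2, 5]            # None = item left blank
answered = [r for r in responses if r is not None]  # completed responses only

print(f"n = {len(answered)}")                        # number who completed the item
print(f"median = {median(answered)}")                # central tendency for ordinal data
print(f"range = {min(answered)}-{max(answered)}")    # spread of responses
```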
Comment
Students found the sessions to be an excellent learning experience, and many pages of comments to this effect were received. Participants were less enthusiastic about the use of the simulator as an evaluation tool than about its use as a learning tool. Thirty-eight per cent of students agreed or strongly agreed, and the same percentage disagreed or strongly disagreed, with the use of the simulator as an evaluation tool; faculty opinions showed a similar trend. Students and faculty who disagreed expressed the need for experience in the simulator setting before any evaluation process.
For this tool to be introduced as an evaluation method, participants' opinions should be addressed and, where appropriate, integrated into the proposed assessment technique. With respect to undergraduates and faculty at our institution, important opinions have been elicited that will strengthen the ultimate integration of this innovative tool into undergraduate evaluation.
Acknowledgements
This study was supported in part by a research grant from the Canadian Anesthesiologists' Society. The authors would like to acknowledge the efforts of the faculty from the Department of Anaesthesia and the medical students from the University of Toronto.
References
1 Linn RL. Educational assessment: expanded expectations and challenges. Educ Eval Policy Analysis 1993; 15: 1-16
2 Van Der Vleuten CPM, Newble DI. How can clinical reasoning be tested? Lancet 1995; 345: 1032-4
3 Swanson D, Norcini JJ, Grosso L. Assessment of clinical competence: written and computer-based simulations. Assessment Eval Educ 1987; 12: 220-46
4 Van Der Vleuten CPM. The assessment of professional competence: developments, research and practical implications. Adv Health Sci Educ 1996; 1: 41-67
5 Harden RM, Gleeson FA. Assessment of clinical competence using objective structured examination. Br Med J 1975; 1: 447-51
6 Morgan P, Cleave-Hogg D. Performance evaluation using the anaesthesia simulator. Med Educ 2000; 34: 42-5