ORIGINAL ARTICLE
Year : 2017  |  Volume : 5  |  Issue : 1  |  Page : 49-55

Student and faculty perception of objective structured clinical examination: A teaching hospital experience


1 Department of Internal Medicine, King Fahd Hospital of the University, University of Dammam, Dammam, Saudi Arabia
2 Department of Medical Education, College of Medicine, University of Dammam, Dammam, Saudi Arabia

Date of Web Publication: 16-Nov-2016

Correspondence Address:
Abir H Alsaid
P.O. Box 2258, Al-Khobar 31952
Saudi Arabia

DOI: 10.4103/1658-631X.194250

  Abstract 

Introduction: The primary objective of this study was to explore student and faculty perception of the objective structured clinical examination (OSCE) to assess the clinical competence of 5th year medical students.
Methods: Two validated tools were used to survey students' and faculty perception of the OSCE as an assessment tool. The questionnaires were self-administered and handed to the students immediately after the OSCE was conducted. Subjects were 29 female students who had completed their 3-week Internal Medicine rotation and 15 faculty members who had participated in evaluating the students. The response rate was 100%. The OSCE comprised 21 active stations assessing history taking, for which standardized patients were used, as well as physical examination and data interpretation, for which real patients were used.
Results: The majority of students (63.2%) indicated that the OSCE assessed their skills fairly. Similarly, 80% of faculty thought the OSCE was a fair method of assessing students' skills as well as a better assessment tool than the traditional long/short case exams.
Conclusion: The OSCE was positively perceived by 5th year medical students and faculty members as a tool that can fairly assess students' clinical skills.

  Abstract in Arabic (translated) 

Research summary: This prospective study examined medical students' perception of the objective structured clinical examination (OSCE). The examination was conducted at King Fahd Hospital of the University in Al-Khobar in April 2014, and a questionnaire was distributed to the students after they had completed the examination to determine their perceptions and opinions of this type of examination. The study showed that 68.8% of the students found the examination cases to be reliable; accordingly, this type of examination was positively evaluated by the students. The researchers recommend that further studies be conducted with a larger number of students.

Keywords: Faculty survey, objective structured clinical examination, student survey


How to cite this article:
Alsaid AH, Al-Sheikh M. Student and faculty perception of objective structured clinical examination: A teaching hospital experience. Saudi J Med Med Sci 2017;5:49-55



  Introduction


There are three main intersecting areas of medical education: curriculum design, instructional methods and assessment measures.[1] The objective structured clinical examination (OSCE) is an examination method that was introduced by Harden and first reported in the British Medical Journal in 1975.[2] It has since been used as a valuable tool to evaluate students' clinical skills in medical, dentistry, nursing and pharmacy schools worldwide. It was also evaluated in a comparative study that assessed two groups of final-year medical students in two British medical schools. The results of that study highlighted that the OSCE is a valid tool for assessing clinical competence among medical students and can also identify areas where teaching methods and/or curriculum content may have contributed to students' performance in each group.[3] According to the concept of Miller's pyramid, the OSCE is an assessment method designed to evaluate clinical competency at the level of “shows how.”[4]

OSCEs are conducted by rotating students through successive stations that assess their skills in history taking, physical examination, communication, patient management, diagnosis and data interpretation. The stations are organized so that students rotate smoothly within a predetermined time while being observed by an examiner on a one-to-one basis, with standardized patients used where appropriate. The standardized OSCE is a fairly new method of assessment in Saudi Arabia and is being conducted in many universities across the Kingdom. A study conducted at King Saud University in 2006 on 95 students found the OSCE to be a highly reliable method of student assessment.[5] The Department of Internal Medicine at the University of Dammam has used this method of assessment since 2013 to assess the clinical competency of 5th year medical students.

The Accreditation Council for Graduate Medical Education (ACGME) classifies medical competence into six domains: medical knowledge (MK), patient care (PC), professionalism, interpersonal and communication skills (ICS), systems-based practice (SBP), and practice-based learning and improvement (PBLI).[6] The OSCE is considered a valid tool for assessing PC, ICS and professionalism. It is also a reliable method for evaluating PBLI and SBP, but not MK.[6],[7] Thus, OSCEs are considered by some medical educators to be the gold standard of assessment methods.[8]

The OSCE allows different aspects of clinical competence to be assessed in a comprehensive, consistent, controlled and objective manner. The development of quality assessment methods is important and must be continuously improved to achieve excellence in conducting standardized OSCEs. The advantages of the OSCE are that it ensures a uniform marking scheme and a consistent examination set-up for both examiners and students. A formative OSCE allows immediate feedback, which enhances the student's learning experience and improves his/her proficiency in the following stations. The OSCE, unlike long case exams, reduces examiner bias and subjects all students to the same assessment criteria. It objectively assesses essential facets of clinical competence, such as history taking and physical examination skills, as well as problem-solving and decision-making abilities, patient management and interactive competencies, and thus has a powerful educational influence.[9]

The OSCE was used to measure the performance of 117 second-year medical students at the end of introductory courses at the Bowman Gray School of Medicine, testing their knowledge of differential and physical diagnosis. It was used as an assessment tool for two consecutive years. Benefits were assessed through a questionnaire, with an 80% response rate from the faculty involved. Despite the high cost of conducting the OSCE, the faculty reported that it was worth the time invested in evaluating the students and encouraged its use in the future. The majority of examinees were satisfied with the examination and thought that it should continue to be used as an assessment tool.[10]

Despite being a potent assessment tool in medical education, the OSCE has some reported disadvantages. Although it is reported to be a reliable, valid and objective tool, it is expensive and time-consuming, which are two major drawbacks.[11]

The high costs are primarily related to manpower (examiners, patients, coordinators), resources, time and space, as well as the extensive organization required. The non-integrated type of OSCE assesses students on discrete medical tasks, which compartmentalizes their skills and may encourage them to evaluate patients partially rather than as a whole. Although students are assessed on different areas of knowledge and skills, the scope is still considered narrow, especially in terms of history taking and physical examination.[12]


  Methods


In April 2014, the Department of Internal Medicine at King Fahd Hospital of the University held its second OSCE for 5th year medical students. Twenty-nine female students were examined in the short-stay ward. The OSCE included 21 active stations, comprising 10 history-taking stations, 6 physical examination stations and 5 data interpretation stations [Table 1]. The examination lasted 174 min.
Table 1: Blueprint Objective Structured Clinical Examination April 2014

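As a rough illustration of the exam logistics described above, the following minimal sketch derives the average time per station from the figures reported in this section. It assumes, hypothetically, that all 21 active stations were allotted equal time; the article does not report per-station durations.

```python
# Hypothetical back-of-the-envelope calculation for the April 2014 OSCE:
# 21 active stations run over 174 minutes total (figures from the text).
total_minutes = 174
stations = {
    "history taking": 10,        # stations with standardized patients
    "physical examination": 6,   # stations with real patients
    "data interpretation": 5,
}

n_stations = sum(stations.values())        # 21 active stations in total
per_station = total_minutes / n_stations   # average minutes per station

print(f"{n_stations} stations, ~{per_station:.1f} min per station on average")
```

This yields roughly 8.3 minutes per station on average, a plausible but unverified figure given the equal-duration assumption.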


After the exam, the students were handed the questionnaire [Table 2]. A different survey was given to 15 participating faculty members [Table 3].
Table 2: Participant survey evaluation of the objective structured clinical examination experience

Table 3: Faculty survey evaluation of the objective structured clinical examination experience



The students' survey was based on a self-assessment of their learning experience, their opinion on the level of exposure they had had to similar cases during their rotation, and their perception of their performance.

The faculty survey captured their thoughts on the educational value the OSCE provided to the students and whether it offered any added value in evaluating students' knowledge levels. Both surveys included an item assessing comfort level during the OSCE. This research was approved by the Medical Ethics Committee.


  Results


The surveys were collected from the 29 students and 15 faculty members and analyzed, with no other variables being considered. Two main observations were made from the students' survey. First, 63.2% of students and 80% of faculty concurred that the OSCE was a fair assessment of clinical skills. The OSCE was also perceived as a better assessment tool than the traditional long/short case exams by 80% of faculty [Figure 1] and [Figure 2].
Figure 1: Students' feedback

Figure 2: Faculty feedback


Another finding was the amount of clinical exposure and its impact on students' perception of their performance in the exam. The majority of students had been exposed to General Medicine and Gastroenterology examinations (68.4% and 57.9%, respectively). The results showed that 89.5% of students had no exposure to pulmonary data interpretation (chest X-ray), and 84.2% had no exposure to nephrology physical examination and hematology history-taking [Figure 3]. Performance was rated on a scale of excellent, good, fair and poor. When asked about their perception of their performance in the exam, 52.6% thought they performed well in the gastrointestinal examination station, whereas 47.4% reported a fair performance in the general medical examination.
Figure 3: Students' feedback on their pre-examination clinical exposure to similar cases


About 31.6% of students thought they did well and another 31.6% reported a fair performance in the hematology history-taking station, whereas 52.6% rated their performance as fair in pulmonary data interpretation [Figure 4].
Figure 4: Students' feedback on their performance in objective structured clinical examination stations




  Discussion


Traditional clinical and written examinations test a limited range of cognitive and clinical skills. The traditional clinical exam, in which two examiners observe and test mainly a student's history-taking, physical examination and clinical reasoning skills, has been deemed unreliable because of the variability between the two examiners. Therefore, it was necessary to change this approach in order to reliably assess the other essential skills a medical student ought to have.

A study published in the Archives of Disease in Childhood, which assessed 229 final-year medical students, found that the OSCE correlated positively with other forms of assessment but showed little correlation with viva voce results. This supports the OSCE as an acceptable, if not superior, alternative to traditional assessment tools.[13] A study published in the Saudi Medical Journal, which assessed 64 students undergoing their final-year surgical clerkship, found the OSCE to be a reliable and valid format for testing clinical skills.[14] A comparison between the performance of 3rd year medical students in the OSCE and their subsequent performance in clinical examinations in years 4 and 5 of the course, published in Medical Education, revealed that the OSCE predicted students' performance in a subsequent clinical examination, indicating that the OSCE is a valid assessment tool.[15] Our study showed that the OSCE is perceived as a fair assessment tool by both students and faculty members [Figure 1]. This study was also useful in shedding light on clinical training, the level of students' exposure to certain cases during their clinical rotation prior to the examination, and their perception of their performance. The majority of students concurred that they were most exposed to general medical (68.4%) and gastroenterology (57.9%) examinations during their clinical rotation. There was an obvious lack of exposure to pulmonary data interpretation (chest X-ray), nephrology examination and hematology history-taking [Figure 3].

A valid explanation may be that the general examination is covered in every bedside teaching session and is, therefore, a technique that the students have mastered, whereas the nephrology examination is not covered by all academic staff. Another reason is that patients with renal disease are usually too sick to tolerate being examined by large groups of medical students, and the physical signs of renal disease are sparse. The lack of exposure to hematology history-taking is surprising, since a considerable number of sickle cell disease patients are admitted to the internal medicine ward. Pulmonary data interpretation, namely the chest X-ray, is usually not covered during bedside teaching and is mainly discussed with groups of students taught by a pulmonologist or a general medicine faculty member.

The students' perceptions of their performance in the examination stations followed the same trend as their exposure to the relevant subjects. One exception stood out: the general medicine station, where 68% of the students had been exposed to general medicine cases, yet their perception of their performance scored a satisfaction rate of only 37% [Figure 5]. This raises questions about the design of the general medicine station, the clarity of its instructions, and individual variability among examiners.
Figure 5: Exposure prior to objective structured clinical examination and students' perception of their performance



For OSCEs to be valid and reliable, careful attention must be paid to test content and design, rater training and implementation factors.[16] A study at Mashhad University of Medical Sciences concluded that the majority of students (94.5%) had a positive perception of the OSCE. The OSCE can be recommended as a method of assessment if standardization can be achieved and maintained.[17]

Overall, surveys used to evaluate the OSCE showed that undergraduate students from various medical schools perceived it positively, although certain negative aspects, such as stress and difficulty, were repeatedly mentioned.[18]

A study from King Khalid University in Abha explored students' acceptance of the OSCE as a method of assessing clinical competence in internal medicine using self-administered surveys. The majority perceived the OSCE in internal medicine as fair (53%) and comprehensive (56%), albeit stressful.[19]


  Conclusion


The significance of students' feedback regarding the assessment tools used in undergraduate medical education is being increasingly recognized, and their views on the methods used to assess their skills and their understanding of their curriculum are regarded as valuable input toward a more successful teaching approach.[20] The results of this study revealed that students assessed by the OSCE were generally satisfied, as indicated by their positive feedback, which can be utilized to improve our performance in setting a standardized OSCE. Our study was limited by its sample size, which should be increased in future studies to improve the generalizability of the results. This is still a growing field at the University of Dammam and a promising area of research. The University of Dammam intends to build on the results of this study to further improve its OSCEs, which will be implemented as an assessment tool in all clinical years.

Acknowledgments

We would like to thank Dr. Waleed Albaker, Associate Professor of Endocrinology and Chairman of the Internal Medicine Department at King Fahd Hospital of the University, University of Dammam, for his expert advice and encouragement throughout this process. We would also like to thank Dr. Aisha Al Osail, Consultant in General Medicine at King Fahd Hospital of the University, for her assistance and support, as well as Mr. Mohammed Zeshan for his dedication and assistance.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.

 
  References

1. Bajammal S, Zaini R, Abuznadah W, Al-Rukban M, Aly SM, Boker A, et al. The need for national medical licensing examination in Saudi Arabia. BMC Med Educ 2008;8:53.
2. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J 1975;1:447-51.
3. McFaul PB, Taylor DJ, Howie PW. The assessment of clinical competence in obstetrics and gynaecology in two medical schools by an objective structured clinical examination. Br J Obstet Gynaecol 1993;100:842-6.
4. Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990;65(9 Suppl):S63-7.
5. Al-Naami MY, El-Tinay OF, Khairy GA, Mofti SS, Anjum MN. Improvement of psychometric properties of the objective structured clinical examination when assessing problem solving skills of surgical clerkship. Saudi Med J 2011;32:300-4.
6. Chan CY. Is OSCE valid for evaluation of the six ACGME general competencies? J Chin Med Assoc 2011;74:193-4.
7. Varkey P, Natt N, Lesnick T, Downing S, Yudkowski R. Validity evidence for an OSCE to assess competency in systems-based practice and practice-based learning and improvement: A preliminary investigation. Acad Med 2008;83:775-80.
8. Swing SR. Assessing the ACGME general competencies: General considerations and assessment methods. Acad Emerg Med 2002;9:1278-88.
9. Khan KZ, Ramachandran S, Gaunt K, Pushkar P. The objective structured clinical examination (OSCE): AMEE Guide No. 81. Part I: An historical and theoretical perspective. Med Teach 2013;35:e1437-46.
10. Frye AW, Richards BF, Philp EB, Philp JR. Is it worth it? A look at the costs and benefits of an OSCE for second-year medical students. Med Teach 1989;11:291-3.
11. Barman A. Critiques on the objective structured clinical examination. Ann Acad Med Singapore 2005;34:478-82.
12. Zayyan M. Objective structured clinical examination: The assessment of choice. Oman Med J 2011;26:219-22.
13. Smith LJ, Price DA, Houston IB. Objective structured clinical examination compared with other forms of student assessment. Arch Dis Child 1984;59:1173-6.
14. Al-Naami MY. Reliability, validity, and feasibility of the objective structured clinical examination in assessing clinical skills of final year surgical clerkship. Saudi Med J 2008;29:1802-7.
15. Martin IG, Jolly B. Predictive validity and estimated cut score of an objective structured clinical examination (OSCE) used as an assessment of clinical skills at the end of the first clinical year. Med Educ 2002;36:418-25.
16. Turner JL, Dankoski ME. Objective structured clinical exams: A critical review. Fam Med 2008;40:574-8.
17. Khosravi Khorashad A, Salari S, Baharvahdat H, Hejazi S, Lari SM, Salari M, et al. The assessment of undergraduate medical students' satisfaction levels with the objective structured clinical examination. Iran Red Crescent Med J 2014;16:e13088.
18. Raheel H, Naeem N. Assessing the objective structured clinical examination: Saudi family medicine undergraduate medical students' perceptions of the tool. J Pak Med Assoc 2013;63:1281-4.
19. Elfaki O, Al-Humayed S. Medical students' perception of OSCE at the Department of Internal Medicine, College of Medicine, King Khalid University, Abha, Saudi Arabia. J Coll Physicians Surg Pak 2016;26:158-9.
20. Al-Mously N, Nabil N, Salem R. Student feedback on OSPE: An experience of a new medical school in Saudi Arabia. J Int Assoc Med Sci Educ 2012;22:10-6.

