REVIEW ARTICLE
Year : 2016  |  Volume : 4  |  Issue : 2  |  Page : 79-88

Systematic review of computer based assessments in medical education

Al-Amri S, Ali Z


Deanship of Information and Communications Technology, University of Dammam, Dammam, Saudi Arabia

Date of Web Publication: 9-Mar-2016

Correspondence Address:
Zahid Ali
Deanship of Information and Communications Technology, University of Dammam, P. O. Box 1982, 31441, Dammam
Saudi Arabia

DOI: 10.4103/1658-631X.178288

PMID: 30787703

  Abstract 

Medical schools, postgraduate training institutes, and licensing and certification bodies have developed and implemented many new methods for the accurate, reliable, and timely assessment of the competence of medical professionals and practicing physicians. The underlying objective of all these assessments is not only to evaluate students' learning and educational goals but also to establish the graduating individual's skills and professionalism. Computer based assessment (CBA) has emerged in recent years as a viable alternative to traditional assessment techniques and has permeated the medical curriculum, where it has been employed for assessment tasks. This study presents how CBA offers pedagogical opportunities and analyzes its usage patterns over the past three decades. We examined 47 CBAs in medical education, analyzed several assessment components, including application area, assessment purpose, assessment type, assessment format, and student level, and examined the interplay among these components. Our analysis determined that formative assessment is the most frequently used type and that 75% of all assessments employed the multiple choice question format.

  Abstract in Arabic 

Medical schools, health centers, and professional licensing bodies have developed new and reliable methods for assessing the medical skills of physicians and health practitioners. The principal aim of all these assessment methods is to evaluate the extent to which educational objectives have been achieved and also to develop the individual's skills and professionalism. Computer based assessments have recently evolved into an alternative to traditional assessment methods and have become widespread in the medical field as an accepted means of measurement and evaluation. This study presents how computer based assessments offer substantial educational opportunities and surveys their common patterns of use over the past three decades. The authors examined and analyzed the components of 47 computer based assessments in medical education, including the discipline in which the assessment is applied, its type and format, the level of the targeted students, and the interrelation of these elements. The study concluded that the most common use of computer based assessments in medical education is formative assessment and that the most frequently used question format is multiple choice.

Keywords: Computer based assessments, formative and summative assessment, self-assessment


How to cite this article:
Al-Amri S, Ali Z. Systematic review of computer based assessments in medical education. Saudi J Med Med Sci 2016;4:79-88

How to cite this URL:
Al-Amri S, Ali Z. Systematic review of computer based assessments in medical education. Saudi J Med Med Sci [serial online] 2016 [cited 2020 Dec 4];4:79-88. Available from: https://www.sjmms.net/text.asp?2016/4/2/79/178288


Introduction


Measurement of learning competence and performance is an indispensable component of the education process. Computer based assessment (CBA) is an emerging technology that offers a range of advantages over traditional paper-and-pencil testing. These include, among others, richer educational assessment with dynamic sound and visuals, user interactivity, adaptability, improved reliability and impartiality, near real-time score reporting, instantaneous personalized feedback, independence from time and place, and efficient data collection for statistical analysis. [1],[2] The use of computers makes assessment easier and relieves faculty of the burdensome tasks of invigilation and grading. [3] However, some researchers have also discussed the disadvantages associated with computer technology and its implications for the perceived validity of CBA. [4] Universities worldwide have implemented such computer-assisted assessment systems for formative, summative, and self-assessment purposes because of their obvious benefits over traditional assessment methods. [5] Studies have also considered their use for students with disabilities. [6]

CBA has the potential to contribute to different facets of educational and professional testing and to effective learning. It has been successfully implemented for testing basic educational skills, college and university admissions, achievement levels, professional certification and licensing, clinical psychology, life sciences, law, intelligence, language, employment, and adult education. The use of information and communication technologies in medical education is not new: the adoption of CBA techniques has previously been evaluated in the context of teaching and learning in the medical curriculum, along with the effects of pervasive, high-speed information and knowledge development in clinical and medical settings. [7]

Overviews of assessment, including computer-based testing approaches in medicine with their advantages, disadvantages, and other pertinent questions, have presented CBA as a qualitative shift away from traditional methods such as paper-based tests and have suggested its use for diagnostic purposes, such as determining students' prior knowledge. [2],[8]

This work also discusses question types for medical and health professionals and how their content can assess higher-order intellectual skills and competences. An investigation into the use of CBA in health education suggested that it presents an alternative to paper-and-pencil assessment: both approaches showed similar results, and neither computer anxiety nor experience in using computers was related to student performance. [9] It also emphasized that the strength of multiple choice questions (MCQs) lies in the quality of the items being tested. College-level medical students found CBA to be convenient in its accessibility and flexible with regard to time and space. [10]

Reports have indicated that medical students showed a keen interest in and had a positive experience using CBA, prompting a recommendation to introduce formative assessment early in higher education. [11] This research also analyzed the opinions of medical students toward web-based assessment, including their reservations, and found that a high percentage of students showed a positive attitude toward it. A six-step approach for developing CBA for summative assessment at a medical college in Saudi Arabia reported that a high percentage of students approved of CBA and suggested that a CBA pilot to acquaint students with the new assessment tool would be beneficial. [12]

Different techniques have been employed in medical education assessment, ranging from exploration-based hypercubes to case-based brainstorming and mind-map pads, and from randomized tests to fixed assessments. One study identified ten different assessment techniques and classified them into three categories: exploration-based, puzzle-based hierarchy, and case-based methods. [13] A taxonomy of CBA applications has also been presented, showing the versatility and potential richness of CBA for educational assessment. [1]

Recently, simulation-based software has also been employed in clinical skills and diagnostics to collect data for the assessment of medical students, provide feedback, and execute formative assessments. [14],[15] CBA implementation issues in undergraduate medical education, such as hardware requirements, the choice of software, types of test questions, security, integrity, and technical knowledge and skills, are of paramount importance and need the utmost attention before undertaking any form of CBA. [16] CBA has been applied successfully not only to assessing medical professional learning but also to assessing medical communication skills. [17]

Very recently, medical schools in the United Kingdom have developed projects that exploit the concept of customized smartphone apps, which not only support continuing professional development and lifelong learning but also include features such as recording evidence and assessing clinicians and healthcare professionals in near-patient environments through teacher-uploaded exercises. [18] Such tools report performance and instructor feedback to students instantaneously. Virtual-patient e-assessment systems have also been developed for assessing practical skills under conditions similar to a real-time environment. [19]

The notion of clinical competence and class performance is embraced and articulated in medical education assessments and evaluations, both objectively and subjectively. [20] It has been reported that medical students perceive CBA more favorably than traditional assessment methodology. Different models of CBA implementation have thus been proposed, ranging from single-computer setups to multi-purpose PC labs and models based on personally owned, internet-enabled portable devices. [21] Efforts have been made to integrate an assessment model for CBA in science and to analyze its validity in the medical sciences. [22] Researchers have also laid down guidelines for teachers on how to exploit CBA in medical education. [23]

Objectives

The purpose of this review is to delineate the ways in which current and potential uses of computer technologies are being employed to support assessment activities in undergraduate and postgraduate medical education. We considered two different aspects of CBA in medical education: in-class assessments and self-assessments. The study also examines assessment purposes, levels, types, formats, and application areas, and the interplay of these components in medical education assessment.


Materials and Methods


Data collection

The focus of the current review is to investigate the use of technology applications in assessments in medical education. With this purpose in mind, we were particularly interested in papers that reported the use of CBA in medical education with empirical findings. To ensure the selection of relevant, quality articles, we restricted our search to papers published in peer-reviewed academic journals and excluded conference proceedings, book chapters, unpublished manuscripts, dissertations, project reports, and position papers. The rationale behind this approach is threefold. First, the review process for publications other than journal papers is normally less rigorous, which may in turn lead to incomplete reviews and unconvincing conclusions; journal articles undergo a rolling review schedule with multiple review phases, ensuring that the findings and conclusions about the reported assessments are valid, methodologically sound, and comprehensive. [22]

Second, journal articles are usually longer than conference papers and hence present more detailed and comprehensive information about the assessments. Third, other types of publications are not easy to access, which could result in an asymmetrical set of studies. Although focusing only on journal articles allows a consistent and systematic review, it may omit some important research reported in other publication types and limit the generalizability of our findings. To gather a sufficiently comprehensive corpus for the study, we searched a number of available sources, including multiple electronic databases such as Summon Web Scale Discovery, Scopus, Web of Knowledge, and the Saudi Digital Library, for relevant journal articles published between 1987 and 2013.

Search criteria

The search began with the systematic identification of articles containing relevant keywords in journals of educational research, educational technology in medicine, and technology-enhanced medical learning. The journals considered in this study are listed in [Table 1]. Although the primary search emphasis was CBAs in medical education, we also considered articles that captured other variations of technology-based assessment in medical education, such as comparisons of CBAs with traditional paper-and-pencil versions. Once the screening process was completed, we reviewed the references of the selected articles to identify further sources of information about the assessments. Of the journal articles related to CBA in general, some were review articles and the rest were analyzed for this review. The whole process, shown in the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flow diagram [Figure 1], yielded 47 assessments reported in 85 articles; these provided sufficient information about the assessment even where their main focus was on other aspects of measurement practice. Each paper was then read and analyzed for assessment purpose, type, and format; participant level; and application area. This synthesis resulted in the coding scheme described in [Appendix I]. [63]
Figure 1: Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flow diagram

Table 1: Journals searched
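As an illustration of the screening step, the sketch below applies the review's stated inclusion rules (peer-reviewed journal article, published 1987-2013, reporting a CBA in medical education with empirical findings) to a list of candidate records. The record fields and function names are assumptions introduced for illustration; the actual screening was performed manually against the PRISMA stages shown in Figure 1.

    # Minimal sketch of the article screening step; field names are
    # illustrative assumptions, not the authors' actual data schema.
    def is_eligible(record):
        """Apply the review's stated inclusion criteria to one candidate."""
        return (
            record.get("venue") == "peer-reviewed journal"  # excludes proceedings, chapters, reports
            and 1987 <= record.get("year", 0) <= 2013       # publication window searched
            and record.get("reports_cba_in_medical_education", False)
            and record.get("has_empirical_findings", False)
        )

    def screen(candidates):
        """Split candidate records into included and excluded sets."""
        included = [r for r in candidates if is_eligible(r)]
        excluded = [r for r in candidates if not is_eligible(r)]
        return included, excluded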


Coding for potential moderators

The potential moderators identified for this research were the characteristics associated with CBA across the studies conducted in medical education. All coded categories carry a common code for first author and publication year. The most important category is the area in which the assessment was performed; we identified journal articles from diverse areas of medical education for evaluation purposes. The first of the coded moderators is the assessment category. To draw a clear distinction, we focused on two broad categories: in-class (formative or summative) assessments and self-assessments. The next coded category is the assessment purpose. This includes assessment of conceptual and factual knowledge, and of synthesized and applied knowledge, where an examinee is required to apply prior concepts to the information presented in the question item; problem-solving items require solutions in the context of the problem. Other purposes included skills tests, suitability testing, and those for which the purpose of assessment was not specified. The third category is the assessment format. For this category, we took a subset of the item types presented by Scalise and Gifford [23] and applied them in the context of medical education. The final category is the level at which the assessment was applied, spanning the duration of the student's course of study. These categories are presented in [Table 2].
Table 2: Coding scheme
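To make the coding scheme concrete, the sketch below models a single coded assessment as a small Python data structure. The field names and allowed values paraphrase the categories described above and in Table 2; they are illustrative assumptions rather than the authors' actual coding instrument.

    from dataclasses import dataclass, field
    from typing import List

    # Allowed codes paraphrased from the coding scheme; exact labels are assumptions.
    CATEGORIES = {"in-class formative", "in-class summative", "self-assessment"}
    PURPOSES = {"conceptual/factual knowledge", "synthesis/applied knowledge",
                "problem solving", "skills test", "suitability testing", "not specified"}
    FORMATS = {"multiple choice", "constructed response", "true/false",
               "selection/identification", "reordering/rearrangement",
               "checklist", "substitution/correction", "image/video presentation"}

    @dataclass
    class CodedAssessment:
        first_author: str   # common code: first author ...
        year: int           # ... and publication year
        area: str           # discipline within medical education
        category: str       # one code from CATEGORIES
        level: str          # participant level, e.g. "undergraduate"
        purposes: List[str] = field(default_factory=list)  # one or more codes from PURPOSES
        formats: List[str] = field(default_factory=list)   # one or more codes from FORMATS

Multiple codes per category are permitted; for example, the assessment of Basu et al. [24] would carry two format codes (see Data analysis below).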


Data analysis

After a rigorous search, we selected 47 assessments from medical journals for study using the following procedure. Each assessment was first coded according to the criteria defined earlier in [Table 2]. We observed that multiple codes applied to some assessments in a given category; for example, the assessment presented by Basu et al. [24] used both the conventional multiple choice and the selection/correction assessment formats. This was done for both the in-class assessments and the self-assessments. Once the coding process was completed, a statistical analysis was performed to identify emerging patterns in the use of CBAs in medical education. The analysis is reported as percentages in tables for the various categories and presents a holistic picture of how CBAs have proliferated in medical education and of the emerging patterns.
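The percentage tables reported in the next section can be reproduced from such coded records with a simple tally. The sketch below (a continuation of the hypothetical record structure above) counts every code an assessment received, so an assessment coded with two formats contributes to two format counts; percentages are therefore computed over codes rather than over assessments.

    from collections import Counter

    def percentage_distribution(records, attribute):
        """Tally every code under the given attribute across all records
        and return each code's share as a percentage of all codes."""
        counts = Counter()
        for record in records:
            value = getattr(record, attribute)
            # List attributes (e.g. 'formats') may hold several codes per
            # assessment; scalar attributes (e.g. 'category') hold one.
            counts.update(value if isinstance(value, list) else [value])
        total = sum(counts.values())
        return {code: round(100.0 * n / total, 1) for code, n in counts.items()}

    # Hypothetical usage:
    #   percentage_distribution(records, "category")  # e.g. {"in-class formative": 59.0, ...}
    #   percentage_distribution(records, "formats")   # e.g. {"multiple choice": 72.1, ...}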


Results and Discussion


In this section, we present findings from our analysis based on the criteria established. These findings focus on specific areas in medicine, assessment category and purpose, and assessment format. We report our analysis in terms of percentages of the assessments considered in [Table 3] and [Table 4].
Table 3: In-class assessments

Table 4: Summary of in-class assessment formats


Assessment disciplines

Appendix I shows that CBA has been successfully implemented in almost all disciplines related to medical education, in both in-class and self-assessments. For in-class assessments, CBA applications range from assessment and training in clinical skills and practice, internal medicine, nursing, and diagnostic competence to communication skills. This reveals the proliferation of technology in assessment and evaluation, and equips teachers with a theoretical and practical steering instrument for measuring competence and for the continuous development and evaluation of learning outcomes. The same trend is observed for self-assessments.

Assessment constructs

Overall, of the 47 assessments considered for analysis, the most common type was formative, as shown in [Table 3]: nearly 59% of the assessments were formative in nature and 41% summative. This shows that CBA is mostly employed as an indicator of ongoing learning and progress. A large proportion (about 38.3%) of the 47 selected assessments (formative = 26.5%, summative = 11.8%) were based on the assessment of conceptual and factual knowledge. The second purpose for which CBA is extensively adopted is synthesized and applied knowledge, with an overall proportion of 35.5% (formative = 20.5%, summative = 14.7%). Clinical skills testing accounted for 11.8%, problem-solving based assessment was used the least (5.9%), and for 8.8% the purpose was not specified. It can be concluded that formative assessment is the preferred mode of assessment in medical education, as it reinforces students' intrinsic motivation toward learning and performance. [34],[37]

Another notable aspect of the analysis is that medical faculties are more inclined to strengthen conceptual knowledge and information retrieval. This helps students adjust their performance based on their current understanding and achievement, and shows the degree of reliance on computers for both formative and summative assessments. By contrast, test items based on problem solving and skills testing are not a favored mode for computer delivery.

Assessment format

Assessment in class

The validity and reliability of assessments are crucial and relate to the type of items used. We observed that a range of test items has been used in medical assessments. [Table 4], which lists the assessment formats used for each type of assessment, shows that multiple choice items are favored over other formats, at a high 72.1%. These could be MCQs with the conventional four-to-five-option text answer format or medical context-based figurative MCQs; on many occasions, extended MCQs were the preferred format. This is followed by the constructed response format, at more than 14%. Items based on true/false represent only 6.3% of the total, indicating that the assessments focus more on assessing examinees' learning through MCQs than through simple true/false scenarios. Very few assessments used the selection/identification format (2.1%) or reordering/rearrangement (4.2%), and the same can be said of checklists and substitution/correction (both 4.2%). Test items based on the presentation of images or video clips have also been used, contributing 8.3% of the item types.

Self-assessments

A similar pattern is observed for item types in self-assessment, where the most commonly used item type is ranking/sequencing, at approximately 44% [Table 5]. Another commonly used item type is again multiple choice, representing 25%. A further item type, based on audio/video media, has also been used for self-assessments, accounting for 12.5% of the item types.
Table 5: Summary of self-assessment formats


Participant level

Our analysis shows that CBA has been used at both the undergraduate and graduate levels, although most published articles report its use at the undergraduate level. CBA has also been utilized across a spectrum of courses, laboratories, and training, indicating that technology is permeating not only learning but also assessment and evaluation.


Conclusion


Through this study, we have reviewed the potential contributions of CBAs to medical education over the last three decades, with a special focus on in-class and self-assessments. We found that CBA is being extensively used for assessment and for enhancing learning opportunities, with a spillover effect into almost all areas of the medical curriculum, clinical skills and practice, and professional competence testing. It has been applied in both formative and summative modes to assess factual and applied knowledge. Formative assessments are also being used as a prelude to summative assessments, since they motivate students to improve their performance and inspire them to achieve higher professional competence. The analysis found that MCQ-based formats remain the most commonly used in in-class, self-, and simulation-based assessments, and that a higher percentage of CBA is applied at the undergraduate level of medical education.

We conclude our study by observing that assessment in medical education remains an area of complex competencies, requiring quantitative and qualitative information to be analyzed carefully. When choosing CBA as an assessment instrument, one must ensure that it links the educational objectives with the assessment content.

Acknowledgment

The authors would like to acknowledge the support and guidance of Prof. Fahad Al-Muhanna, Vice President, University of Dammam, in compiling this article.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.

 
References

1. Thelwall M. Computer-based assessment: A versatile educational tool. Comput Educ 2000;34:37-49.
2. Cantillon P, Irish B, Sales D. Using computers for assessment in medicine. BMJ 2004;329:606-9.
3. Bull J. Computer-assisted assessment: Impact on higher education institutions. Educ Technol Soc 1999;2:123-6.
4. Sheader E, Gouldsborough I, Grady R. Staff and student perceptions of computer-assisted assessment for physiology practical classes. Adv Physiol Educ 2006;30:174-80.
5. Zakrzewski S, Bull J. The mass implementation and evaluation of computer-based assessments. Assess Eval High Educ 1998;23:141-52.
6. Thompson S, Thurlow M, Moore M. Using computer-based tests with students with disabilities (Policy Directions No. 15). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes; 2002. Available from: http://files.eric.ed.gov/fulltext/ED475318.pdf. [Last retrieved on 2003 Jan 13].
7. Ward JP, Gordon J, Field MJ, Lehmann HP. Communication and information technology in medical education. Lancet 2001;357:792-6.
8. Norcini JJ, McKinley DW. Assessment methods in medical education. Teach Teach Educ 2007;23:239-50.
9. Lee G, Weerakoon P. The role of computer-aided assessment in health professional education: A comparison of student performance in computer-based and paper-and-pen multiple-choice tests. Med Teach 2001;23:152-7.
10. Rudland JR, Schwartz P, Ali A. Moving a formative test from a paper-based to a computer-based format. A student viewpoint. Med Teach 2011;33:738-43.
11. Deutsch T, Herrmann K, Frese T, Sandholzer H. Implementing computer-based assessment - A web-based mock examination changes attitudes. Comput Educ 2012;58:1068-75.
12. Hassanien MA, Al-Hayani A, Abu-Kamer R, Almazrooa A. A six step approach for developing computer based assessment in medical education. Med Teach 2013;35 Suppl 1:S15-9.
13. Mooney G. Some techniques for computer-based assessment in medical education. Med Teach 1998;20:560-6.
14. Kreiter CD, Haugen T, Leaven T, Goerdt C, Rosenthal N, McGaghie WC, et al. A report on the piloting of a novel computer-based medical case simulation for teaching and formative assessment of diagnostic laboratory testing. Med Educ Online 2011;16:1-7.
15. Devitt P, Palmer E. Computers in medical education 3: A possible tool for the assessment of clinical competence? Aust N Z J Surg 1998;68:602-4.
16. Hols-Elders W, Bloemendaal P, Bos N, Quaak M, Sijstermans R, De Jong P. Twelve tips for computer-based assessment in medical education. Med Teach 2008;30:673-8.
17. Hulsman RL, Mollema ED, Hoos AM, de Haes JC, Donnison-Speijer JD. Assessment of medical communication skills by computer: Assessment method and student experiences. Med Educ 2004;38:813-24.
18. MyKnowledgeMap Ltd. Enhancing Assessment in Medical Education: A MyKnowledgeMap White Paper; 2012. Available from: http://www.myknowledgemap.com/media/76465/enhancing-assessment-med-web-final.pdf. [Last accessed on 2014 Feb 27].
19. Scarlat R, Stanescu L, Popescu E, Burdescu DD. Case-based medical E-assessment system. In: Proceedings of the 10th IEEE International Conference on Advanced Learning Technologies (ICALT). IEEE; 2010.
20. Hibbert KM, Van Deven T, Ros S. Fundamentals of assessment and evaluation: Clarifying terminology. In: Radiology Education. Berlin, Heidelberg: Springer; 2012. p. 11-9.
21. Luecht RM, Sireci S. A Review of Models for Computer-Based Testing; 2012. Available from: http://www.research.collegeboard.org/publications/content/2012/05/review-models-computer-based-testing. [Last accessed on 2014 Mar 12].
22. Kuo CY, Wu HK. Toward an integrated model for designing assessment systems: An analysis of the current status of computer-based assessments in science. Comput Educ 2013;68:388-403.
23. Scalise K, Gifford B. Computer-based assessment in E-learning: A framework for constructing "intermediate constraint" questions and tasks for technology platforms. J Technol Learn Assess 2006;4:6. Available from: http://www.escholarship.bc.edu/cgi/viewcontent.cgi?article=1036&context=jtla. [Last accessed on 2014 Mar 15].
24. Basu S, Roberts C, Newble DI, Snaith M. Competence in the musculoskeletal system: Assessing the progression of knowledge through an undergraduate medical course. Med Educ 2004;38:1253-60.
25. Chen HY, Chuang CH. The learning effectiveness of nursing students using online testing as an assistant tool: A cluster randomized controlled trial. Nurse Educ Today 2012;32:208-13.
26. Wheeler DW, Whittlestone KD, Smith HL, Gupta AK, Menon DK; East Anglian Peri-Operative Medicine Undergraduate Teaching Forum. A web-based system for teaching, assessment and examination of the undergraduate peri-operative medicine curriculum. Anaesthesia 2003;58:1079-86.
27. Beullens J, Van Damme B, Jaspaert H, Janssen PJ. Are extended-matching multiple-choice items appropriate for a final test in medical education? Med Teach 2002;24:390-5.
28. Beullens J, Struyf E, Van Damme B. Do extended matching multiple-choice questions measure clinical reasoning? Med Educ 2005;39:410-7.
29. Siriwardena AN, Dixon H, Blow C, Irish B, Milne P. Performance and views of examiners in the Applied Knowledge Test for the nMRCGP licensing examination. Br J Gen Pract 2009;59:e38-43.
30. Gilmer MJ, Murley J, Kyzer E. Web-based testing procedure for nursing students. J Nurs Educ 2003;42:377-80.
31. Gordon S, Eisenberg H. Computer-based instruction in clinical medical education: A pulmonary medicine self-assessment. Int J Clin Monit Comput 1987;4:111-3.
32. Devitt P, Palmer E. Computers in medical education 1: Evaluation of a problem-orientated learning package. Aust N Z J Surg 1998;68:284-7.
33. Vioreanu MH, O'Daly BJ, Shelly MJ, Devitt BM, O'Byrne JM. Design, implementation and prospective evaluation of a new interactive musculoskeletal module for medical students in Ireland. Ir J Med Sci 2013;182:191-9.
34. Krasne S, Wimmers PF, Relan A, Drake TA. Differential effects of two types of formative assessment in predicting performance of first-year medical students. Adv Health Sci Educ Theory Pract 2006;11:155-71.
35. Paschal CB. Formative assessment in physiology teaching using a wireless classroom communication system. Adv Physiol Educ 2002;26:299-308.
36. Manikam L, Blackwell N, Banerjee J, Nightingale P, Lakhanpaul M. Improving assessment of paediatric acute breathing difficulties in medical education: A cluster randomized controlled trial. Acta Paediatr 2013;102:e205-9.
37. Velan GM, Jones P, McNeil HP, Kumar RK. Integrated online formative assessments in the biomedical sciences for medical students: Benefits for learning. BMC Med Educ 2008;8:52.
38. Asman P, Lindén C. Internet-based assessment of medical students' ophthalmoscopy skills. Acta Ophthalmol 2010;88:854-7.
39. Lieberman SA, Frye AW, Litwins SD, Rasmusson KA, Boulet JR. Introduction of patient video clips into computer-based testing: Effects on item statistics and reliability estimates. Acad Med 2003;78(10 Suppl):S48-51.
40. Ferenchick GS, Solomon D, Foreback J, Towfiq B, Kavanaugh K, Warbasse L, et al. Mobile technology for the facilitation of direct observation and assessment of student performance. Teach Learn Med 2013;25:292-9.
41. Rotthoff T, Baehring T, Dicken HD, Fahron U, Richter B, Fischer MR, et al. Comparison between Long-Menu and Open-Ended Questions in computerized medical assessments. A randomized controlled trial. BMC Med Educ 2006;6:50.
42. Bernardo V, Ramos MP, Plapler H, De Figueiredo LF, Nader HB, Anção MS, et al. Web-based learning in undergraduate medical education: Development and assessment of an online course on experimental surgery. Int J Med Inform 2004;73:731-42.
43. El Shallaly GE, Mekki AM. Use of computer-based clinical examination to assess medical students in surgery. Educ Health (Abingdon) 2012;25:148-52.
44. Humphris GM, Kaney S. The Liverpool brief assessment system for communication skills in the making of doctors. Adv Health Sci Educ Theory Pract 2001;6:69-80.
45. Leaf DE, Leo J, Smith PR, Yee H, Stern A, Rosenthal PB, et al. SOMOSAT: Utility of a web-based self-assessment tool in undergraduate medical education. Med Teach 2009;31:e211-9.
46. Ganguli S, Camacho M, Yam CS, Pedrosa I. Preparing first-year radiology residents and assessing their readiness for on-call responsibilities: Results over 5 years. AJR Am J Roentgenol 2009;192:539-44.
47. Swagerty D Jr, Studenski S, Laird R, Rigler S. A case-oriented web-based curriculum in geriatrics for third-year medical students. J Am Geriatr Soc 2000;48:1507-12.
48. Feldman MJ, Barnett GO, Link DA, Coleman MA, Lowe JA, O'Rourke EJ. Evaluation of the Clinical Assessment project: A computer-based multimedia tool to assess problem-solving ability in medical students. Pediatrics 2006;118:1380-7.
49. Bhakta B, Tennant A, Horton M, Lawton G, Andrich D. Using item response theory to explore the psychometric properties of extended matching questions examination in undergraduate medical education. BMC Med Educ 2005;5:9.
50. Peat M, Franklin S. Supporting student learning: The use of computer-based formative assessment modules. Br J Educ Technol 2002;33:515-23.
51. Inuwa IM, Taranikanti V, Al-Rawahy M, Habbal O. Anatomy practical examinations: How does student performance on computerized evaluation compare with the traditional format? Anat Sci Educ 2012;5:27-32.
52. Antonelli MA. Accuracy of second-year medical students' self-assessment of clinical skills. Acad Med 1997;72(10 Suppl 1):S63-5.
53. Albanese M, Dottl S, Mejicano G, Zakowski L, Seibert C, Van Eyck S, et al. Distorted perceptions of competence and incompetence are more than regression effects. Adv Health Sci Educ Theory Pract 2006;11:267-78.
54. Vivekananda-Schmidt P, Lewis M, Hassell AB, Coady D, Walker D, Kay L, et al. Validation of MSAT: An instrument to measure medical students' self-assessed confidence in musculoskeletal examination skills. Med Educ 2007;41:402-10.
55. Eva KW, Cunnington JP, Reiter HI, Keane DR, Norman GR. How can I know what I don't know? Poor self assessment in a well-defined domain. Adv Health Sci Educ Theory Pract 2004;9:211-24.
56. Fitzgerald JT, Gruppen LD, White CB. The influence of task formats on the accuracy of medical students' self-assessments. Acad Med 2000;75:737-41.
57. Weiss PM, Koller CA, Hess LW, Wasser T. How do medical student self-assessments compare with their final clerkship grades? Med Teach 2005;27:445-9.
58. Pierre RB, Wierenga A, Barton M, Thame K, Branday JM, Christie CD. Student self-assessment in a paediatric objective structured clinical examination. West Indian Med J 2005;54:144-8.
59. Hodges B, Regehr G, Martin D. Difficulties in recognizing one's own incompetence: Novice physicians who are unskilled and unaware of it. Acad Med 2001;76(10 Suppl):S87-9.
60. Tousignant M, DesMarchais JE. Accuracy of student self-assessment ability compared to their own performance in a problem-based learning medical program: A correlation study. Adv Health Sci Educ Theory Pract 2002;7:19-27.
61. Lind DS, Rekkas S, Bui V, Lam T, Beierle E, Copeland EM 3rd. Competency-based student self-assessment on a surgery rotation. J Surg Res 2002;105:31-4.
62. Bernard AW, Balodis A, Kman NE, Caterino JM, Khandelwal S. Medical student self-assessment narratives: Perceived educational needs during fourth-year emergency medicine clerkship. Teach Learn Med 2013;25:24-30.
63. Abadel FT, Hattab AS. How does the medical graduates' self-assessment of their clinical competency differ from experts' assessment? BMC Med Educ 2013;13:24.

