It has been questioned whether the current medical examination system can evaluate medical students' competencies effectively. This study surveyed satisfaction with the current medical examination system and the present state of clinical skill testing in medical schools. The survey subjects were deans, medical professors, residents, and medical students. The results were notable. First, most respondents answered that the current medical examination system could not evaluate medical students' competencies effectively. Second, many residents felt that preparing for the paper-and-pencil test was not helpful for their training, whereas experience with clinical skill tests was. Third, the contents and methods currently used to evaluate clinical skills in medical schools were varied but desirable. We concluded that it is high time to change our medical examination system so that it evaluates the clinical skill performance of medical students.
Citations to this article, as recorded by CrossRef:
The impact of introducing the Korean Medical Licensing Examination clinical skills assessment on medical education. Hoon-Ki Park. Journal of the Korean Medical Association. 2012;55(2):116.
Analysis of first clinical skills examination in the Korean Medical Licensing Examination: focus on examinees' experience in a medical school. Kyung Ae Jun, Sang Yop Shin. Korean Journal of Medical Education. 2011;23(3):203.
The results of the 64th and 65th Korean Medical Licensing Examinations were analyzed according to both classical test theory and item response theory in order to assess the feasibility of applying item response theory to item analysis and to suggest its applicability to computerized adaptive testing. Correlation coefficients for the difficulty index, the discrimination index, and the ability parameter between the two kinds of analysis were obtained using the computer programs Analyst 4.0, Bilog, and Xcalibre. Correlation coefficients for the difficulty index were 0.75 or higher; those for the discrimination index ranged from -0.023 to 0.753; those for the ability parameter were 0.90 or higher. These results suggest that item analysis according to item response theory yields results comparable to those of classical test theory, except for the discrimination index. Since the ability parameter is most widely used in criterion-referenced testing, the high correlation between the ability parameter and the total score supports the validity of computerized adaptive testing based on item response theory.
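As an illustrative aside, the classical test theory indices compared in this study can be computed directly from a binary item-response matrix. The sketch below uses hypothetical toy data (the study itself used Analyst 4.0, Bilog, and Xcalibre): the difficulty index is the proportion of examinees answering an item correctly, and the discrimination index is taken here as the corrected item-total (point-biserial) correlation.

```python
import numpy as np

def ctt_item_stats(responses):
    """Classical test theory item statistics.

    responses: (n_examinees, n_items) matrix of 0/1 scores.
    Returns (difficulty, discrimination) per item:
      - difficulty: proportion of examinees answering correctly
      - discrimination: correlation between the item score and the
        rest-of-test total (item removed to avoid inflating the index)
    """
    responses = np.asarray(responses, dtype=float)
    n_items = responses.shape[1]
    difficulty = responses.mean(axis=0)
    total = responses.sum(axis=1)
    discrimination = np.empty(n_items)
    for j in range(n_items):
        rest = total - responses[:, j]  # corrected total score
        discrimination[j] = np.corrcoef(responses[:, j], rest)[0, 1]
    return difficulty, discrimination

# Hypothetical data: 6 examinees x 3 items
resp = np.array([
    [1, 1, 1],
    [1, 1, 0],
    [1, 0, 1],
    [1, 0, 0],
    [0, 1, 0],
    [0, 0, 0],
])
p, r = ctt_item_stats(resp)
print("difficulty:", p)       # item 1 answered correctly by 4 of 6 examinees
print("discrimination:", r)
```

In a comparison like the one reported here, these classical indices would then be correlated, item by item, with the difficulty and discrimination parameters estimated under an item response theory model.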
Citations to this article, as recorded by CrossRef:
Analysis on validity and academic competency of mock test for Korean Medicine national licensing examination using item response theory. Han Chae, Eunbyul Cho, SeonKyoung Kim, DaHye Choi, Seul Lee. Keimyung Medical Journal. 2023;42(1):7.
Item difficulty index, discrimination index, and reliability of the 26 health professions licensing examinations in 2022, Korea: a psychometric study. Yoon Hee Kim, Bo Hyun Kim, Joonki Kim, Bokyoung Jung, Sangyoung Bae. Journal of Educational Evaluation for Health Professions. 2023;20:31.
Can computerized tests be introduced to the Korean Medical Licensing Examination? Sun Huh. Journal of the Korean Medical Association. 2012;55(2):124.