JEEHP : Journal of Educational Evaluation for Health Professions

OPEN ACCESS
SEARCH
Search

Search

Page Path
HOME > Search
11 "Mi Kyoung Yim"
Research articles
Similarity of the cut score in test sets with different item amounts using the modified Angoff, modified Ebel, and Hofstee standard-setting methods for the Korean Medical Licensing Examination  
Janghee Park, Mi Kyoung Yim, Na Jin Kim, Duck Sun Ahn, Young-Min Kim
J Educ Eval Health Prof. 2020;17:28.   Published online October 5, 2020
DOI: https://doi.org/10.3352/jeehp.2020.17.28
  • 7,064 View
  • 199 Download
  • 7 Web of Science
  • 6 Crossref
Abstract
Purpose
The Korean Medical Licensing Examination (KMLE) typically contains a large number of items. The purpose of this study was to investigate whether the cut score differs between standard setting based on all items of the exam and standard setting based on only a portion of the items.
Methods
We divided the item sets from the 3 most recent KMLEs into 4 subsets per year, each containing 25% of the items, based on item content categories, discrimination indices, and difficulty indices. The entire panel of 15 members assessed all items of the 2017 examination (360 items, 100%). In split-half set 1, each item set contained 184 items (51%) from the 2018 examination, and in split-half set 2, each item set contained 182 items (51%) from the 2019 examination, constructed using the same method. We used the modified Angoff, modified Ebel, and Hofstee methods in the standard-setting process.
Results
A cut score difference of less than 1% was observed when the same standard-setting method was applied to item subsets containing 25%, 51%, or 100% of the entire item set. Rater reliability was higher when fewer items were rated.
Conclusion
When the entire item set was divided into equivalent subsets, standard setting based on a portion of the items (90 of 360) yielded cut scores similar to those derived from the entire item set, along with a higher correlation between panelists' individual assessments and the overall assessment.
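For readers who want a concrete picture of the subset construction described in the Methods, the following is a minimal sketch, not the authors' code; the column names (content_category, difficulty, discrimination) are illustrative assumptions.

```python
# Sketch: divide an item bank into equivalent subsets (e.g., 4 x 25%)
# balanced on content category, difficulty index, and discrimination index.
# Column names are illustrative assumptions, not the study's variable names.
import pandas as pd

def split_into_equivalent_subsets(items: pd.DataFrame, n_subsets: int = 4) -> pd.DataFrame:
    """Return the item table with a 'subset' column added."""
    ordered = items.sort_values(["content_category", "difficulty", "discrimination"]).copy()
    # Deal items round-robin through the sorted order so each subset receives
    # a similar mix of content categories and item statistics.
    ordered["subset"] = [i % n_subsets for i in range(len(ordered))]
    return ordered
```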

Citations

Citations to this article as recorded by  
  • Application of computer-based testing in the Korean Medical Licensing Examination, the emergence of the metaverse in medical education, journal metrics and statistics, and appreciation to reviewers and volunteers
    Sun Huh
    Journal of Educational Evaluation for Health Professions.2022; 19: 2.     CrossRef
  • Possibility of using the yes/no Angoff method as a substitute for the percent Angoff method for estimating the cutoff score of the Korean Medical Licensing Examination: a simulation study
    Janghee Park
    Journal of Educational Evaluation for Health Professions.2022; 19: 23.     CrossRef
  • Equal Z standard-setting method to estimate the minimum number of panelists for a medical school’s objective structured clinical examination in Taiwan: a simulation study
    Ying-Ying Yang, Pin-Hsiang Huang, Ling-Yu Yang, Chia-Chang Huang, Chih-Wei Liu, Shiau-Shian Huang, Chen-Huan Chen, Fa-Yauh Lee, Shou-Yen Kao, Boaz Shulruf
    Journal of Educational Evaluation for Health Professions.2022; 19: 27.     CrossRef
  • Possibility of independent use of the yes/no Angoff and Hofstee methods for the standard setting of the Korean Medical Licensing Examination written test: a descriptive study
    Do-Hwan Kim, Ye Ji Kang, Hoon-Ki Park
    Journal of Educational Evaluation for Health Professions.2022; 19: 33.     CrossRef
  • Presidential address: Quarantine guidelines to protect examinees from coronavirus disease 2019, clinical skills examination for dental licensing, and computer-based testing for medical, dental, and oriental medicine licensing
    Yoon-Seong Lee
    Journal of Educational Evaluation for Health Professions.2021; 18: 1.     CrossRef
  • Comparing the cut score for the borderline group method and borderline regression method with norm-referenced standard setting in an objective structured clinical examination in medical school in Korea
    Song Yi Park, Sang-Hwa Lee, Min-Jeong Kim, Ki-Hwan Ji, Ji Ho Ryu
    Journal of Educational Evaluation for Health Professions.2021; 18: 25.     CrossRef
Using the Angoff method to set a standard on mock exams for the Korean Nursing Licensing Examination  
Mi Kyoung Yim, Sujin Shin
J Educ Eval Health Prof. 2020;17:14.   Published online April 22, 2020
DOI: https://doi.org/10.3352/jeehp.2020.17.14
  • 8,746 View
  • 202 Download
  • 9 Web of Science
  • 7 Crossref
Abstract
Purpose
This study explored the possibility of using the Angoff method, in which an expert panel determines the cut score of an exam, for the Korean Nursing Licensing Examination (KNLE). Two mock exams for the KNLE were analyzed using an Angoff standard-setting procedure, and the procedural validity of applying the Angoff method in this context was also examined.
Methods
For both mock exams, we set a pass-fail cut score using the Angoff method. The standard setting panel consisted of 16 nursing professors. After the Angoff procedure, the procedural validity of establishing the standard was evaluated by investigating the responses of the standard setters.
Results
The descriptions of the minimally competent person for the KNLE were presented at the levels of general and subject-specific performance. The cut scores of the first and second mock exams were 74.4 and 76.8, respectively. These were higher than the traditional cut score (60% of the total score of the KNLE). The panel survey showed very positive responses, with scores higher than 4 out of 5 points on a Likert scale.
Conclusion
The cut scores calculated for the two mock exams were similar and were much higher than the existing cut score. In the second mock exam, the standard deviation of the Angoff ratings was lower than in the first. According to the survey results, procedural validity was acceptable, as shown by a high level of panelist confidence. These results show that determination of cut scores by an expert panel is applicable to the KNLE.
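As a rough illustration of the percent-Angoff computation referred to above, here is a short sketch under assumed data (panelists rating item probabilities), not the study's code or results.

```python
# Sketch of the percent-Angoff cut score: each panelist judges, per item, the
# probability that a minimally competent examinee answers correctly; the cut
# score is the mean of the panelists' expected percent scores.
import numpy as np

def percent_angoff(ratings: np.ndarray) -> tuple[float, float]:
    """ratings: panelists x items matrix of judged probabilities in [0, 1].
    Returns (cut score in percent, SD of panelists' expected percent scores)."""
    expected_pct = 100 * ratings.sum(axis=1) / ratings.shape[1]
    return float(expected_pct.mean()), float(expected_pct.std(ddof=1))
```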

Citations

Citations to this article as recorded by  
  • Experts’ prediction of item difficulty of multiple-choice questions in the Ethiopian Undergraduate Medicine Licensure Examination
    Shewatatek Gedamu Wonde, Tefera Tadesse, Belay Moges, Stefan K. Schauber
    BMC Medical Education.2024;[Epub]     CrossRef
  • Comparing Estimated and Real Item Difficulty Using Multi-Facet Rasch Analysis
    Ayfer SAYIN, Sebahat GÖREN
    Eğitimde ve Psikolojide Ölçme ve Değerlendirme Dergisi.2023; 14(4): 440.     CrossRef
  • Application of computer-based testing in the Korean Medical Licensing Examination, the emergence of the metaverse in medical education, journal metrics and statistics, and appreciation to reviewers and volunteers
    Sun Huh
    Journal of Educational Evaluation for Health Professions.2022; 19: 2.     CrossRef
  • Possibility of using the yes/no Angoff method as a substitute for the percent Angoff method for estimating the cutoff score of the Korean Medical Licensing Examination: a simulation study
    Janghee Park
    Journal of Educational Evaluation for Health Professions.2022; 19: 23.     CrossRef
  • Development of examination objectives based on nursing competency for the Korean Nursing Licensing Examination: a validity study
    Sujin Shin, Gwang Suk Kim, Jun-Ah Song, Inyoung Lee
    Journal of Educational Evaluation for Health Professions.2022; 19: 19.     CrossRef
  • Possibility of independent use of the yes/no Angoff and Hofstee methods for the standard setting of the Korean Medical Licensing Examination written test: a descriptive study
    Do-Hwan Kim, Ye Ji Kang, Hoon-Ki Park
    Journal of Educational Evaluation for Health Professions.2022; 19: 33.     CrossRef
  • Comparing the cut score for the borderline group method and borderline regression method with norm-referenced standard setting in an objective structured clinical examination in medical school in Korea
    Song Yi Park, Sang-Hwa Lee, Min-Jeong Kim, Ki-Hwan Ji, Ji Ho Ryu
    Journal of Educational Evaluation for Health Professions.2021; 18: 25.     CrossRef
The relationship of examinees’ individual characteristics and perceived acceptability of smart device-based testing to test scores on the practice test of the Korea Emergency Medicine Technician Licensing Examination  
Eun Young Lim, Mi Kyoung Yim, Sun Huh
J Educ Eval Health Prof. 2018;15:33.   Published online December 27, 2018
DOI: https://doi.org/10.3352/jeehp.2018.15.33
  • 19,659 View
  • 237 Download
  • 2 Web of Science
  • 2 Crossref
Abstract
Purpose
Smart device-based testing (SBT) is being introduced into the Republic of Korea’s high-stakes examination system, starting with the Korean Emergency Medicine Technician Licensing Examination (KEMTLE) in December 2017. In order to minimize the effects of variation in examinees’ environment on test scores, this study aimed to identify any associations of variables related to examinees’ individual characteristics and their perceived acceptability of SBT with their SBT practice test scores.
Methods
Of the 569 candidate students who took the KEMTLE on September 12, 2015, 560 responded to a survey questionnaire on the acceptability of SBT after the examination. The questionnaire addressed 8 individual characteristics and contained 2 satisfaction, 9 convenience, and 9 preference items. A comparative analysis according to individual variables was performed. Furthermore, a generalized linear model (GLM) analysis was conducted to identify the effects of individual characteristics and perceived acceptability of SBT on test scores.
Results
Among those who preferred SBT over paper-and-pencil testing, test scores were higher for male participants (mean ± standard deviation [SD], 4.36 ± 0.72) than for female participants (mean ± SD, 4.21 ± 0.73). According to the GLM, no variables evaluated (including gender and experience with computer-based testing, SBT, or using a tablet PC) showed a statistically significant relationship with the total score, scores on multimedia items, or scores on text items.
Conclusion
Individual characteristics and perceived acceptability of SBT did not affect the SBT practice test scores of emergency medicine technician students in Korea. It should be possible to adopt SBT for the KEMTLE without interference from the variables examined in this study.
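A hedged sketch of the kind of generalized linear model described in the Methods follows; the formula, column names, and Gaussian family are assumptions for illustration, not the study's specification.

```python
# Sketch: GLM relating SBT practice-test scores to examinee characteristics
# and perceived acceptability of SBT. Column names are hypothetical.
import statsmodels.api as sm
import statsmodels.formula.api as smf

def fit_score_glm(df):
    model = smf.glm(
        "total_score ~ C(gender) + C(cbt_experience) + C(sbt_experience)"
        " + C(tablet_experience) + preference_score",
        data=df,
        family=sm.families.Gaussian(),  # identity link for continuous scores
    )
    return model.fit()  # inspect .summary() for coefficients and p-values
```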

Citations

Citations to this article as recorded by  
  • Application of computer-based testing in the Korean Medical Licensing Examination, the emergence of the metaverse in medical education, journal metrics and statistics, and appreciation to reviewers and volunteers
    Sun Huh
    Journal of Educational Evaluation for Health Professions.2022; 19: 2.     CrossRef
  • Evaluation of Student Satisfaction with Ubiquitous-Based Tests in Women’s Health Nursing Course
    Mi-Young An, Yun-Mi Kim
    Healthcare.2021; 9(12): 1664.     CrossRef
Comparison of standard-setting methods for the Korean Radiological Technologist Licensing Examination: Angoff, Ebel, bookmark, and Hofstee  
Janghee Park, Duck-Sun Ahn, Mi Kyoung Yim, Jaehyoung Lee
J Educ Eval Health Prof. 2018;15:32.   Published online December 26, 2018
DOI: https://doi.org/10.3352/jeehp.2018.15.32
  • 19,786 View
  • 260 Download
  • 11 Web of Science
  • 9 Crossref
Abstract
Purpose
This study aimed to compare the possible standard-setting methods for the Korean Radiological Technologist Licensing Examination, which has a fixed cut score, and to suggest the most appropriate method.
Methods
Six radiological technology professors set standards for 250 items on the Korean Radiological Technologist Licensing Examination administered in December 2016 using the Angoff, Ebel, bookmark, and Hofstee methods.
Results
With a maximum possible score of 100 percent, the cut score for the examination was 71.27 using the Angoff method, 62.2 using the Ebel method, 64.49 using the bookmark method, and 62 using the Hofstee method. Based on the Hofstee method, an acceptable cut score for the examination would be between 52.83 and 70, whereas the cut score obtained with the Angoff method (71.27) fell outside this range.
Conclusion
The above results suggest that the best standard-setting method to determine the cut score would be a panel discussion with the modified Angoff or Ebel method, with verification of the rated results by the Hofstee method. Since no standard-setting method has yet been adopted for the Korean Radiological Technologist Licensing Examination, this study can provide practical guidance for introducing a standard-setting process.
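To make the Hofstee compromise concrete, here is a minimal sketch of the usual computation (an illustration only, not the study's implementation): the cut score is taken where the observed fail-rate curve meets the diagonal of the rectangle defined by the panel's acceptable cut-score and fail-rate bounds.

```python
# Sketch of the Hofstee compromise method.
import numpy as np

def hofstee_cut_score(scores, c_min, c_max, f_min, f_max):
    """scores: examinee percent scores; c_min/c_max: panel's acceptable
    cut-score bounds; f_min/f_max: acceptable fail-rate bounds (proportions)."""
    scores = np.asarray(scores, dtype=float)
    grid = np.linspace(c_min, c_max, 1001)
    fail_rate = np.array([(scores < c).mean() for c in grid])
    # Diagonal of the Hofstee rectangle, from (c_min, f_max) to (c_max, f_min).
    diagonal = f_max + (grid - c_min) * (f_min - f_max) / (c_max - c_min)
    return float(grid[np.argmin(np.abs(fail_rate - diagonal))])
```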

Citations

Citations to this article as recorded by  
  • Setting standards for a diagnostic test of aviation English for student pilots
    Maria Treadaway, John Read
    Language Testing.2024; 41(3): 557.     CrossRef
  • Third Year Veterinary Student Academic Encumbrances and Tenacity: Navigating Clinical Skills Curricula and Assessment
    Saundra H. Sample, Elpida Artemiou, Darlene J. Donszelmann, Cindy Adams
    Journal of Veterinary Medical Education.2024;[Epub]     CrossRef
  • The challenges inherent with anchor-based approaches to the interpretation of important change in clinical outcome assessments
    Kathleen W. Wyrwich, Geoffrey R. Norman
    Quality of Life Research.2023; 32(5): 1239.     CrossRef
  • Possibility of independent use of the yes/no Angoff and Hofstee methods for the standard setting of the Korean Medical Licensing Examination written test: a descriptive study
    Do-Hwan Kim, Ye Ji Kang, Hoon-Ki Park
    Journal of Educational Evaluation for Health Professions.2022; 19: 33.     CrossRef
  • Comparison of the validity of bookmark and Angoff standard setting methods in medical performance tests
    Majid Yousefi Afrashteh
    BMC Medical Education.2021;[Epub]     CrossRef
  • Comparing the cut score for the borderline group method and borderline regression method with norm-referenced standard setting in an objective structured clinical examination in medical school in Korea
    Song Yi Park, Sang-Hwa Lee, Min-Jeong Kim, Ki-Hwan Ji, Ji Ho Ryu
    Journal of Educational Evaluation for Health Professions.2021; 18: 25.     CrossRef
  • Using the Angoff method to set a standard on mock exams for the Korean Nursing Licensing Examination
    Mi Kyoung Yim, Sujin Shin
    Journal of Educational Evaluation for Health Professions.2020; 17: 14.     CrossRef
  • Performance of the Ebel standard-setting method for the spring 2019 Royal College of Physicians and Surgeons of Canada internal medicine certification examination consisting of multiple-choice questions
    Jimmy Bourque, Haley Skinner, Jonathan Dupré, Maria Bacchus, Martha Ainslie, Irene W. Y. Ma, Gary Cole
    Journal of Educational Evaluation for Health Professions.2020; 17: 12.     CrossRef
  • Similarity of the cut score in test sets with different item amounts using the modified Angoff, modified Ebel, and Hofstee standard-setting methods for the Korean Medical Licensing Examination
    Janghee Park, Mi Kyoung Yim, Na Jin Kim, Duck Sun Ahn, Young-Min Kim
    Journal of Educational Evaluation for Health Professions.2020; 17: 28.     CrossRef
Perception survey on the introduction of clinical performance examination as part of the national nursing licensing examination in Korea  
Su Jin Shin, Yeong Kyeong Kim, Soon-Rim Suh, Duk Yoo Jung, Yunju Kim, Mi Kyoung Yim
J Educ Eval Health Prof. 2017;14:26.   Published online October 25, 2017
DOI: https://doi.org/10.3352/jeehp.2017.14.26
  • 32,456 View
  • 304 Download
  • 2 Web of Science
  • 5 Crossref
Abstract
Purpose
The purpose of this study was to analyze opinions about the action plan for implementing a clinical performance exam as part of the national nursing licensing examination, and to present the expected effects of the performance exam and the aspects to consider regarding its implementation.
Methods
This study used a mixed-methods design. Quantitative data were collected by a questionnaire survey, while qualitative data were collected by focus group interviews with experts. The survey targeted 200 nursing professors and clinical nurses with more than 5 years of work experience, and the focus group interviews were conducted with 28 participants, comprising professors, clinical instructors, and hospital nurses.
Results
First, nursing professors and clinical specialists agreed that the current written tests have limitations in evaluating examinees' ability, and that the introduction of a clinical performance exam would yield positive results. A clinical performance exam is necessary to evaluate and improve nurses' work ability, which means that implementing a performance exam is advisable if its credibility and validity can be verified. Second, most respondents chose direct performance exams using simulators or standardized patients as the most suitable format of the test.
Conclusion
In conclusion, the current national nursing licensing exam is somewhat limited in its ability to identify competent nurses. Thus, the time has come to seriously consider the introduction of a performance exam. The prerequisites for successfully implementing a clinical performance exam as part of the national nursing licensing exam are a professional training process and the formation of a consortium to standardize practical training.

Citations

Citations to this article as recorded by  
  • The Clinical Nursing Competency Assessment System of Ghana: Perspectives of Key Informants
    Oboshie Anim-Boamah, Christmal Dela Christmals, Susan Jennifer Armstrong
    Sage Open.2022;[Epub]     CrossRef
  • Adaptation of Extended Reality Smart Glasses for Core Nursing Skill Training Among Undergraduate Nursing Students: Usability and Feasibility Study
    Sun Kyung Kim, Youngho Lee, Hyoseok Yoon, Jongmyung Choi
    Journal of Medical Internet Research.2021; 23(3): e24313.     CrossRef
  • Nursing Students’ Experiences on Clinical Competency Assessment in Ghana
    Oboshie Anim-Boamah, Christmal Dela Christmals, Susan Jennifer Armstrong
    Nurse Media Journal of Nursing.2021; 11(3): 278.     CrossRef
  • Clinical nursing competency assessment: a scoping review
    Oboshie Anim-Boamah, Christmal Dela Christmals, Susan Jennifer Armstrong
    Frontiers of Nursing.2021; 8(4): 341.     CrossRef
  • Factors Influencing the Success of the National Nursing Competency Examination taken by the Nursing Diploma Students in Yogyakarta
    Yulia Wardani
    Jurnal Ners.2020; 14(2): 172.     CrossRef
Brief report
Smart device-based testing for medical students in Korea: satisfaction, convenience, and advantages  
Eun Young Lim, Mi Kyoung Yim, Sun Huh
J Educ Eval Health Prof. 2017;14:7.   Published online April 24, 2017
DOI: https://doi.org/10.3352/jeehp.2017.14.7
  • 34,324 View
  • 293 Download
  • 7 Web of Science
  • 11 Crossref
Abstract
The aim of this study was to investigate respondents’ satisfaction with smart device-based testing (SBT), as well as its convenience and advantages, in order to improve its implementation. The survey was conducted among 108 junior medical students at Kyungpook National University School of Medicine, Korea, who took a practice licensing examination using SBT in September 2015. The survey contained 28 items scored using a 5-point Likert scale. The items were divided into the following three categories: satisfaction with SBT administration, convenience of SBT features, and advantages of SBT compared to paper-and-pencil testing or computer-based testing. The reliability of the survey was 0.95. Of the three categories, the convenience of the SBT features received the highest mean (M) score (M = 3.75, standard deviation [SD] = 0.69), while the category of satisfaction with SBT received the lowest (M = 3.13, SD = 1.07). No statistically significant differences across these categories with respect to sex, age, or experience were observed. These results indicate that SBT was practical and effective to take and to administer.
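The reliability of 0.95 reported above is presumably an internal-consistency estimate for the 28 Likert items; the following is a small sketch of Cronbach's alpha under an assumed data layout, offered only as background and not as the study's computation.

```python
# Sketch: Cronbach's alpha for a respondents x items matrix of Likert
# responses (1-5). Illustration only; the study's computation may differ.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)
```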

Citations

Citations to this article as recorded by  
  • (NON)COMPUTER-ORIENTED TESTING IN HIGHER EDUCATION: VIEWS OF THE PARTICIPANTS OF THE EDUCATIONAL PROCESS ON (IN)CONVENIENCE USING
    Volodymyr Starosta
    OPEN EDUCATIONAL E-ENVIRONMENT OF MODERN UNIVERSITY.2024; (16): 173.     CrossRef
  • Survey of dental students’ perception of ubiquitous-based test (UBT)
    Hyoung Seok Shin, Jae-Hoon Kim
    The Journal of The Korean Dental Association.2024; 62(5): 270.     CrossRef
  • Development and application of a mobile-based multimedia nursing competency evaluation system for nursing students: A mixed-method randomized controlled study
    Soyoung Jang, Eunyoung E. Suh
    Nurse Education in Practice.2022; 64: 103458.     CrossRef
  • Effects of School-Based Exercise Program on Obesity and Physical Fitness of Urban Youth: A Quasi-Experiment
    Ji Hwan Song, Ho Hyun Song, Sukwon Kim
    Healthcare.2021; 9(3): 358.     CrossRef
  • Development, Application, and Effectiveness of a Smart Device-based Nursing Competency Evaluation Test
    Soyoung Jang, Eunyoung E. Suh
    CIN: Computers, Informatics, Nursing.2021; 39(11): 634.     CrossRef
  • Evaluation of Student Satisfaction with Ubiquitous-Based Tests in Women’s Health Nursing Course
    Mi-Young An, Yun-Mi Kim
    Healthcare.2021; 9(12): 1664.     CrossRef
  • How to Deal with the Concept of Authorship and the Approval of an Institutional Review Board When Writing and Editing Journal Articles
    Sun Huh
    Laboratory Medicine and Quality Assurance.2020; 42(2): 63.     CrossRef
  • Evaluation of usefulness of smart device-based testing: a survey study of Korean medical students
    Youngsup Christopher Lee, Oh Young Kwon, Ho Jin Hwang, Seok Hoon Ko
    Korean Journal of Medical Education.2020; 32(3): 213.     CrossRef
  • Presidential address: Preparing for permanent test centers and computerized adaptive testing
    Chang Hwi Kim
    Journal of Educational Evaluation for Health Professions.2018; 15: 1.     CrossRef
  • Journal Metrics of Infection & Chemotherapy and Current Scholarly Journal Publication Issues
    Sun Huh
    Infection & Chemotherapy.2018; 50(3): 219.     CrossRef
  • The relationship of examinees’ individual characteristics and perceived acceptability of smart device-based testing to test scores on the practice test of the Korea Emergency Medicine Technician Licensing Examination
    Eun Young Lim, Mi Kyoung Yim, Sun Huh
    Journal of Educational Evaluation for Health Professions.2018; 15: 33.     CrossRef
Technical Report
Reforms of the Korean Medical Licensing Examination regarding item development and performance evaluation  
Mi Kyoung Yim
J Educ Eval Health Prof. 2015;12:6.   Published online March 17, 2015
DOI: https://doi.org/10.3352/jeehp.2015.12.6
  • 36,838 View
  • 194 Download
  • 6 Web of Science
  • 7 Crossref
Abstract
Purpose
The Korean Medical Licensing Examination (KMLE) has undergone a variety of innovative reforms implemented by the National Health Personnel Licensing Examination Board (NHPLEB) in order to make it a competency-based test. The purpose of this article is to describe the ways in which the KMLE has been reformed and the effect of those innovations on medical education in Korea.
Methods
Changes in the KMLE were traced from 1994 to 2014 by reviewing the adoption of new policies by the NHPLEB and the relevant literature.
Results
The most important reforms that turned the examination into a competency-based test were the following: first, the subjects tested on the exam were revised; second, R-type items were introduced; third, the proportion of items involving problem-solving skills was increased; and fourth, a clinical skills test was introduced in addition to the written test. The literature shows that the above reforms have resulted in more rigorous licensure standards and have improved the educational environment of medical schools in Korea.
Conclusion
The reforms of the KMLE have led to improvements in how the competency of examinees is evaluated, as well as improvements in the educational system in medical schools in Korea.

Citations

Citations to this article as recorded by  
  • Navigating protean career paths in medical education: insights from outstanding medical educators in South Korea
    Bora Lee, Danbi Lee, Hyekyung Shin, Sohee Park, Eunbae B. Yang
    BMC Medical Education.2024;[Epub]     CrossRef
  • A Survey on Perceptions of the Direction of Korean Medicine Education and National Licensing Examination
    Han-Byul Cho, Won-Suk Sung, Jiseong Hong, Yeonseok Kang, Eun-Jung Kim
    Healthcare.2023; 11(12): 1685.     CrossRef
  • Impact of anesthetist licensing examination on quality of education in Ethiopia: a qualitative study of faculty and student perceptions
    Yohannes Molla Asemu, Tegbar Yigzaw, Firew Ayalew Desta, Tewodros Abebaw Melese, Leulayehu Akalu Gemeda, Fedde Scheele, Thomas van den Akker
    BMC Medical Education.2023;[Epub]     CrossRef
  • Attitudes to proposed assessment of pharmacy skills in Korean pharmacist licensure examination
    Joo Hee Kim, Ju-Yeun Lee, Young Sook Lee, Chul-Soon Yong, Nayoung Han, Hye Sun Gwak, Jungmi Oh, Byung Koo Lee, Sukhyang Lee
    Journal of Educational Evaluation for Health Professions.2017; 14: 6.     CrossRef
  • Is there an agreement among the items of the Korean physical therapist licensing examination, learning objectives of class subjects, and physical therapists’ job descriptions?
    Min-Hyeok Kang, Oh-Yun Kwon, Yong-Wook Kim, Ji-Won Kim, Tae-Ho Kim, Tae-Young Oh, Jong-Hyuk Weon, Tae-Sik Lee, Jae-Seop Oh
    Journal of Educational Evaluation for Health Professions.2016; 13: 3.     CrossRef
  • The past, present, and future of traditional medicine education in Korea
    Sang Yun Han, Hee Young Kim, Jung Hwa Lim, Jinhong Cheon, Young Kyu Kwon, Hyungwoo Kim, Gi Young Yang, Han Chae
    Integrative Medicine Research.2016; 5(2): 73.     CrossRef
  • Medical students’ satisfaction with the Applied Basic Clinical Seminar with Scenarios for Students, a novel simulation-based learning method in Greece
    Panteleimon Pantelidis, Nikolaos Staikoglou, Georgios Paparoidamis, Christos Drosos, Stefanos Karamaroudis, Athina Samara, Christodoulos Keskinis, Michail Sideris, George Giannakoulas, Georgios Tsoulfas, Asterios Karagiannis
    Journal of Educational Evaluation for Health Professions.2016; 13: 13.     CrossRef
Research Articles
Exploration of examinees’ traits that affect the score of Korean Medical Licensing Examination  
Mi Kyoung Yim
J Educ Eval Health Prof. 2015;12:5.   Published online March 16, 2015
DOI: https://doi.org/10.3352/jeehp.2015.12.5
  • 28,806 View
  • 166 Download
  • 2 Crossref
Abstract
Purpose
This study aimed to identify the effects of five variables on the scores of the Korean Medical Licensing Examination (KMLE) over three consecutive years, from 2011 to 2013.
Methods
The number of examinees for each examination was 3,364 in 2011, 3,177 in 2012, and 3,287 in 2013. Five characteristics of examinees were set as variables: gender, age, graduation status, written test result (pass or fail), and city of medical school. A regression model was established, with the score of the written test as the dependent variable and the examinees' traits as independent variables.
Results
The regression coefficients of all variables, except the city of medical school, were statistically significant. The effects of the variables in the three examinations appeared in the following order: result of the written test, graduation status, age, gender, and city of medical school.
Conclusion
Analysis of the written test scores of the KMLE revealed that female students, younger examinees, and first-time examinees performed better.
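A brief sketch of the regression described in the Methods is given below; the variable names are assumptions, and ordinary least squares is used here purely for illustration.

```python
# Sketch: regression of KMLE written-test scores on examinee traits.
import statsmodels.formula.api as smf

def fit_trait_model(df):
    """Assumed columns: written_score, gender, age, graduation_status,
    prior_written_result, school_city."""
    model = smf.ols(
        "written_score ~ C(gender) + age + C(graduation_status)"
        " + C(prior_written_result) + C(school_city)",
        data=df,
    )
    return model.fit()  # coefficients give each trait's estimated effect
```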

Citations

Citations to this article as recorded by  
  • Gender bias in the medical school admission system in Japan
    Kayo Fukami, Kae Okoshi, Yasuko Tomizawa
    SN Social Sciences.2022;[Epub]     CrossRef
  • A Comparative Study of Predictive Factors for Passing the National Physical Therapy Examination using Logistic Regression Analysis and Decision Tree Analysis
    So Hyun Kim, Sung Hyoun Cho
    Physical Therapy Rehabilitation Science.2022; 11(3): 285.     CrossRef
Test Equating of the Medical Licensing Examination in 2003 and 2004 Based on the Item Response Theory
Mi Kyoung Yim, Sun Huh
J Educ Eval Health Prof. 2006;3:2.   Published online July 31, 2006
DOI: https://doi.org/10.3352/jeehp.2006.3.2
  • 30,596 View
  • 137 Download
  • 2 Crossref
Abstract
The passing rate of the Medical Licensing Examination has been variable, which probably originated from differences in the difficulty of items and/or differences in the ability level of examinees. We tried to explain the origin of this difference using a test equating method based on item response theory. The numbers of items and examinees were 500 and 3,647 in 2003, and 550 and 3,879 in 2004. A common-item nonequivalent-groups design was used with 30 common items. Item and ability parameters were calculated with the three-parameter logistic model using ICL. Scale transformation and true score equating were executed using ST and PIE. The mean difficulty index of the year 2003 was −0.957 (SD 2.628) and that of 2004 after equating was −1.456 (SD 3.399). The mean discrimination index of the year 2003 was 0.487 (SD 0.242) and that of 2004 was 0.363 (SD 0.193). The mean ability parameter of the year 2003 was 0.00617 (SD 0.96605) and that of the year 2004 was 0.94636 (SD 1.32960). The difference in the equated true scores at the same ability level was large in the score range of 200–250. The difference in passing rates over the two consecutive years arose because the examination in 2004 was easier and the abilities of the examinees in 2004 were higher. In addition, the passing rates of examinees with scores of 270–294 in 2003, and those with scores of 322–343 in 2004, were affected by the examination year.
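The study carried out the scale transformation and true score equating with the ST and PIE programs; as background only, the following is a minimal sketch of the mean/sigma scale transformation that underlies common-item equating, not the study's software or results.

```python
# Sketch: mean/sigma transformation for a common-item nonequivalent-groups
# design. b_old/b_new are the common items' difficulty parameters on the
# old-form and new-form scales.
import numpy as np

def mean_sigma_constants(b_old: np.ndarray, b_new: np.ndarray) -> tuple[float, float]:
    A = b_old.std(ddof=1) / b_new.std(ddof=1)
    B = b_old.mean() - A * b_new.mean()
    return float(A), float(B)

def to_old_scale(a, b, theta, A, B):
    """Place new-form discriminations, difficulties, and abilities on the
    old-form scale: a* = a / A, b* = A*b + B, theta* = A*theta + B."""
    return a / A, A * b + B, A * theta + B
```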

Citations

Citations to this article as recorded by  
  • Comparison of proficiency in an anesthesiology course across distinct medical student cohorts: Psychometric approaches to test equating
    Shu-Wei Liao, Kuang-Yi Chang, Chien-Kun Ting, Mei-Yung Tsou, En-Tzu Chen, Kwok-Hon Chan, Wen-Kuei Chang
    Journal of the Chinese Medical Association.2014; 77(3): 150.     CrossRef
  • Can computerized tests be introduced to the Korean Medical Licensing Examination?
    Sun Huh
    Journal of the Korean Medical Association.2012; 55(2): 124.     CrossRef
Original Articles
Construct Validity of the Korean Dental Licensing Examination using Confirmatory Factor Analysis
Mi Kyoung Yim, Yoon Hee Kim
J Educ Eval Health Prof. 2005;2(1):75-86.   Published online June 30, 2005
DOI: https://doi.org/10.3352/jeehp.2005.2.1.75
  • 34,871 View
  • 164 Download
Abstract
Confirmatory factor analysis based on a measurement model of a structural equation model was used to test the construct validity of the 13 subjects in the Korean Dental Licensing Examination (KDLE). The results of 1,086 examinees who took the KDLE in 2004 were analyzed. The 13 subjects were classified into 62 major categories and 122 intermediate categories, with 364 items in total. A hierarchical model was constructed, including the major and intermediate categories. The impact of the variables was determined by the standardized regression coefficients relating latent and measured variables in the measurement model. The KDLE showed a high goodness of fit, with a root mean square error of approximation of 0.030 and a non-normed fit index of 0.998. When the latent variables for the major and intermediate categories were analyzed, the standardized regression coefficients of all of the subjects, with the exception of Health and Medical Legislation, were significant. From these results, we concluded that the 13 subjects showed construct validity and that the model was highly compatible with the data. The subject Health and Medical Legislation had a low explanatory impact with respect to testing the ability of dentists to perform their jobs. This study suggests that similar psychometric studies are needed before integrating or deleting subjects on the KDLE and to improve item development.
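For reference, the two fit indices reported above (RMSEA and the non-normed fit index) can be computed from the model and null-model chi-square statistics; the sketch below shows the standard formulas with placeholder inputs, not the study's values.

```python
# Sketch: RMSEA and NNFI (Tucker-Lewis index) from chi-square statistics.
import math

def rmsea(chi2: float, df: int, n: int) -> float:
    """Root mean square error of approximation for sample size n."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def nnfi(chi2_m: float, df_m: int, chi2_0: float, df_0: int) -> float:
    """Non-normed fit index: model (m) versus independence/null model (0)."""
    return ((chi2_0 / df_0) - (chi2_m / df_m)) / ((chi2_0 / df_0) - 1.0)
```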
Applicability of Item Response Theory to the Korean Nurses' Licensing Examination
Geum-Hee Jeong, Mi Kyoung Yim
J Educ Eval Health Prof. 2005;2(1):23-29.   Published online June 30, 2005
DOI: https://doi.org/10.3352/jeehp.2005.2.1.23
  • 35,612 View
  • 168 Download
  • 3 Crossref
Abstract
To test the applicability of item response theory (IRT) to the Korean Nurses' Licensing Examination (KNLE), item analysis was performed after testing unidimensionality and goodness of fit, and the results were compared with those based on classical test theory. The results of the 330-item KNLE administered to 12,024 examinees in January 2004 were analyzed. Unidimensionality was tested using DETECT, and goodness of fit was tested using WINSTEPS for the Rasch model and Bilog-MG for the two-parameter logistic model. Item analysis and ability estimation were done using WINSTEPS. Using DETECT, Dmax ranged from 0.1 to 0.23 for each subject. The mean square infit and outfit values of all items from WINSTEPS ranged from 0.1 to 1.5, except for one item in pediatric nursing, which scored 1.53. Of the 330 items, 218 (42.7%) were misfit under the two-parameter logistic model of Bilog-MG. The correlation coefficients between the difficulty parameter from the Rasch model and the difficulty index from classical test theory ranged from 0.9039 to 0.9699, and the correlation between the ability parameter from the Rasch model and the total score from classical test theory ranged from 0.9776 to 0.9984. Therefore, the KNLE results satisfied unidimensionality and showed good fit to the Rasch model. The KNLE is thus a good candidate for analysis with the IRT Rasch model, and further research using IRT is feasible.
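A short, illustrative sketch of the classical-test-theory quantities that the abstract compares with the Rasch estimates is given below; the Rasch difficulty and ability parameters themselves would come from WINSTEPS or an equivalent fitting routine.

```python
# Sketch: classical difficulty (proportion correct) and total scores from a
# 0/1 response matrix, for correlation with Rasch difficulty and ability.
import numpy as np

def classical_summaries(responses: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """responses: examinees x items matrix of 0/1 item scores."""
    difficulty_index = responses.mean(axis=0)  # proportion correct per item
    total_scores = responses.sum(axis=1)       # total score per examinee
    return difficulty_index, total_scores

def pearson_r(x: np.ndarray, y: np.ndarray) -> float:
    return float(np.corrcoef(x, y)[0, 1])
```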

Citations

Citations to this article as recorded by  
  • Item difficulty index, discrimination index, and reliability of the 26 health professions licensing examinations in 2022, Korea: a psychometric study
    Yoon Hee Kim, Bo Hyun Kim, Joonki Kim, Bokyoung Jung, Sangyoung Bae
    Journal of Educational Evaluation for Health Professions.2023; 20: 31.     CrossRef
  • Study on the Academic Competency Assessment of Herbology Test using Rasch Model
    Han Chae, Soo Jin Lee, Chang-ho Han, Young Il Cho, Hyungwoo Kim
    Journal of Korean Medicine.2022; 43(2): 27.     CrossRef
  • Can computerized tests be introduced to the Korean Medical Licensing Examination?
    Sun Huh
    Journal of the Korean Medical Association.2012; 55(2): 124.     CrossRef
