JEEHP : Journal of Educational Evaluation for Health Professions

OPEN ACCESS
SEARCH
Search

Search

Page Path
HOME > Search
3 "Eun Young Lim"
Research article
The relationship of examinees’ individual characteristics and perceived acceptability of smart device-based testing to test scores on the practice test of the Korea Emergency Medicine Technician Licensing Examination  
Eun Young Lim, Mi Kyoung Yim, Sun Huh
J Educ Eval Health Prof. 2018;15:33.   Published online December 27, 2018
DOI: https://doi.org/10.3352/jeehp.2018.15.33
  • 19,659 View
  • 237 Download
  • 2 Web of Science
  • 2 Crossref
Abstract
Purpose
Smart device-based testing (SBT) is being introduced into the Republic of Korea’s high-stakes examination system, starting with the Korean Emergency Medicine Technician Licensing Examination (KEMTLE) in December 2017. In order to minimize the effects of variation in examinees’ environment on test scores, this study aimed to identify any associations of variables related to examinees’ individual characteristics and their perceived acceptability of SBT with their SBT practice test scores.
Methods
Of the 569 candidate students who took the KEMTLE on September 12, 2015, 560 responded to a survey questionnaire on the acceptability of SBT after the examination. The questionnaire addressed 8 individual characteristics and contained 2 satisfaction, 9 convenience, and 9 preference items. A comparative analysis according to individual variables was performed. Furthermore, a generalized linear model (GLM) analysis was conducted to identify the effects of individual characteristics and perceived acceptability of SBT on test scores.
Results
Among those who preferred SBT over paper-and-pencil testing, test scores were higher for male participants (mean ± standard deviation [SD], 4.36 ± 0.72) than for female participants (mean ± SD, 4.21 ± 0.73). According to the GLM, none of the variables evaluated, including gender and experience with computer-based testing, SBT, or tablet PC use, showed a statistically significant relationship with the total score, scores on multimedia items, or scores on text items.
Conclusion
Individual characteristics and perceived acceptability of SBT did not affect the SBT practice test scores of emergency medicine technician students in Korea. It should be possible to adopt SBT for the KEMTLE without interference from the variables examined in this study.
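As a rough illustration of the generalized linear model analysis described in this abstract, the sketch below fits a Gaussian GLM relating individual characteristics to the total practice test score. It is a minimal sketch in Python (pandas and statsmodels); the file name and column names are hypothetical placeholders, not the authors' actual data or code.

  # Hedged sketch: Gaussian GLM of individual characteristics vs. total SBT score.
  # The data file and all column names below are hypothetical.
  import pandas as pd
  import statsmodels.formula.api as smf

  df = pd.read_csv("sbt_practice_scores.csv")  # one row per examinee: survey answers plus scores

  # Gaussian GLM (default family) with categorical predictors for each characteristic
  model = smf.glm(
      "total_score ~ C(gender) + C(cbt_experience) + C(sbt_experience) + C(tablet_experience)",
      data=df,
  ).fit()

  print(model.summary())  # coefficient and p-value for each characteristic

A Gaussian GLM with the identity link is equivalent to ordinary least squares here; the summary table gives one coefficient and p-value per characteristic, which is the kind of output behind the conclusion that none of the variables was significantly related to the scores.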

Citations

Citations to this article as recorded by  
  • Application of computer-based testing in the Korean Medical Licensing Examination, the emergence of the metaverse in medical education, journal metrics and statistics, and appreciation to reviewers and volunteers
    Sun Huh
    Journal of Educational Evaluation for Health Professions.2022; 19: 2.     CrossRef
  • Evaluation of Student Satisfaction with Ubiquitous-Based Tests in Women’s Health Nursing Course
    Mi-Young An, Yun-Mi Kim
    Healthcare.2021; 9(12): 1664.     CrossRef
Brief report
Smart device-based testing for medical students in Korea: satisfaction, convenience, and advantages  
Eun Young Lim, Mi Kyoung Yim, Sun Huh
J Educ Eval Health Prof. 2017;14:7.   Published online April 24, 2017
DOI: https://doi.org/10.3352/jeehp.2017.14.7
  • 34,323 View
  • 293 Download
  • 7 Web of Science
  • 11 Crossref
Abstract
The aim of this study was to investigate respondents’ satisfaction with smart device-based testing (SBT), as well as its convenience and advantages, in order to improve its implementation. The survey was conducted among 108 junior medical students at Kyungpook National University School of Medicine, Korea, who took a practice licensing examination using SBT in September 2015. The survey contained 28 items scored using a 5-point Likert scale. The items were divided into the following three categories: satisfaction with SBT administration, convenience of SBT features, and advantages of SBT compared to paper-and-pencil testing or computer-based testing. The reliability of the survey was 0.95. Of the three categories, the convenience of the SBT features received the highest mean (M) score (M = 3.75, standard deviation [SD] = 0.69), while the category of satisfaction with SBT received the lowest (M = 3.13, SD = 1.07). No statistically significant differences across these categories were observed with respect to sex, age, or experience. These results indicate that SBT was practical and effective both to take and to administer.
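As an illustration of how the reported reliability of 0.95 could be computed, the following minimal sketch calculates Cronbach's alpha for a 28-item Likert survey in Python. It assumes the reliability figure refers to an internal-consistency coefficient; the response file and its layout are assumptions, not the authors' materials.

  # Hedged sketch: Cronbach's alpha for a 28-item, 5-point Likert survey.
  # The data file (108 respondents x 28 item columns) is hypothetical.
  import pandas as pd

  responses = pd.read_csv("sbt_survey_responses.csv")  # rows: respondents, columns: items

  def cronbach_alpha(items: pd.DataFrame) -> float:
      """Internal-consistency reliability of a set of survey items."""
      k = items.shape[1]
      item_var_sum = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
      total_var = items.sum(axis=1).var(ddof=1)        # variance of respondents' total scores
      return (k / (k - 1)) * (1 - item_var_sum / total_var)

  print(f"alpha = {cronbach_alpha(responses):.2f}")  # the abstract reports 0.95

Category means such as those reported above can then be obtained by averaging the item columns belonging to each category across respondents.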

Citations

Citations to this article as recorded by  
  • (Non)computer-oriented testing in higher education: views of the participants of the educational process on (in)convenience using
    Volodymyr Starosta
    OPEN EDUCATIONAL E-ENVIRONMENT OF MODERN UNIVERSITY.2024; (16): 173.     CrossRef
  • Survey of dental students’ perception of ubiquitous-based test (UBT)
    Hyoung Seok Shin, Jae-Hoon Kim
    The Journal of The Korean Dental Association.2024; 62(5): 270.     CrossRef
  • Development and application of a mobile-based multimedia nursing competency evaluation system for nursing students: A mixed-method randomized controlled study
    Soyoung Jang, Eunyoung E. Suh
    Nurse Education in Practice.2022; 64: 103458.     CrossRef
  • Effects of School-Based Exercise Program on Obesity and Physical Fitness of Urban Youth: A Quasi-Experiment
    Ji Hwan Song, Ho Hyun Song, Sukwon Kim
    Healthcare.2021; 9(3): 358.     CrossRef
  • Development, Application, and Effectiveness of a Smart Device-based Nursing Competency Evaluation Test
    Soyoung Jang, Eunyoung E. Suh
    CIN: Computers, Informatics, Nursing.2021; 39(11): 634.     CrossRef
  • Evaluation of Student Satisfaction with Ubiquitous-Based Tests in Women’s Health Nursing Course
    Mi-Young An, Yun-Mi Kim
    Healthcare.2021; 9(12): 1664.     CrossRef
  • How to Deal with the Concept of Authorship and the Approval of an Institutional Review Board When Writing and Editing Journal Articles
    Sun Huh
    Laboratory Medicine and Quality Assurance.2020; 42(2): 63.     CrossRef
  • Evaluation of usefulness of smart device-based testing: a survey study of Korean medical students
    Youngsup Christopher Lee, Oh Young Kwon, Ho Jin Hwang, Seok Hoon Ko
    Korean Journal of Medical Education.2020; 32(3): 213.     CrossRef
  • Presidential address: Preparing for permanent test centers and computerized adaptive testing
    Chang Hwi Kim
    Journal of Educational Evaluation for Health Professions.2018; 15: 1.     CrossRef
  • Journal Metrics of Infection & Chemotherapy and Current Scholarly Journal Publication Issues
    Sun Huh
    Infection & Chemotherapy.2018; 50(3): 219.     CrossRef
  • The relationship of examinees’ individual characteristics and perceived acceptability of smart device-based testing to test scores on the practice test of the Korea Emergency Medicine Technician Licensing Examination
    Eun Young Lim, Mi Kyoung Yim, Sun Huh
    Journal of Educational Evaluation for Health Professions.2018; 15: 33.     CrossRef
Original Article
Comparison of item analysis results of Korean Medical Licensing Examination according to classical test theory and item response theory
Eun Young Lim, Jang Hee Park, Il Kwon, Gue Lim Song, Sun Huh
J Educ Eval Health Prof. 2004;1(1):67-76.   Published online January 31, 2004
DOI: https://doi.org/10.3352/jeehp.2004.1.1.67
  • 31,189 View
  • 222 Download
  • 4 Crossref
Abstract
The results of the 64th and 65th Korean Medical Licensing Examination were analyzed according to classical test theory and item response theory in order to assess the feasibility of applying item response theory to item analysis and to suggest its applicability to computerized adaptive testing. The correlation coefficients of the difficulty index, discrimination index, and ability parameter between the two kinds of analysis were obtained using the computer programs Analyst 4.0, Bilog, and Xcalibre. Correlation coefficients of the difficulty index were 0.75 or higher; those of the discrimination index ranged from -0.023 to 0.753; and those of the ability parameter were 0.90 or higher. These results suggest that item analysis according to item response theory yields results comparable to those of classical test theory, except for the discrimination index. Since the ability parameter is most widely used in criterion-referenced testing, the high correlation between the ability parameter and the total score supports the validity of computerized adaptive testing based on item response theory.
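As a rough illustration of the comparison described in this abstract, the sketch below computes Pearson correlation coefficients between item statistics estimated under classical test theory and item parameters estimated under item response theory. It is a minimal sketch in Python; the file and column names are hypothetical, and the original analysis used Analyst 4.0, Bilog, and Xcalibre rather than this code.

  # Hedged sketch: correlations between CTT indices and IRT parameters for the same items.
  # All file and column names are hypothetical.
  import pandas as pd
  from scipy.stats import pearsonr

  ctt = pd.read_csv("ctt_item_stats.csv")   # columns: difficulty, discrimination
  irt = pd.read_csv("irt_item_params.csv")  # columns: b (difficulty), a (discrimination)

  r_diff, _ = pearsonr(ctt["difficulty"], irt["b"])
  r_disc, _ = pearsonr(ctt["discrimination"], irt["a"])

  print(f"difficulty index:     r = {r_diff:.3f}")  # abstract reports r >= 0.75
  print(f"discrimination index: r = {r_disc:.3f}")  # abstract reports -0.023 to 0.753

The same pairing of estimates can be repeated for the ability parameter against examinees' total scores, which is the correlation the abstract reports as 0.90 or higher.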

Citations

Citations to this article as recorded by  
  • Journal of Educational Evaluation for Health Professions received the top-ranking Journal Impact Factor—9.3—in the category of Education, Scientific Disciplines in the 2023 Journal Citation Ranking by Clarivate
    Sun Huh
    Journal of Educational Evaluation for Health Professions.2024; 21: 16.     CrossRef
  • Analysis on Validity and Academic Competency of Mock Test for Korean Medicine National Licensing Examination Using Item Response Theory
    Han Chae, Eunbyul Cho, SeonKyoung Kim, DaHye Choi, Seul Lee
    Keimyung Medical Journal.2023; 42(1): 7.     CrossRef
  • Item difficulty index, discrimination index, and reliability of the 26 health professions licensing examinations in 2022, Korea: a psychometric study
    Yoon Hee Kim, Bo Hyun Kim, Joonki Kim, Bokyoung Jung, Sangyoung Bae
    Journal of Educational Evaluation for Health Professions.2023; 20: 31.     CrossRef
  • Can computerized tests be introduced to the Korean Medical Licensing Examination?
    Sun Huh
    Journal of the Korean Medical Association.2012; 55(2): 124.     CrossRef
