JEEHP : Journal of Educational Evaluation for Health Professions

2 "Classical test theory"
Original Articles
Applicability of Item Response Theory to the Korean Nurses' Licensing Examination
Geum-Hee Jeong, Mi Kyoung Yim
J Educ Eval Health Prof. 2005;2(1):23-29.   Published online June 30, 2005
DOI: https://doi.org/10.3352/jeehp.2005.2.1.23
  • 35,066 views
  • 162 downloads
  • 3 Crossref citations
Abstract
To test the applicability of item response theory (IRT) to the Korean Nurses' Licensing Examination (KNLE), item analysis was performed after testing unidimensionality and goodness-of-fit, and the results were compared with those based on classical test theory. The results of the 330-item KNLE administered to 12,024 examinees in January 2004 were analyzed. Unidimensionality was tested using DETECT, and goodness-of-fit was tested using WINSTEPS for the Rasch model and Bilog-MG for the two-parameter logistic model. Item analysis and ability estimation were done using WINSTEPS. Using DETECT, Dmax ranged from 0.1 to 0.23 for each subject. The mean square infit and outfit values of all items from WINSTEPS ranged from 0.1 to 1.5, except for one item in pediatric nursing, which scored 1.53. Of the 330 items, 218 (42.7%) misfit the two-parameter logistic model in Bilog-MG. The correlation coefficients between the difficulty parameter from the Rasch model and the difficulty index from classical test theory ranged from 0.9039 to 0.9699, and the correlation between the ability parameter from the Rasch model and the total score from classical test theory ranged from 0.9776 to 0.9984. Therefore, the KNLE results satisfied unidimensionality and showed acceptable goodness-of-fit under the Rasch model. The KNLE is thus a suitable dataset for Rasch-model analysis, and further research using IRT is feasible.
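The comparison the abstract reports can be illustrated with a short simulation. The following is a minimal sketch, not the authors' WINSTEPS/Bilog-MG workflow: it simulates dichotomous responses under the Rasch model, recovers item difficulties with the PROX normal approximation, computes infit/outfit mean squares of the kind WINSTEPS reports, and correlates the Rasch difficulties with the classical difficulty index. All sample sizes, seeds, and variable names are illustrative assumptions.

```python
# A minimal sketch with simulated data (assumed names, not the authors'
# actual workflow): Rasch responses, PROX difficulty estimates, infit/outfit
# mean squares, and the CTT-vs-Rasch difficulty correlation.
import numpy as np

rng = np.random.default_rng(7)
n_persons, n_items = 2000, 40
theta = rng.normal(0.0, 1.0, n_persons)        # person abilities (logits)
b_true = rng.normal(0.0, 1.0, n_items)         # true item difficulties

# Rasch model: P(X=1 | theta, b) = 1 / (1 + exp(-(theta - b)))
p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b_true[None, :])))
x = (rng.random((n_persons, n_items)) < p).astype(float)

# Classical difficulty index: proportion answering correctly (a facility).
facility = x.mean(axis=0)

# PROX approximation to Rasch difficulty (Wright & Stone): the item logit
# expanded by the spread of person logits, then centered as Rasch software does.
score = x.mean(axis=1).clip(0.5 / n_items, 1 - 0.5 / n_items)
person_logit = np.log(score / (1 - score))     # rough person measures
item_logit = np.log((1 - facility) / facility)
b_hat = item_logit * np.sqrt(1 + person_logit.var() / 2.89)
b_hat -= b_hat.mean()

# Infit/outfit mean squares from squared standardized residuals,
# the fit statistics WINSTEPS reports (1.0 = perfect model fit).
p_hat = 1.0 / (1.0 + np.exp(-(person_logit[:, None] - b_hat[None, :])))
w = p_hat * (1 - p_hat)                        # model variance of each response
z2 = (x - p_hat) ** 2 / w                      # squared standardized residual
outfit = z2.mean(axis=0)                       # unweighted mean square
infit = ((x - p_hat) ** 2).sum(axis=0) / w.sum(axis=0)  # information-weighted

# The correlation is strongly negative: the CTT index is a facility (higher
# means easier) while Rasch difficulty runs the other way; the abstract
# reports its magnitude.
print("corr(CTT difficulty index, Rasch difficulty): %.4f"
      % np.corrcoef(facility, b_hat)[0, 1])
print("infit %.2f-%.2f, outfit %.2f-%.2f"
      % (infit.min(), infit.max(), outfit.min(), outfit.max()))
```

With a clean unidimensional simulation like this, nearly all infit/outfit values fall close to 1.0 and the difficulty correlation approaches the 0.90-0.97 magnitudes the study reports.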

Citations to this article, as recorded by Crossref:
  • Item difficulty index, discrimination index, and reliability of the 26 health professions licensing examinations in 2022, Korea: a psychometric study
    Yoon Hee Kim, Bo Hyun Kim, Joonki Kim, Bokyoung Jung, Sangyoung Bae
    Journal of Educational Evaluation for Health Professions. 2023;20:31.
  • Study on the Academic Competency Assessment of Herbology Test using Rasch Model
    Han Chae, Soo Jin Lee, Chang-ho Han, Young Il Cho, Hyungwoo Kim
    Journal of Korean Medicine. 2022;43(2):27.
  • Can computerized tests be introduced to the Korean Medical Licensing Examination?
    Sun Huh
    Journal of the Korean Medical Association. 2012;55(2):124.
Comparison of item analysis results of Korean Medical Licensing Examination according to classical test theory and item response theory
Eun Young Lim, Jang Hee Park, Il Kwon, Gue Lim Song, Sun Huh
J Educ Eval Health Prof. 2004;1(1):67-76.   Published online January 31, 2004
DOI: https://doi.org/10.3352/jeehp.2004.1.1.67
  • 30,603 views
  • 214 downloads
  • 3 Crossref citations
Abstract
The results of the 64th and 65th Korean Medical Licensing Examination were analyzed according to both classical test theory and item response theory to assess whether item response theory can be applied to item analysis and to suggest its applicability to computerized adaptive testing. Correlation coefficients for the difficulty index, the discrimination index, and the ability parameter between the two kinds of analysis were obtained using the programs Analyst 4.0, Bilog, and Xcalibre. Correlation coefficients for the difficulty index were 0.75 or higher; those for the discrimination index ranged from -0.023 to 0.753; and those for the ability parameter were 0.90 or higher. These results suggest that item analysis according to item response theory yields results comparable to those of classical test theory, except for the discrimination index. Since the ability parameter is most widely used in criterion-referenced testing, the high correlation between the ability parameter and the total score supports the validity of computerized adaptive testing based on item response theory.
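Because the comparison rests on standard formulas, a small simulation makes it concrete. The following is a minimal sketch under stated assumptions: responses are simulated from a two-parameter logistic model (the KMLE data themselves are not public), the difficulty and discrimination indices are the standard classical statistics that programs such as Analyst report, and the ability measure is a simple person logit rather than a Bilog or Xcalibre estimate.

```python
# A minimal sketch with simulated 2PL data (the KMLE responses are not
# public): classical difficulty and discrimination indices alongside a
# logit ability measure, mirroring the CTT-vs-IRT comparison above.
import numpy as np

rng = np.random.default_rng(42)
n_persons, n_items = 1000, 50
theta = rng.normal(size=n_persons)             # person abilities
b = rng.normal(size=n_items)                   # item difficulties
a = rng.uniform(0.5, 2.0, n_items)             # 2PL discriminations

# Two-parameter logistic model: P = 1 / (1 + exp(-a * (theta - b)))
p = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - b[None, :])))
x = (rng.random((n_persons, n_items)) < p).astype(float)

total = x.sum(axis=1)                          # CTT total score
difficulty = x.mean(axis=0)                    # difficulty index (proportion correct)

# Discrimination index as the corrected point-biserial: each item scored
# against the total with that item removed.
rest = total[:, None] - x
discrimination = np.array(
    [np.corrcoef(x[:, i], rest[:, i])[0, 1] for i in range(n_items)]
)

# Person logit as a simple ability measure; it is a monotone transform of
# the total score, so the two correlate almost perfectly, consistent with
# the >= 0.90 ability-parameter correlations reported in the abstract.
r = total.clip(0.5, n_items - 0.5)
ability = np.log(r / (n_items - r))

print("difficulty index range: %.2f-%.2f" % (difficulty.min(), difficulty.max()))
print("discrimination range: %.2f-%.2f" % (discrimination.min(), discrimination.max()))
print("corr(total score, ability): %.4f" % np.corrcoef(total, ability)[0, 1])
```

The near-perfect total-score correlation falls out of the monotone relationship between raw score and logit ability; the discrimination index, by contrast, depends on the item-removal correction and the spread of abilities, which is one reason it agrees least well across the two frameworks.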

Citations to this article, as recorded by Crossref:
  • Analysis on Validity and Academic Competency of Mock Test for Korean Medicine National Licensing Examination Using Item Response Theory
    Han Chae, Eunbyul Cho, SeonKyoung Kim, DaHye Choi, Seul Lee
    Keimyung Medical Journal. 2023;42(1):7.
  • Item difficulty index, discrimination index, and reliability of the 26 health professions licensing examinations in 2022, Korea: a psychometric study
    Yoon Hee Kim, Bo Hyun Kim, Joonki Kim, Bokyoung Jung, Sangyoung Bae
    Journal of Educational Evaluation for Health Professions. 2023;20:31.
  • Can computerized tests be introduced to the Korean Medical Licensing Examination?
    Sun Huh
    Journal of the Korean Medical Association. 2012;55(2):124.
