Purpose With the coronavirus disease 2019 pandemic, online high-stakes examinations have become a viable alternative to in-person testing. This study evaluated the feasibility of computer-based testing (CBT) for medical residency applications in Brazil and its impact on item quality and applicants’ access compared with paper-based testing.
Methods In 2020, an online CBT was administered at the Ribeirao Preto Clinical Hospital in Brazil. In total, 120 multiple-choice question items were constructed. Two years later, the exam was administered as a paper-based test. The item construction processes were similar for both exams. Difficulty and discrimination indexes, point-biserial coefficients, and Cronbach’s α coefficient were measured under classical test theory, and difficulty, discrimination, and guessing parameters were estimated under item response theory. Internet stability for the applicants was monitored.
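The classical test theory indexes named above follow standard formulas; the sketch below shows how they might be computed from a scored response matrix. It is a minimal illustration, not the study’s analysis code: the 0/1 data layout, the 27% upper/lower grouping for the discrimination index, and all names are assumptions, and the item response theory parameters (difficulty, discrimination, guessing) would be fit separately with a model such as the 3-parameter logistic, which is not shown here.

```python
# Illustrative sketch (assumed layout): classical test theory item statistics
# for a binary response matrix with rows = applicants, columns = items.
import numpy as np

def item_statistics(responses: np.ndarray):
    """responses: 2D array of 0/1 scores, shape (n_applicants, n_items)."""
    n_applicants, n_items = responses.shape
    total = responses.sum(axis=1)

    # Difficulty index: proportion of applicants answering each item correctly.
    difficulty = responses.mean(axis=0)

    # Discrimination index: difference in item difficulty between the upper
    # and lower 27% of applicants ranked by total score (assumed convention).
    order = np.argsort(total)
    k = int(round(0.27 * n_applicants))
    lower, upper = order[:k], order[-k:]
    discrimination = responses[upper].mean(axis=0) - responses[lower].mean(axis=0)

    # Point-biserial coefficient: correlation between each item score and the
    # total score with that item removed (corrected item-total correlation).
    point_biserial = np.array([
        np.corrcoef(responses[:, j], total - responses[:, j])[0, 1]
        for j in range(n_items)
    ])

    # Cronbach's alpha for internal consistency.
    item_var = responses.var(axis=0, ddof=1).sum()
    total_var = total.var(ddof=1)
    alpha = (n_items / (n_items - 1)) * (1 - item_var / total_var)

    return difficulty, discrimination, point_biserial, alpha
```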
Results In 2020, 4,846 individuals (57.1% female, mean age of 26.64±3.37 years) applied to the residency program, versus 2,196 individuals (55.2% female, mean age of 26.47±3.20 years) in 2022. For CBT, there was an increase of 2,650 applicants (120.7%), albeit with significant differences in demographic characteristics. There was a significant increase in applicants from more distant and lower-income Brazilian regions, such as the North (5.6% vs. 2.7%) and Northeast (16.9% vs. 9.0%). No significant differences were found in difficulty and discrimination indexes, point-biserial coefficients, and Cronbach’s α coefficients between the 2 exams.
Conclusion Online CBT with multiple-choice questions was a viable format for a residency application exam, improving accessibility without compromising exam integrity and quality.
Purpose There is limited literature on the assessment of electronic medical record (EMR)-related competencies. To address this gap, this study explored the feasibility of an EMR objective structured clinical examination (OSCE) station for evaluating medical students’ communication skills, using psychometric analyses and standardized patients’ (SPs) perspectives on EMR use in an OSCE.
Methods An OSCE station that incorporated the use of an EMR was developed and pilot-tested in March 2020. Students’ communication skills were assessed by SPs and physician examiners. Students’ scores were compared between the EMR station and the 9 other stations. A psychometric analysis, including the item-total correlation, was performed. SPs participated in a post-OSCE focus group to discuss their perceptions of the EMR’s effect on communication.
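As a rough illustration of the station-level psychometrics mentioned here, the item-total correlation of the EMR station can be computed as the correlation between that station’s scores and the total OSCE score; the sketch below assumes a simple students-by-stations score matrix and is not the study’s analysis code.

```python
# Illustrative sketch (assumed data layout): treating each OSCE station score
# as an "item", the item-total correlation of one station is its correlation
# with the total score. The corrected form excludes the station itself from
# the total; whether the study used the corrected or uncorrected form is not
# stated in the abstract.
import numpy as np

def station_total_correlation(scores: np.ndarray, station: int, corrected: bool = True) -> float:
    """scores: array of shape (n_students, n_stations) of station scores."""
    target = scores[:, station]
    total = scores.sum(axis=1)
    if corrected:
        total = total - target  # remove the station's own contribution
    return float(np.corrcoef(target, total)[0, 1])
```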
Results Ninety-nine 3rd-year medical students participated in a 10-station OSCE that included the EMR station. The EMR station had an acceptable item-total correlation (0.217). Students who leveraged graphical displays in counseling received higher OSCE station scores from the SPs (P=0.041). A thematic analysis of the SPs’ perceptions of students’ EMR use in the focus group revealed the following themes: technology, communication, case design, ownership of health information, and timing of EMR usage.
Conclusion This study demonstrated the feasibility of incorporating an EMR into the assessment of learners’ communication skills in an OSCE. The EMR station had acceptable psychometric characteristics. Some medical students were able to use the EMR efficiently as an aid in patient counseling. Teaching students how to remain patient-centered even in the presence of technology may promote engagement.
Strong partnerships between academic health professions programs and clinical practice settings, termed academic-clinical partnerships, are essential in providing quality clinical training experiences. However, the literature does not operationalize a model by which an academic program may identify priority attributes and evaluate its partnerships. This study aimed to develop a values-based academic-clinical partnership evaluation approach, rooted in methodologies from the field of evaluation and implemented in the context of an academic Doctor of Physical Therapy clinical education program. The authors developed a semi-quantitative evaluation approach incorporating concepts from multi-attribute utility analysis (MAUA) that enabled consistent, values-based partnership evaluation. Data-informed actions led to improved overall partnership effectiveness. Pilot outcomes support the feasibility and desirability of moving toward MAUA as a potential methodological framework. Further research may lead to the development of a standardized process by which any academic health profession program can perform a values-based evaluation of its academic-clinical partnerships to guide decision-making.
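As an illustration of the MAUA concept referenced above, a weighted additive utility model scores each partnership as the weighted sum of its single-attribute utilities. The attributes, weights, and ratings in the sketch below are hypothetical and are not drawn from the study.

```python
# Illustrative sketch of a multi-attribute utility analysis (MAUA) score,
# assuming each partnership is rated 0-1 on a set of priority attributes and
# each attribute carries a program-defined weight, with weights summing to 1.
from typing import Dict

def maua_score(ratings: Dict[str, float], weights: Dict[str, float]) -> float:
    """Weighted additive utility: U = sum_i w_i * u_i."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights should sum to 1"
    return sum(weights[attr] * ratings[attr] for attr in weights)

# Example: evaluating one clinical partner on three hypothetical attributes.
weights = {"student_learning": 0.5, "site_capacity": 0.3, "communication": 0.2}
ratings = {"student_learning": 0.8, "site_capacity": 0.6, "communication": 0.9}
print(maua_score(ratings, weights))  # 0.76
```

Comparing such scores across partners, or for one partner over time, is one way a program could translate its stated values into consistent, data-informed partnership decisions.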