Purpose: There is limited literature on the assessment of electronic medical record (EMR)-related competencies. To address this gap, this study explored the feasibility of an EMR objective structured clinical examination (OSCE) station for evaluating medical students’ communication skills, using psychometric analyses and standardized patients’ (SPs) perspectives on EMR use in an OSCE.
Methods: An OSCE station that incorporated the use of an EMR was developed and pilot-tested in March 2020. Students’ communication skills were assessed by SPs and physician examiners. Students’ scores were compared between the EMR station and the 9 other stations. A psychometric analysis, including the item-total correlation, was performed. SPs participated in a post-OSCE focus group to discuss their perceptions of the EMR’s effect on communication.
Results: Ninety-nine 3rd-year medical students participated in a 10-station OSCE that included the EMR station. The EMR station had an acceptable item-total correlation (0.217). Students who leveraged graphical displays in counseling received higher OSCE station scores from the SPs (P=0.041). A thematic analysis of the SPs’ focus-group perceptions of students’ EMR use revealed the following themes: technology, communication, case design, ownership of health information, and timing of EMR use.
Conclusion: This study demonstrated the feasibility of incorporating an EMR into the assessment of learners’ communication skills in an OSCE. The EMR station had acceptable psychometric characteristics. Some medical students were able to use the EMR efficiently as an aid in patient counseling. Teaching students how to remain patient-centered even in the presence of technology may promote engagement.
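The item-total correlation reported above is, in essence, the Pearson correlation between scores on one station and scores on the rest of the examination; the abstract does not state whether the corrected (station excluded from the total) or uncorrected form was used. A minimal sketch of the corrected version, assuming a hypothetical pandas DataFrame with one row per student and one column per station:

```python
import pandas as pd

def corrected_item_total_correlation(scores: pd.DataFrame, station: str) -> float:
    """Pearson correlation between one station's scores and the total
    of the remaining stations (corrected item-total correlation)."""
    rest_total = scores.drop(columns=[station]).sum(axis=1)
    return scores[station].corr(rest_total)

# Hypothetical layout: one row per student, one column per OSCE station score.
scores = pd.read_csv("osce_station_scores.csv")  # assumed file and column names
print(f"EMR station item-total correlation: {corrected_item_total_correlation(scores, 'EMR'):.3f}")
```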
Citations to this article as recorded by CrossRef
Usage and perception of electronic medical records (EMR) among medical students in southwestern Nigeria A. A. Adeyeye, A. O. Ajose, O. M. Oduola, B. A. Akodu, A. Olufadeji Discover Public Health.2024;[Epub] CrossRef
Feedback has been shown to be an important driver of learning. However, many factors, such as the emotional reactions feedback evokes, may influence its effect. This study aimed to explore medical students’ perspectives on the verbal feedback they received during an objective structured clinical examination (OSCE), their emotional reactions to it, and its impact on their subsequent performance. To do this, medical students enrolled at 4 Canadian medical schools were invited to complete a web-based survey regarding their experiences. One hundred and fifty-eight participants completed the survey. Twenty-nine percent of respondents reported that they had experienced emotional reactions to verbal feedback received in an OSCE setting. The most commonly reported emotional responses were embarrassment and anxiety. Some students (n=20) reported that the feedback they received negatively affected their subsequent OSCE performance. This study demonstrates that feedback provided during an OSCE can evoke an emotional response in students and potentially affect subsequent performance.
Citations to this article as recorded by CrossRef
Memory, credibility and insight: How video-based feedback promotes deeper reflection and learning in objective structured clinical exams Alexandra Makrides, Peter Yeates Medical Teacher.2022; 44(6): 664. CrossRef
Objective structured clinical examination in fundamentals of nursing and obstetric care as method of verification and assessing the degree of achievement of learning outcomes Lucyna Sochocka, Teresa Niechwiadowicz-Czapka, Mariola Wojtal, Monika Przestrzelska, Iwona Kiersnowska, Katarzyna Szwamel Pielegniarstwo XXI wieku / Nursing in the 21st Century.2021; 20(3): 190. CrossRef
Student-led peer-assisted mock objective structured clinical examinations (MOSCEs) have been used in various settings to help students prepare for subsequent higher-stakes, faculty-run OSCEs. MOSCE participants have generally valued feedback from peers and reported benefits to learning. Our study investigated whether participation in a peer-assisted MOSCE affected subsequent OSCE performance. To determine whether mean OSCE scores differed depending on whether medical students participated in the MOSCE, we conducted a between-subjects analysis of variance, with cohort (2016 vs. 2017) and MOSCE participation (MOSCE vs. no MOSCE) as independent variables and the mean OSCE score as the dependent variable. Participation in the MOSCE had no significant effect on mean OSCE scores (P=0.19). There was a significant correlation between mean MOSCE scores and mean OSCE scores (Pearson r=0.52, P<0.001). Although previous studies described self-reported benefits from participation in student-led MOSCEs, participation was not associated with objective benefits in this study.
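The analysis described above pairs a two-way between-subjects ANOVA with a Pearson correlation. A minimal sketch using statsmodels and SciPy, assuming a hypothetical per-student table with columns for cohort, MOSCE participation, mean MOSCE score (participants only), and mean OSCE score; whether the original model included an interaction term is not stated, and one is included here only for illustration:

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols
from scipy import stats

# Assumed columns: cohort (2016/2017), mosce (yes/no), mosce_score, osce_score.
df = pd.read_csv("mosce_osce_scores.csv")  # hypothetical file

# Two-way between-subjects ANOVA on mean OSCE scores.
model = ols("osce_score ~ C(cohort) * C(mosce)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))

# Pearson correlation between mean MOSCE and mean OSCE scores (MOSCE participants only).
participants = df.dropna(subset=["mosce_score"])
r, p = stats.pearsonr(participants["mosce_score"], participants["osce_score"])
print(f"Pearson r = {r:.2f}, P = {p:.3g}")
```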
Citations to this article as recorded by CrossRef
Impact of familiarity with the format of the exam on performance in the OSCE of undergraduate medical students – an interventional study Hannes Neuwirt, Iris E. Eder, Philipp Gauckler, Lena Horvath, Stefan Koeck, Maria Noflatscher, Benedikt Schaefer, Anja Simeon, Verena Petzer, Wolfgang M. Prodinger, Christoph Berendonk BMC Medical Education.2024;[Epub] CrossRef
Perceived and actual value of Student‐led Objective Structured Clinical Examinations Brandon Stretton, Adam Montagu, Aline Kunnel, Jenni Louise, Nathan Behrendt, Joshua Kovoor, Stephen Bacchi, Josephine Thomas, Ellen Davies The Clinical Teacher.2024;[Epub] CrossRef
Improving preparation for pharmacy entry-to-practice OSCE using a participatory action research Catherine Huneault, Philippe Haeberli, Alexandra Mühle, Philippe Laurent, Jérôme Berger Currents in Pharmacy Teaching and Learning.2024; 16(11): 102152. CrossRef
Benefits of semiology taught using near-peer tutoring are sustainable Benjamin Gripay, Thomas André, Marie De Laval, Brice Peneau, Alexandre Secourgeon, Nicolas Lerolle, Cédric Annweiler, Grégoire Justeau, Laurent Connan, Ludovic Martin, Loïc Bière BMC Medical Education.2022;[Epub] CrossRef
Identification of factors associated with success in objective structured clinical examinations at the Rouen faculty of medicine M. Leclercq, M. Vannier, Y. Benhamou, A. Liard, V. Gilard, I. Auquit-Auckbur, H. Levesque, L. Sibert, P. Schneider La Revue de Médecine Interne.2022; 43(5): 278. CrossRef
Evaluation of the Experience of Peer-led Mock Objective Structured Practical Examination for First- and Second-year Medical Students Faisal Alsaif, Lamia Alkuwaiz, Mohammed Alhumud, Reem Idris, Lina Neel, Mansour Aljabry, Mona Soliman Advances in Medical Education and Practice.2022; Volume 13: 987. CrossRef
The use of a formative OSCE to prepare emergency medicine residents for summative OSCEs: a mixed-methods cohort study Magdalene Hui Min Lee, Dong Haur Phua, Kenneth Wei Jian Heng International Journal of Emergency Medicine.2021;[Epub] CrossRef
Tutor–Student Partnership in Practice OSCE to Enhance Medical Education Eve Cosker, Valentin Favier, Patrice Gallet, Francis Raphael, Emmanuelle Moussier, Louise Tyvaert, Marc Braun, Eva Feigerlova Medical Science Educator.2021; 31(6): 1803. CrossRef
Peers as OSCE assessors for junior medical students – a review of routine use: a mixed methods study Simon Schwill, Johanna Fahrbach-Veeser, Andreas Moeltner, Christiane Eicher, Sonia Kurczyk, David Pfisterer, Joachim Szecsenyi, Svetla Loukanova BMC Medical Education.2020;[Epub] CrossRef
Improving the reliability and consistency of objective structured clinical examination (OSCE) raters’ marking poses a continual challenge in medical education. The purpose of this study was to evaluate an e-learning training module for OSCE raters who participated in the assessment of third-year medical students at the University of Ottawa, Canada. The effects of online training were compared with those of traditional in-person (face-to-face) orientation. Of the 90 physicians recruited as raters for this OSCE, 60 (66.7%) consented to participate in the study in March 2017. Of the 60 participants, 55 rated students during the OSCE, while the remaining 5 were back-up raters. There were 41 raters in the online training group and 19 in the traditional in-person training group. Of those with prior OSCE experience (n=18) who participated in the online group, 13 (68%) reported that they preferred this format to the in-person orientation. The average total time needed to complete the online module was 15 minutes. Furthermore, 89% of participants felt the module provided clarity about the rater training process. There was no significant difference in the number of missing ratings based on the type of orientation that raters received. Our study indicates that online OSCE rater training is comparable to traditional face-to-face orientation.
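The abstract does not specify which statistical test was used to compare missing ratings between the two orientation groups. One plausible approach, sketched here purely for illustration with hypothetical column names, is a Mann-Whitney U test on per-rater counts of missing ratings:

```python
import pandas as pd
from scipy import stats

# Assumed layout: one row per rater, with the orientation type received
# ("online" or "in_person") and that rater's count of missing ratings.
raters = pd.read_csv("rater_missing_ratings.csv")  # hypothetical file

online = raters.loc[raters["orientation"] == "online", "missing_ratings"]
in_person = raters.loc[raters["orientation"] == "in_person", "missing_ratings"]

# Non-parametric comparison of missing-rating counts between the two groups.
u_stat, p_value = stats.mannwhitneyu(online, in_person, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, P = {p_value:.3f}")
```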
Citations to this article as recorded by CrossRef
Raters and examinees training for objective structured clinical examination: comparing the effectiveness of three instructional methodologies Jefferson Garcia Guerrero, Ayidah Sanad Alqarni, Lorraine Turiano Estadilla, Lizy Sonia Benjamin, Vanitha Innocent Rani BMC Nursing.2024;[Epub] CrossRef
Assessment methods and the validity and reliability of measurement tools in online objective structured clinical examinations: a systematic scoping review Jonathan Zachary Felthun, Silas Taylor, Boaz Shulruf, Digby Wigram Allen Journal of Educational Evaluation for Health Professions.2021; 18: 11. CrossRef
Empirical analysis comparing the tele-objective structured clinical examination and the in-person assessment in Australia Jonathan Zachary Felthun, Silas Taylor, Boaz Shulruf, Digby Wigram Allen Journal of Educational Evaluation for Health Professions.2021; 18: 23. CrossRef
No observed effect of a student-led mock objective structured clinical examination on subsequent performance scores in medical students in Canada Lorenzo Madrazo, Claire Bo Lee, Meghan McConnell, Karima Khamisa, Debra Pugh Journal of Educational Evaluation for Health Professions.2019; 16: 14. CrossRef
Objective Structured Clinical Examination as a Measure of the Practical Training of the Future Physician M. M. Korda, A. H. Shulhai, N. V. Pasyaka, N. V. Petrenko, N. V. Haliyash, N. A. Bilkevich Медична освіта.2019; (3): 19. CrossRef