
JEEHP : Journal of Educational Evaluation for Health Professions

2 "Isabelle Desjardins"
Research article
Experience of introducing an electronic health records station in an objective structured clinical examination to evaluate medical students’ communication skills in Canada: a descriptive study  
Kuan-chin Jean Chen, Ilona Bartman, Debra Pugh, David Topps, Isabelle Desjardins, Melissa Forgie, Douglas Archibald
J Educ Eval Health Prof. 2023;20:22.   Published online July 4, 2023
DOI: https://doi.org/10.3352/jeehp.2023.20.22
  • 3,722 Views
  • 150 Downloads
  • 1 Web of Science citation
Abstract
Purpose
There is limited literature on the assessment of electronic medical record (EMR)-related competencies. To address this gap, this study explored the feasibility of an EMR objective structured clinical examination (OSCE) station for evaluating medical students’ communication skills, drawing on psychometric analyses and standardized patients’ (SPs) perspectives on EMR use in an OSCE.
Methods
An OSCE station that incorporated the use of an EMR was developed and pilot-tested in March 2020. Students’ communication skills were assessed by SPs and physician examiners, and students’ scores were compared between the EMR station and the 9 other stations. A psychometric analysis, including the item-total correlation, was performed. SPs participated in a post-OSCE focus group to discuss their perceptions of the EMR’s effect on communication.
Results
Ninety-nine 3rd-year medical students participated in a 10-station OSCE that included the EMR station. The EMR station had an acceptable item-total correlation (0.217). Students who leveraged graphical displays in counseling received higher station scores from the SPs (P=0.041). Thematic analysis of the SPs’ focus-group perceptions of students’ EMR use revealed the following theme domains: technology, communication, case design, ownership of health information, and timing of EMR use.
Conclusion
This study demonstrated the feasibility of incorporating an EMR into the assessment of learners’ communication skills in an OSCE. The EMR station had acceptable psychometric characteristics, and some medical students were able to use the EMR efficiently as an aid in patient counseling. Teaching students to remain patient-centered even in the presence of technology may promote engagement.
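For readers unfamiliar with the item-total correlation reported above, the short sketch below shows one common way to compute a corrected item-total correlation for a single OSCE station: the station’s scores are correlated with each student’s total across the remaining stations. The data, station index, and function name here are hypothetical and purely illustrative; they are not the study’s data or method.

```python
import numpy as np

# Hypothetical scores for 99 students across 10 OSCE stations; the study's
# actual data are not public, so random numbers stand in for illustration.
rng = np.random.default_rng(0)
scores = rng.normal(loc=70, scale=10, size=(99, 10))

def corrected_item_total_correlation(scores: np.ndarray, station: int) -> float:
    """Correlate one station's scores with the total of all other stations."""
    item = scores[:, station]
    rest_total = scores.sum(axis=1) - item  # total score excluding this station
    return float(np.corrcoef(item, rest_total)[0, 1])

# e.g., if the EMR station were the last of the 10 stations (index 9)
print(corrected_item_total_correlation(scores, station=9))
```

With actual station-level scores, a value such as 0.217 for the EMR station would suggest that it ranks students in roughly the same order as the rest of the examination.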
Brief report
The implementation and evaluation of an e-Learning training module for objective structured clinical examination raters in Canada  
Karima Khamisa, Samantha Halman, Isabelle Desjardins, Mireille St. Jean, Debra Pugh
J Educ Eval Health Prof. 2018;15:18.   Published online August 6, 2018
DOI: https://doi.org/10.3352/jeehp.2018.15.18
  • 23,364 Views
  • 260 Downloads
  • 4 Web of Science citations
  • 5 Crossref citations
Abstract
Improving the reliability and consistency of objective structured clinical examination (OSCE) raters’ marking poses a continual challenge in medical education. The purpose of this study was to evaluate an e-Learning training module for OSCE raters who participated in the assessment of third-year medical students at the University of Ottawa, Canada. The effects of online training and those of traditional in-person (face-to-face) orientation were compared. Of the 90 physicians recruited as raters for this OSCE, 60 consented to participate (67.7%) in the study in March 2017. Of the 60 participants, 55 rated students during the OSCE, while the remaining 5 were back-up raters. The number of raters in the online training group was 41, while that in the traditional in-person training group was 19. Of those with prior OSCE experience (n=18) who participated in the online group, 13 (68%) reported that they preferred this format to the in-person orientation. The total average time needed to complete the online module was 15 minutes. Furthermore, 89% of the participants felt the module provided clarity in the rater training process. There was no significant difference in the number of missing ratings based on the type of orientation that raters received. Our study indicates that online OSCE rater training is comparable to traditional face-to-face orientation.

Citations to this article as recorded by Crossref:
  • Raters and examinees training for objective structured clinical examination: comparing the effectiveness of three instructional methodologies
    Jefferson Garcia Guerrero, Ayidah Sanad Alqarni, Lorraine Turiano Estadilla, Lizy Sonia Benjamin, Vanitha Innocent Rani
    BMC Nursing.2024;[Epub]     CrossRef
  • Assessment methods and the validity and reliability of measurement tools in online objective structured clinical examinations: a systematic scoping review
    Jonathan Zachary Felthun, Silas Taylor, Boaz Shulruf, Digby Wigram Allen
    Journal of Educational Evaluation for Health Professions.2021; 18: 11.     CrossRef
  • Empirical analysis comparing the tele-objective structured clinical examination and the in-person assessment in Australia
    Jonathan Zachary Felthun, Silas Taylor, Boaz Shulruf, Digby Wigram Allen
    Journal of Educational Evaluation for Health Professions.2021; 18: 23.     CrossRef
  • No observed effect of a student-led mock objective structured clinical examination on subsequent performance scores in medical students in Canada
    Lorenzo Madrazo, Claire Bo Lee, Meghan McConnell, Karima Khamisa, Debra Pugh
    Journal of Educational Evaluation for Health Professions.2019; 16: 14.     CrossRef
  • Objective structured clinical examination as a measure of the practical training of the future doctor (in Ukrainian)
    M. M. Korda, A. H. Shulhai, N. V. Pasyaka, N. V. Petrenko, N. V. Haliyash, N. A. Bilkevich
    Медична освіта.2019; (3): 19.     CrossRef
