JEEHP : Journal of Educational Evaluation for Health Professions

J Educ Eval Health Prof, Volume 13; 2016
Research article
Psychometric properties of a novel knowledge assessment tool of mechanical ventilation for emergency medicine residents in the northeastern United States
Jeremy B. Richards1*orcid, Tania D. Strout2orcid, Todd A. Seigel3orcid, Susan R. Wilcox1,4orcid

DOI: https://doi.org/10.3352/jeehp.2016.13.10
Published online: February 16, 2016

1Division of Pulmonary and Critical Care Medicine, Medical University of South Carolina, Charleston, SC, USA

2Tufts University School of Medicine and Department of Emergency Medicine, Maine Medical Center; Portland, ME, USA

3Oakland and Richmond Medical Centers; Oakland, CA, USA

4Division of Emergency Medicine, Medical University of South Carolina; Charleston, SC, USA

*Corresponding email: richarje@musc.edu

• Received: January 10, 2016   • Accepted: February 14, 2016

© 2016, Korea Health Personnel Licensing Examination Institute

This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  • Purpose:
    Prior descriptions of the psychometric properties of validated knowledge assessment tools designed to determine emergency medicine (EM) residents' understanding of physiologic and clinical concepts related to mechanical ventilation are lacking. We therefore performed this study to describe the psychometric and performance properties of a novel knowledge assessment tool that measures EM residents' knowledge of topics in mechanical ventilation.
  • Methods:
    Results from a multicenter, prospective survey study involving 219 EM residents from 8 academic hospitals in the northeastern United States were analyzed to quantify the reliability, item difficulty, and item discrimination of each of the 9 questions included in the knowledge assessment tool. The instrument was administered over 3 weeks, beginning in January 2013.
  • Results:
    The response rate for residents completing the knowledge assessment tool was 68.6% (214 out of 312 EM residents). Reliability was assessed by both Cronbach’s alpha coefficient (0.6293) and the Spearman-Brown coefficient (0.6437). Item difficulty ranged from 0.39 to 0.96, with a mean item difficulty of 0.75 for all 9 questions. Uncorrected item discrimination values ranged from 0.111 to 0.556. Corrected item-total correlations were determined by removing the question being assessed from analysis, resulting in a range of item discrimination from 0.139 to 0.498.
  • Conclusion:
    Reliability, item difficulty, and item discrimination were all within satisfactory ranges, demonstrating acceptable psychometric properties of this knowledge assessment tool. These findings indicate that the tool is sufficiently rigorous for use in future research studies or for evaluative assessment of EM residents.
Management of mechanical ventilation is an important aspect of caring for intubated, critically ill patients, and adhering to evidence-based practices can improve patient outcomes [1-3]. Emergency medicine (EM) providers are responsible for caring for intubated, mechanically ventilated patients in the emergency department (ED), and during this time EM providers may have to adjust mechanical ventilation settings in response to physiologic or clinical conditions. EM residents provide a substantial portion of the clinical care of patients in the ED, including caring for critically ill, mechanically ventilated patients [4,5]. Therefore, accurately and reliably determining EM residents' knowledge and understanding of the principles of mechanical ventilation is important for both education and clinical care. Determining EM residents' overall knowledge of the management of mechanical ventilation can guide curricular design and identify the need for teaching on these principles. Furthermore, focused and topical education regarding clinically relevant issues in mechanical ventilation has the potential to improve residents' clinical care and patient outcomes [6-8].
To assess EM residents' attitudes towards and understanding of the clinical management of mechanical ventilation, we designed a survey to determine EM residents' experience with mechanically ventilated patients, their perception of the frequency of teaching they receive about mechanical ventilation, and their comfort with caring for mechanically ventilated patients [9]. We also developed a complementary knowledge assessment tool to identify specific areas of strength or weakness in EM residents' understanding of concepts related to the management of mechanical ventilation. In this study, we report the psychometric characteristics of the knowledge assessment tool, including reliability, item difficulty, and item discrimination, as understanding the performance characteristics of this tool will strengthen its utility in generalized clinical educational settings and future research studies. Of note, the validity of this tool has previously been described with regard to construct, internal structure, and relationship to other variables [9].
Materials and subjects
As previously described, we developed, pre-tested, piloted, and used a survey tool to quantify EM residents' training experiences with, education about, and perceived comfort with caring for mechanically ventilated patients [9]. Concomitantly, we generated an assessment instrument with questions specific to mechanical ventilation in EM that included topics of respiratory physiology, modes of mechanical ventilation, and complications of mechanical ventilation [9,10]. The knowledge assessment tool was also pre-tested, piloted, and used to assess EM residents' knowledge of principles in the management of mechanically ventilated patients (Table 1) [9]. The study protocol was approved by the institutional review boards of all participating institutions (IRB numbers of participating institutions: 2012-P-000769/1 [Massachusetts General Hospital, Bay State Medical Center, Brown University, and University of Massachusetts Medical Center], H-31757 [Boston University Medical Center], 4026X [Maine Medical Center], 1205010209 [Yale]). The survey and knowledge assessment tools were sent by email to all EM residents at 8 academic hospitals in the northeastern United States. The email invitations to participate in the study were sent once weekly for 3 weeks, beginning in January 2013. Consent for participation in the study and publication of results was obtained from each subject at the time of enrollment, as the survey introduction stated that completion of the survey indicated consent.
Statistical analysis
Study data were exported into a Microsoft Excel (Microsoft Corp., Redmond, WA) spreadsheet program and were then transferred into SPSS (v. 11.0, SPSS, Inc., Chicago, IL) for analysis. For all variables, missing data were excluded on a case-by-case basis. We performed Classical Test Theory-based psychometric analysis of the knowledge assessment tool, including item and reliability analyses. Item analysis included computation of item difficulties, item discrimination values, and corrected item-total correlations. For dichotomous data (correct or incorrect responses), item difficulty is the ratio of correct responses to all responses. Item discrimination describes how well items discriminate between test-takers. Because the item of interest is included in the computation of the uncorrected correlation between the item and the total test score (r), uncorrected item discrimination values are artificially inflated. To address this, the correlation was re-calculated without the item of interest, resulting in corrected item-total correlations. Reliability analysis was performed using both Cronbach's alpha and an estimate of split-half reliability as measured by the Spearman-Brown coefficient for unequal lengths [11].
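The item and reliability analyses described above can be sketched in a few lines of Python. This is a minimal Classical Test Theory illustration, not the study's analysis: the 0/1 response matrix below is synthetic, and for simplicity the classical equal-length Spearman-Brown formula is used, whereas the study applied the unequal-lengths variant in SPSS.

```python
from math import sqrt

# Synthetic 0/1 response matrix (rows = respondents, columns = items).
# These values are illustrative only and are not the study's data.
responses = [
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [0, 1, 1, 1, 1],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 1, 0],
    [1, 1, 0, 0, 1],
]
n_items = len(responses[0])
totals = [sum(row) for row in responses]

def mean(xs):
    return sum(xs) / len(xs)

def sd(xs):  # population standard deviation
    m = mean(xs)
    return sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / len(xs)
    return cov / (sd(xs) * sd(ys))

for j in range(n_items):
    item = [row[j] for row in responses]
    difficulty = mean(item)                       # proportion answering correctly
    r_uncorrected = pearson(item, totals)         # item vs. total score (inflated)
    rest = [t - x for t, x in zip(totals, item)]  # total score without this item
    r_corrected = pearson(item, rest)             # corrected item-total correlation
    print(f"item {j}: p={difficulty:.2f} r={r_uncorrected:.3f} r_c={r_corrected:.3f}")

# Cronbach's alpha: internal-consistency reliability across all items.
item_vars = [sd([row[j] for row in responses]) ** 2 for j in range(n_items)]
alpha = n_items / (n_items - 1) * (1 - sum(item_vars) / sd(totals) ** 2)

# Split-half reliability stepped up with the classical (equal-length)
# Spearman-Brown formula; the study used the unequal-lengths variant.
half1 = [sum(row[0::2]) for row in responses]  # odd-numbered items
half2 = [sum(row[1::2]) for row in responses]  # even-numbered items
r_halves = pearson(half1, half2)
spearman_brown = 2 * r_halves / (1 + r_halves)
print(f"alpha={alpha:.3f} split-half={spearman_brown:.3f}")
```

With real data, items whose corrected item-total correlation falls below roughly 0.1 would be candidates for revision, mirroring the threshold applied in this study.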
With regard to response rate, 219 of 312 residents (70.2%) started the survey and knowledge assessment tools, and 214 (68.6%) completed both instruments [9]. The number of missing responses for individual items on the knowledge portion of the survey ranged from 1 of 218 (0.46%; difference between assist-control and pressure support) to 8 of 218 (3.7%; key physiology terminology; ventilating patients with obstructive disease and high resistance; and mechanical ventilation immediate troubleshooting questions). Item difficulties ranged from 0.39 to 0.96, representing a reasonable mix of items spanning a difficulty range from relatively easy to quite difficult to answer correctly. Mean item difficulty for all 9 items was 0.75.
Uncorrected item discrimination values ranged from 0.111 to 0.556. Corrected item-total correlations ranged from 0.139 to 0.498. No item discrimination value was below the minimum acceptable point-biserial correlation value of 0.1, indicating that all items discriminate between individual survey respondents in a desirable manner (Table 2). Internal consistency reliability, as measured by Cronbach's alpha coefficient, was 0.6293 for the knowledge assessment. Split-half reliability, estimated using the Spearman-Brown coefficient for unequal lengths, was 0.6437.
In this study, we describe the psychometric and performance characteristics of a knowledge assessment tool designed to assess EM residents’ understanding of concepts in mechanical ventilation. In a cohort of residents from multiple medical centers with a high response rate, we demonstrate an acceptable range of item difficulty and good item discrimination. Reliability, as demonstrated by Cronbach’s alpha and Spearman-Brown coefficient, was also acceptable [12]. Validity has previously been described with regard to construct, internal structure and relationship to other variables [9].
As management of mechanically ventilated patients involves a variety of clinical and physiologic considerations, our knowledge assessment tool involved 9 discrete conceptual domains (Table 1). Item difficulty for all questions in our knowledge assessment tool ranged from 0.39 to 0.96, with a mean item difficulty of 0.7516, which is within the acceptable range for assessment tools [13]. No question had an item difficulty of less than 0.25 in this cohort of residents, demonstrating that no item was markedly affected by test-taker misinterpretation of the question or answer options.
Item discrimination for all 9 questions in our knowledge assessment tool was positive, with values of greater than 0.20 for the majority of questions [14,15]. This finding indicates that the questions included in the knowledge assessment tool adequately differentiate between residents with lower and higher levels of knowledge. Given that we hypothesized that residents who more frequently manage mechanically ventilated patients would perform better on a baseline knowledge assessment, this survey also served as a means to collect validity evidence for our novel assessment instrument [16].
These findings, coupled with acceptable reliability indices and appropriate construct validity, demonstrate that this knowledge assessment tool’s performance characteristics are likely to be durable in other populations. The item difficulty, item discrimination, reliability and validity results may inform future use of this knowledge assessment tool for evaluation or research purposes.
Future work could include assessing knowledge of mechanical ventilation in other cohorts of EM residents, from other geographic regions or backgrounds, to further characterize EM resident knowledge of mechanical ventilation and further describe the performance and psychometric properties of the knowledge assessment tool. Furthermore, the knowledge assessment tool could be used with other EM providers, such as attendings, physician assistants, nurses, or other health care professionals. Describing the performance and psychometric properties of our knowledge assessment tool in these populations would provide more insight into the tool’s generalizability and usefulness in populations beyond EM residents.
There are limitations to our study. The psychometric analyses presented here reflect a single cohort of EM residents, although the data were obtained in a multicenter study involving 8 academic hospitals and EM residents from all years of post-graduate training. Furthermore, while the response rate was high, with 68.6% of the cohort completing the survey, non-responder bias may affect item difficulty and discrimination analyses. In addition, we did not formally assess acceptability of the knowledge assessment tool, although both pre-testing and the high response rate imply adequate acceptability. Finally, generalizability of our results to other cohorts of EM residents or other EM providers has not been assessed.
In conclusion, psychometric analyses of our knowledge assessment tool for mechanical ventilation in the ED demonstrate acceptable item difficulty, good item discrimination, and satisfactory reliability. These acceptable psychometric parameters indicate that our knowledge assessment tool is sufficiently valid and reliable for use in future research studies or for assessment of EM residents for evaluative purposes.

Conflict of interest

No potential conflict of interest relevant to this article was reported.

Audio recording of the abstract.
jeehp-13-10-abstract-recording.avi
Table 1.
Major domains assessed with the knowledge assessment tool of mechanical ventilation for emergency medicine residents in the northeastern United States (2013)
Knowledge of vent modes - Assist control vs. pressure support
Pneumonia with hypoxemic respiratory failure - management of oxygenation and decreasing FiO2 when indicated
Management of ventilation for a patient being over-ventilated
Understanding key principles of acute respiratory distress syndrome (ARDS)
Understanding key principles of ARDS - Management of elevated plateau pressure
Overventilation in traumatic brain injury - management of traumatic brain injury
Pulmonary physiology in patient with asthma - understanding resistance
Management of vent in an asthmatic patient
Approach to trouble-shooting – managing vent alarms
Table 2.
Item difficulty, uncorrected and corrected item-total correlations for the knowledge survey of mechanical ventilation for emergency medicine residents in the northeastern United States (2013)
Item | N | Item difficulty | Uncorrected item-total correlation a) | Corrected item-total correlation b)
Assist control vs. pressure support | 217 | 0.6682 | 0.5315 | 0.3521
Understanding determinants of oxygenation | 215 | 0.9069 | 0.3396 | 0.1964
Understanding determinants of ventilation | 214 | 0.9626 | 0.2467 | 0.1622
Principles of ventilating patients with acute respiratory distress syndrome (ARDS) | 213 | 0.8403 | 0.5352 | 0.3738
Concepts regarding ventilation in ARDS, plateau pressures | 212 | 0.7264 | 0.6564 | 0.4895
Ventilation in brain injury | 211 | 0.3981 | 0.3825 | 0.1390
Critical physiology terminology | 210 | 0.6190 | 0.5647 | 0.3367
Ventilation in obstructive disease, high resistance | 210 | 0.8428 | 0.6209 | 0.4712
Immediate troubleshooting | 210 | 0.8000 | 0.4852 | 0.2881

a) Uncorrected item-total correlation: Pearson’s correlation coefficient (r) for each item and the total knowledge survey score.

b) Corrected item-total correlation: inflation-corrected point-biserial correlation value (σpbis) for each item and the total knowledge survey score.
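The inflation correction described in footnote b) can be illustrated with a short sketch (the response matrix is synthetic, not the study's data). Removing the item from the total score before correlating gives the same value as algebraically subtracting the item's contribution from the uncorrected item-total correlation, which is why the corrected values in Table 2 are uniformly lower than the uncorrected ones.

```python
from math import sqrt

# Synthetic 0/1 responses (rows = respondents, columns = items);
# illustrative only, not the study's data.
responses = [
    [1, 1, 0, 1],
    [1, 0, 1, 1],
    [0, 1, 1, 0],
    [1, 1, 1, 1],
    [0, 0, 1, 0],
]

def mean(xs):
    return sum(xs) / len(xs)

def sd(xs):  # population standard deviation
    m = mean(xs)
    return sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / len(xs)
    return cov / (sd(xs) * sd(ys))

totals = [sum(row) for row in responses]
item = [row[0] for row in responses]  # examine the first item

# Route 1: correlate the item with the total score minus the item itself.
direct = pearson(item, [t - x for t, x in zip(totals, item)])

# Route 2: remove the item's contribution from the uncorrected
# item-total correlation algebraically.
r = pearson(item, totals)
s_i, s_t = sd(item), sd(totals)
corrected = (r * s_t - s_i) / sqrt(s_i**2 + s_t**2 - 2 * r * s_i * s_t)

print(direct, corrected)  # the two routes agree
```

Both routes follow from Cov(i, T − i) = Cov(i, T) − Var(i), so they are numerically identical; the second form is the one typically reported by statistics packages.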
