J Educ Eval Health Prof, Volume 13; 2016
Richards, Strout, Seigel, and Wilcox: Psychometric properties of a novel knowledge assessment tool of mechanical ventilation for emergency medicine residents in the northeastern United States

Abstract

Purpose:

Prior descriptions of the psychometric properties of validated knowledge assessment tools designed to determine emergency medicine (EM) residents' understanding of physiologic and clinical concepts related to mechanical ventilation are lacking. We therefore performed this study to describe the psychometric and performance properties of a novel knowledge assessment tool that measures EM residents' knowledge of topics in mechanical ventilation.

Methods:

Results from a multicenter, prospective survey study involving 219 EM residents from 8 academic hospitals in the northeastern United States, administered over 3 weeks beginning in January 2013, were analyzed to quantify the reliability, item difficulty, and item discrimination of each of the 9 questions included in the knowledge assessment tool.

Results:

The response rate for residents completing the knowledge assessment tool was 68.6% (214 out of 312 EM residents). Reliability was assessed by both Cronbach’s alpha coefficient (0.6293) and the Spearman-Brown coefficient (0.6437). Item difficulty ranged from 0.39 to 0.96, with a mean item difficulty of 0.75 for all 9 questions. Uncorrected item discrimination values ranged from 0.111 to 0.556. Corrected item-total correlations were determined by removing the question being assessed from analysis, resulting in a range of item discrimination from 0.139 to 0.498.

Conclusion:

Reliability, item difficulty and item discrimination were within satisfactory ranges in this study, demonstrating acceptable psychometric properties of this knowledge assessment tool. This assessment indicates that this knowledge assessment tool is sufficiently rigorous for use in future research studies or for assessment of EM residents for evaluative purposes.

Introduction

Management of mechanical ventilation is an important aspect of caring for intubated, critically ill patients, and adhering to evidence-based practices can improve patient outcomes [1-3]. Emergency medicine (EM) providers are responsible for caring for intubated, mechanically ventilated patients in the emergency department (ED), and during this time they may have to adjust mechanical ventilation settings in response to physiologic or clinical conditions. EM residents provide a substantial portion of the clinical care of patients in the ED, including care of critically ill, mechanically ventilated patients [4,5]. Therefore, accurately and reliably determining EM residents' knowledge and understanding of the principles of mechanical ventilation is important for both education and clinical care. Determining EM residents' overall knowledge of the management of mechanical ventilation can guide curricular design and identify the need for teaching on these principles. Furthermore, focused, topical education on clinically relevant issues in mechanical ventilation has the potential to improve residents' clinical care and patient outcomes [6-8].
To assess EM residents' attitudes towards and understanding of the clinical management of mechanical ventilation, we designed a survey to determine EM residents' experience with mechanically ventilated patients, their perception of the frequency of teaching they receive about mechanical ventilation, and their comfort with caring for mechanically ventilated patients [9]. We also developed a complementary knowledge assessment tool to identify specific areas of strength or weakness in EM residents' understanding of concepts related to the management of mechanical ventilation. In this study, we report the psychometric characteristics of the knowledge assessment tool, including reliability, item difficulty, and item discrimination, as understanding the performance characteristics of this tool will strengthen its utility in generalized clinical educational settings and future research studies. Of note, the validity of this tool has previously been described with regard to construct, internal structure, and relationship to other variables [9].

Methods

Materials and subjects

As previously described, we developed, pre-tested, piloted, and used a survey tool to quantify EM residents' training experiences with, education about, and perceived comfort with caring for mechanically ventilated patients [9]. Concomitantly, we generated an assessment instrument with questions specific to mechanical ventilation in EM that covered respiratory physiology, modes of mechanical ventilation, and complications of mechanical ventilation [9,10]. The knowledge assessment tool was also pre-tested, piloted, and used to assess EM residents' knowledge of principles in the management of mechanically ventilated patients (Table 1) [9]. The study protocol was approved by the institutional review boards of all participating institutions (IRB numbers: 2012-P-000769/1 [Massachusetts General Hospital, Bay State Medical Center, Brown University, and University of Massachusetts Medical Center], H-31757 [Boston University Medical Center], 4026X [Maine Medical Center], 1205010209 [Yale]). The survey and knowledge assessment tools were sent by email to all EM residents at 8 academic hospitals in the northeastern United States. The email invitations to participate in the study were sent once weekly for 3 weeks, beginning in January 2013. Consent for participation in the study and publication of results was obtained from each subject at the time of enrollment; the survey introduction stated that completing the survey indicated consent.

Statistical analysis

Study data were exported into a Microsoft Excel (Microsoft Corp., Redmond, WA, USA) spreadsheet and then transferred into SPSS (ver. 11.0, SPSS Inc., Chicago, IL, USA) for analysis. For all variables, missing data were excluded on a case-by-case basis. We performed Classical Test Theory-based psychometric analysis of the knowledge assessment tool, including item and reliability analyses. Item analysis included computation of item difficulties, item discrimination values, and corrected item-total correlations. For dichotomous data (correct or incorrect response), item difficulty is the ratio of correct responses to all responses. Item discrimination describes how well an item distinguishes between test-takers. Because the item of interest was included in the computation of the uncorrected correlation between that item and the total test score (r), the uncorrected item discrimination values were artificially inflated. To address this, the correlation was re-calculated without the item of interest, yielding corrected item-total correlations. Reliability analysis was performed using both Cronbach's alpha and an estimate of split-half reliability as measured by the Spearman-Brown coefficient for unequal lengths [11].
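The item statistics described above can be sketched in a few lines of code. The following Python snippet is illustrative only (the study itself used SPSS, and the study's response data are not reproduced here); it assumes a rows-by-examinees, columns-by-items matrix of dichotomously scored (0/1) responses:

```python
import numpy as np

def item_analysis(responses):
    """Classical Test Theory item analysis for a 0/1 response matrix.

    responses: 2-D array, rows = examinees, columns = items.
    Returns (difficulties, uncorrected, corrected) per item.
    """
    responses = np.asarray(responses, dtype=float)
    n_items = responses.shape[1]
    total = responses.sum(axis=1)

    # Item difficulty: proportion of correct responses per item.
    difficulties = responses.mean(axis=0)

    uncorrected, corrected = [], []
    for j in range(n_items):
        item = responses[:, j]
        # Uncorrected item-total correlation: inflated, because the
        # item contributes to its own total score.
        uncorrected.append(np.corrcoef(item, total)[0, 1])
        # Corrected: remove the item from the total before correlating.
        corrected.append(np.corrcoef(item, total - item)[0, 1])
    return difficulties, np.array(uncorrected), np.array(corrected)
```

The corrected value correlates each item with the total score minus that item, removing the inflation described above; this mirrors the correction reported in the paper, though variable names here are purely illustrative.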

Results

With regard to response rate, 219 of 312 residents (70.2%) started the survey and knowledge assessment tools, and 214 (68.6%) completed both instruments [9]. The number of missing responses for individual items on the knowledge portion of the survey ranged from 1 of 218 (0.46%; difference between assist-control and pressure support) to 8 of 218 (3.7%; key physiology terminology; ventilating patients with obstructive disease and high resistance; and immediate troubleshooting of mechanical ventilation). Item difficulties ranged from 0.39 to 0.96, representing a reasonable mix of items spanning a difficulty range from relatively easy to quite difficult to answer correctly. The mean item difficulty for all 9 items was 0.75.
Uncorrected item discrimination values ranged from 0.111 to 0.556. Corrected item-total correlations ranged from 0.139 to 0.498. No item discrimination value was below the minimum acceptable point biserial correlation value of 0.1, indicating that all items discriminate between individual survey respondents in a desirable manner (Table 2). Internal consistency reliability, as measured by Cronbach’s alpha coefficient, was 0.6293 for the knowledge assessment. Split-half reliability, estimated using the Spearman-Brown coefficient for unequal lengths, was 0.6437.
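The two reliability coefficients reported here can be computed from a scored response matrix. A minimal Python sketch follows (synthetic usage, not the study data; note that the paper used the Spearman-Brown coefficient for unequal lengths, appropriate for a 9-item test split into halves of 5 and 4, whereas this sketch applies the simpler equal-length correction 2r/(1 + r) as an approximation):

```python
import numpy as np

def cronbach_alpha(responses):
    """Cronbach's alpha for a scored (e.g., 0/1) response matrix."""
    responses = np.asarray(responses, dtype=float)
    k = responses.shape[1]
    item_vars = responses.var(axis=0, ddof=1)
    total_var = responses.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def split_half_spearman_brown(responses):
    """Odd-even split-half reliability with the equal-length
    Spearman-Brown correction, 2r / (1 + r).

    With an odd number of items the halves are unequal; the study used
    the unequal-length variant of the coefficient, which adjusts for
    that. This equal-length correction is a simplification.
    """
    responses = np.asarray(responses, dtype=float)
    half1 = responses[:, 0::2].sum(axis=1)  # odd-numbered items
    half2 = responses[:, 1::2].sum(axis=1)  # even-numbered items
    r = np.corrcoef(half1, half2)[0, 1]
    return 2 * r / (1 + r)
```

Both functions take the same examinee-by-item matrix as the item analysis; on perfectly consistent data both coefficients approach 1.0, while the values of roughly 0.63-0.64 reported here reflect moderate internal consistency.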

Discussion

In this study, we describe the psychometric and performance characteristics of a knowledge assessment tool designed to assess EM residents’ understanding of concepts in mechanical ventilation. In a cohort of residents from multiple medical centers with a high response rate, we demonstrate an acceptable range of item difficulty and good item discrimination. Reliability, as demonstrated by Cronbach’s alpha and Spearman-Brown coefficient, was also acceptable [12]. Validity has previously been described with regard to construct, internal structure and relationship to other variables [9].
As the management of mechanically ventilated patients involves a variety of clinical and physiologic considerations, our knowledge assessment tool covered 9 discrete conceptual domains (Table 1). Item difficulty for all questions in our knowledge assessment tool ranged from 0.39 to 0.96, with a mean item difficulty of 0.7516, which is within the acceptable range for assessment tools [13]. No question had an item difficulty of less than 0.25 in this cohort of residents, demonstrating that no item was markedly affected by test-taker misinterpretation of the question or answer options.
Item discrimination for all 9 questions in our knowledge assessment tool was positive, with values of greater than 0.20 for the majority of questions [14,15]. This finding indicates that the questions included in the knowledge assessment tool adequately differentiate between residents with lower and higher levels of knowledge. Given that we hypothesized that residents who more frequently manage mechanically ventilated patients would perform better on a baseline knowledge assessment, this survey also served as a means to collect validity evidence for our novel assessment instrument [16].
These findings, coupled with acceptable reliability indices and appropriate construct validity, demonstrate that this knowledge assessment tool’s performance characteristics are likely to be durable in other populations. The item difficulty, item discrimination, reliability and validity results may inform future use of this knowledge assessment tool for evaluation or research purposes.
Future work could include assessing knowledge of mechanical ventilation in other cohorts of EM residents, from other geographic regions or backgrounds, to further characterize EM resident knowledge of mechanical ventilation and further describe the performance and psychometric properties of the knowledge assessment tool. Furthermore, the knowledge assessment tool could be used with other EM providers, such as attendings, physician assistants, nurses, or other health care professionals. Describing the performance and psychometric properties of our knowledge assessment tool in these populations would provide more insight into the tool’s generalizability and usefulness in populations beyond EM residents.
There are limitations to our study. The psychometric analyses presented here reflect a single cohort of EM residents, although the data were obtained in a multicenter study across 8 academic hospitals with EM residents from all years of postgraduate training. Furthermore, while the response rate was high, with 68.6% of the cohort completing the survey, non-responder bias may affect the item difficulty and discrimination analyses. In addition, we did not formally assess the acceptability of the knowledge assessment tool, although both pre-testing and the high response rate imply adequate acceptability. Finally, the generalizability of our results to other cohorts of EM residents or other EM providers has not been assessed.
In conclusion, psychometric analyses of our knowledge assessment tool for mechanical ventilation in the ED demonstrate acceptable item difficulty, good item discrimination, and satisfactory reliability. These acceptable psychometric parameters indicate that our knowledge assessment tool is sufficiently valid and reliable for use in future research studies or for assessment of EM residents for evaluative purposes.

Notes

Conflict of interest

No potential conflict of interest relevant to this article was reported.

Supplementary material

Audio recording of the abstract.
jeehp-13-10-abstract-recording.avi

References

1. Wood S, Winters ME. Care of the intubated emergency department patient. J Emerg Med 2011;40:419-427. http://dx.doi.org/10.1016/j.jemermed.2010.02.021
2. Fuller BM, Mohr NM, Miller CN, Deitchman AR, Levine BJ, Castagno N, Hassebroek EC, Dhedhi A, Scott-Wittenborn N, Grace E, Lehew C, Kollef MH. Mechanical ventilation and ARDS in the ED: a multi-center, observational, prospective, cross-sectional study. Chest 2015;148:365-374. http://dx.doi.org/10.1378/chest.14-2476
3. Allison MG, Scott MC, Hu KM, Witting MD, Winters ME. High initial tidal volumes in emergency department patients at risk for acute respiratory distress syndrome. J Crit Care 2015;30:341-343. http://dx.doi.org/10.1016/j.jcrc.2014.12.004
4. Brennan DF, Silvestri S, Sun JY, Papa L. Progression of emergency medicine resident productivity. Acad Emerg Med 2007;14:790-794. http://dx.doi.org/10.1111/j.1553-2712.2007.tb02353.x
5. Cushman JT, Witting MD. Emergency medicine resident scheduling and patient exposure. Acad Emerg Med 2003;10:816-818. http://dx.doi.org/10.1197/aemj.10.7.816
6. Gajic O, Frutos-Vivar F, Esteban A, Hubmayr RD, Anzueto A. Ventilator settings as a risk factor for acute respiratory distress syndrome in mechanically ventilated patients. Intensive Care Med 2005;31:922-926. http://dx.doi.org/10.1007/s00134-005-2625-1
7. Hodder R, Lougheed MD, FitzGerald JM, Rowe BH, Kaplan AG, McIvor RA. Management of acute asthma in adults in the emergency department: assisted ventilation. CMAJ 2010;182:265-272. http://dx.doi.org/10.1503/cmaj.080073
8. Warner KJ, Cuschieri J, Copass MK, Jurkovich GJ, Bulger EM. Emergency department ventilation effects outcome in severe traumatic brain injury. J Trauma 2008;64:341-347. http://dx.doi.org/10.1097/TA.0b013e318160dfb3
9. Wilcox SR, Seigel TA, Strout TD, Schneider JI, Mitchell PM, Marcolini EG, Cocchi MN, Smithline HA, Lutfy-Clayton L, Mullen M, Ilgen JS, Richards JB. Emergency medicine residents’ knowledge of mechanical ventilation. J Emerg Med 2015;48:481-491. http://dx.doi.org/10.1016/j.jemermed.2014.09.059
10. Goligher EC, Ferguson ND, Kenny LP. Core competency in mechanical ventilation: development of educational objectives using the Delphi technique. Crit Care Med 2012;40:2828-2832. http://dx.doi.org/10.1097/CCM.0b013e31825bc695
11. Crocker L, Algina J. Introduction to classical and modern test theory. Mason, OH: Cengage Learning; 2008.

12. Bland JM, Altman DG. Cronbach’s alpha. BMJ 1997;314:572. http://dx.doi.org/10.1136/bmj.314.7080.572
13. Tavakol M, Dennick R. Psychometric evaluation of a knowledge based examination using Rasch analysis: an illustrative guide: AMEE guide no 72. Med Teach 2013;35:3838-3848. http://dx.doi.org/10.3109/0142159X.2012.737488
14. Bechger TM, Maris G, Verstralen HHFM, Beguin AA. Using classical test theory in combination with item response theory. Appl Psychol Meas 2003;27:319-334. http://dx.doi.org/10.1177/0146621603257518
15. Downing SM. Item response theory: applications of modern test theory in medical education. Med Educ 2003;37:739-745. http://dx.doi.org/10.1046/j.1365-2923.2003.01587.x
16. Downing SM. Validity: on meaningful interpretation of assessment data. Med Educ 2003;37:830-837. http://dx.doi.org/10.1046/j.1365-2923.2003.01594.x

Table 1.
Major domains assessed with the knowledge assessment tool of mechanical ventilation for emergency medicine residents in the northeastern United States (2013)
Knowledge of vent modes - Assist control vs. pressure support
Pneumonia with hypoxemic respiratory failure - management of oxygenation and decreasing FiO2 when indicated
Management of ventilation for a patient being over-ventilated
Understanding key principles of acute respiratory distress syndrome (ARDS)
Understanding key principles of ARDS - Management of elevated plateau pressure
Overventilation in traumatic brain injury - management of traumatic brain injury
Pulmonary physiology in patient with asthma - understanding resistance
Management of vent in an asthmatic patient
Approach to trouble-shooting - managing vent alarms
Table 2.
Item difficulty, uncorrected and corrected item-total correlations for the knowledge survey of mechanical ventilation for emergency medicine residents in the northeastern United States (2013)
Item | N | Item difficulty | Uncorrected item-total correlation a) | Corrected item-total correlation b)
Assist control vs. pressure support | 217 | 0.6682 | 0.5315 | 0.3521
Understanding determinants of oxygenation | 215 | 0.9069 | 0.3396 | 0.1964
Understanding determinants of ventilation | 214 | 0.9626 | 0.2467 | 0.1622
Principles of ventilating patients with acute respiratory distress syndrome (ARDS) | 213 | 0.8403 | 0.5352 | 0.3738
Concepts regarding ventilation in ARDS, plateau pressures | 212 | 0.7264 | 0.6564 | 0.4895
Ventilation in brain injury | 211 | 0.3981 | 0.3825 | 0.1390
Critical physiology terminology | 210 | 0.6190 | 0.5647 | 0.3367
Ventilation in obstructive disease, high resistance | 210 | 0.8428 | 0.6209 | 0.4712
Immediate troubleshooting | 210 | 0.8000 | 0.4852 | 0.2881

a) Uncorrected item-total correlation: Pearson’s correlation coefficient (r) for each item and the total knowledge survey score.

b) Corrected item-total correlation: inflation-corrected point-biserial correlation value (rpbis) for each item and the total knowledge survey score.
