
JEEHP : Journal of Educational Evaluation for Health Professions



Research article
Content validity test of a safety checklist for simulated participants in simulation-based education in the United Kingdom: a methodological study
Matthew Bradley*

Published online: August 25, 2022

The University Hospitals Bristol and Weston NHS Foundation Trust Simulation Services, Bristol, UK

*Corresponding email:

Editor: Sun Huh, Hallym University, Korea

• Received: May 24, 2022   • Accepted: July 19, 2022

© 2022 Korea Health Personnel Licensing Examination Institute

This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  • Purpose
    Simulation training is an increasingly common means of healthcare education and often involves simulated participants (SPs), commonly known as actors. Simulation-based education (SBE) can sometimes endanger SPs, so we created a safety checklist for them to follow. This study describes how we developed the checklist through a quality improvement project and then evaluated feedback responses to assess whether SPs felt safe when using it.
  • Methods
    The checklist was provided to SPs working in an acute trust simulation service when delivering multidisciplinary SBE over 4 months. Using multiple plan–do–study–act cycles, the checklist was refined by reflecting on SP feedback to ensure that the standards of the safe simulation were met. We collected 21 responses from September to December 2021 after SPs completed an SBE event.
  • Results
    The responses showed that 100% of SPs felt safe during SBE when using our checklist. The average “confidence in safety” rating before using the checklist was 6.8/10, which increased significantly to 9.2/10 after using the checklist (P<0.0005). The checklist was refined throughout the 4 months and implemented in adult and pediatric SBE as a standard operating procedure.
  • Conclusion
    We recommend using our safety checklist as a standard operating procedure to improve the confidence and safety of SPs, thereby supporting safe and effective simulations.
Our healthcare system has undergone a vast transformation in recent times, with changes in service demand and provision requiring an effective and resourceful approach [1]. The current challenges we face highlight the need for dynamic and innovative ways of training healthcare professionals. Simulation-based medical education is growing as a model of training and is pivotal in adapting education to meet new healthcare demands and improve standards of care.
More recently, trends in simulation-based education (SBE) have involved adaptations to deliver increasingly realistic simulation, with the use of human role players, known as simulated participants (SPs), at an all-time high [2]. An SP is a person trained to portray a patient or another person (such as a relative, a member of the public, or an allied healthcare professional) in a simulation environment. The term “actor” is often used to describe an SP involved in a simulation, but the role of an SP differs from traditional acting roles in that SPs are part of the educational team and have a role in achieving the learning outcomes of each specific simulation (Table 1) [3-5].
The use of an SP creates a more realistic interaction between the patient and learner and enhances the training experience when compared to using manikins [4]. Furthermore, an SP provides feedback when debriefing a simulation, including how it felt to interact with the learners as a patient. This can provide significant value if the SBE aims to develop non-technical skills including communication, teamwork, empathy, education, and holistic care [5].
As a specialist group of faculty, SPs should be supported with the same considerations as learners and other faculty members. Clinical practice is often invasive and involves physical and emotional interactions with patients. In SBE, there is a risk that SPs may have interventions performed on them, such as oxygen delivery, venipuncture, and even administration of airway adjuncts. To provide a simulation that adheres to the highest possible standards, we need to ensure SPs are safe throughout the simulation and are well-prepared to contribute to an effective simulation experience.
This study aimed to determine the validity of a checklist to ensure the safety of SPs when performing roles such as patients, nurses, and relatives during SBE. We supported the safety of SPs within the SBE setting by creating a standard operating procedure (SOP). A checklist is one SOP format that enables clear, detailed instructions, and we chose it to provide a concise, stepwise framework to follow. Developing a safety checklist that could then be used as an SOP was a central aim of this work, as it would provide important foundations for new and experienced SPs to work in a safe environment. If the checklist gave SPs confidence in their safety during SBE, that would serve as supporting evidence for implementing it in SBE.
Ethics statement
Before all SBE sessions, we obtained informed consent from SPs for using the checklist, completing feedback, and for anonymized data collection. As the project was set within a simulation program, consent was also obtained for SPs and learners to participate in SBE. No formal ethics approval was sought as per advice from the Health Research Authority of the National Health Service, United Kingdom.
Study design
This was a methodological study of a measurement tool, conducted through a survey completed by SPs.
The initial SP checklist was created and utilized within the University Hospitals Bristol and Weston (UHBW) Simulation Services (Supplement 1), at a tertiary teaching hospital in the United Kingdom. This institution encompasses general inpatient and outpatient medical and surgical services, in addition to tertiary pediatric, heart, eye, dental, hematology, and oncology centers. The study period lasted 4 months, from September 2021 to December 2021.
Initially, the setting for using the checklist was SBE on adult inpatient wards from all of the aforementioned services. Over 3 months, the checklist was also integrated into pediatric SBEs. The cohort of SPs consisted of members of the simulation services from both clinical and non-clinical backgrounds.
Data sources/measurement
This checklist consisted of a laminated A4 sheet that contained objectives that the SP and simulation facilitator discussed and then checked once completed. To support SPs within the SBE setting, the checklist incorporated both the Association of Standardized Patient Educators Standards of Best Practice and the Association for Simulated Practice in Healthcare standards for healthcare simulation [5,6]. Using these standards as a framework enabled the development of a checklist that promoted best practice and considered the key principles of safety, accountability, quality, professionalism, and collaboration.
We grouped our objectives into 5 “domains” of SBE: preparation, pre-simulation pre-brief with the SP, delivery, debrief, and evaluation. The first 2 domains were to be completed before the simulation event, and the final 3 afterward (i.e., they were checked in retrospect). Each domain contained several concise, pertinent statements for the SP and facilitator to discuss and act upon. Each statement was followed by a check box, ensuring that the checklist was completed in its entirety. A comment box for each statement welcomed suggestions for improvement or other notes relating to the SBE, such as safe words.
As a model for improvement, the plan–do–study–act (PDSA) cycle was used to refine the checklist with small-scale changes after each use, if appropriate [7]. After each simulation, the facilitator and SP would complete the evaluation domain of the checklist. This allowed us to collect feedback, via a post-SBE questionnaire, on what could be improved. Through reflecting on the simulation, SPs using the checklist would comment on ideas for improvement after each use, initiating a new PDSA cycle of change (Fig. 1). Any following simulation would then use the new checklist, and to determine whether the changes made were an improvement, an evaluation was carried out after the subsequent simulation.
Anonymized data were collected through feedback questionnaires from all SPs involved in the simulation program (Supplement 2). Data were gathered on whether the SP felt safe throughout the SBE, the SP’s role during the simulation, scenario type, prior experience, confidence in safety before and after using the checklist, and comments for improvements in general. If post-simulation feedback stated that the SP felt safe throughout the SBE, we could be confident that safety remained satisfactory. When reflecting on things that could be improved after each SBE, we used rapid cycle evaluation [8]. This allowed regular changes to the checklist, which included editing or adding carefully phrased sentences that increased the chances of safe future simulation. Given there were numerous individual PDSA cycles, they were grouped into 4 “rounds” of cycles to compartmentalize the stages of checklist refinement.
There was no bias in selecting participants; all SPs involved in SBE were selected and agreed to complete the questionnaire.
The first main variable was SP safety, where we asked SPs whether they felt safe throughout the simulation or not. The second main variable was confidence in safety, whereby SPs reported their confidence in safety before and after using the checklist. The other variables compared were the type of SP, previous experience, and checklist safety rating.
Validity and reliability test results of the measurement tools
Content validity, which was grounded in our simulation program, was agreed upon by the authors before the first author created the questionnaire. The 3 variables (type of SP, previous experience, and checklist safety rating) showed significant reliability (Cronbach α=0.2730, degrees of freedom=819, P<0.05). Raw data for the reliability test are available in Dataset 1.
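For readers who wish to reproduce a reliability statistic of this kind, Cronbach's α can be computed directly from an item-score matrix. The sketch below uses NumPy with hypothetical scores, not the study data (which are in Dataset 1):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # per-item variance
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical example: 21 respondents rating 3 items on a 1-10 scale
rng = np.random.default_rng(0)
scores = rng.integers(1, 11, size=(21, 3))
alpha = cronbach_alpha(scores)
```

Values near 1 indicate high internal consistency among items; perfectly correlated items yield α=1.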
Study size
All SPs within the simulation program were invited to use the checklist and complete feedback following SBE completion over the 4 months. The G*Power tool (Heinrich-Heine-Universität Düsseldorf, Düsseldorf, Germany) was used for a post hoc analysis of sample size power [9], which showed sufficient power for tests of the statistical significance of SP confidence in safety before and after using the checklist. Using the matched-pairs Wilcoxon signed-rank test for SP confidence before and after checklist use, the power was 0.9994. The effect size, also calculated using G*Power, was 1.2294. The input values were: a 2-tailed test; parent distribution, Laplace; effect size, 1.2294; alpha error probability, 0.05; and sample size, 21.
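G*Power's algorithm is not reproduced here, but a power figure of this kind can be approximated by Monte Carlo simulation under the same assumptions (two-sided Wilcoxon signed-rank test, Laplace parent distribution, effect size 1.2294, alpha 0.05, n=21). This is a rough sketch using SciPy, not the procedure used in the study:

```python
import numpy as np
from scipy.stats import wilcoxon

def simulated_power(effect_size=1.2294, n=21, alpha=0.05, n_sims=1000, seed=1):
    """Approximate power of the two-sided Wilcoxon signed-rank test for
    paired differences drawn from a Laplace parent distribution."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_sims):
        # A Laplace distribution with scale 1/sqrt(2) has unit standard
        # deviation, so the location shift equals the standardized effect size.
        diffs = rng.laplace(loc=effect_size, scale=1 / np.sqrt(2), size=n)
        _, p = wilcoxon(diffs)
        rejections += p < alpha
    return rejections / n_sims

power = simulated_power()  # close to the 0.9994 reported by G*Power
```

With an effect size this large, nearly every simulated sample rejects the null hypothesis, which is consistent with the very high power reported.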
Statistical methods
After 4 months, all responses were collated in an Excel spreadsheet (available in Dataset 2). Statistical analysis was performed using the 2-tailed Wilcoxon signed-rank test to compare the average confidence before and after using the checklist, with P<0.05 considered statistically significant.
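As an illustration of this analysis, the comparison can be run with SciPy's `wilcoxon` function. The before/after ratings below are hypothetical values, not the study data (which are in Dataset 2):

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical paired confidence-in-safety ratings (1-10) for 21 SPs
before = np.array([7, 6, 8, 5, 7, 6, 7, 8, 6, 7, 6, 7, 8, 6, 7, 7, 6, 8, 7, 6, 7])
after = np.array([9, 9, 10, 8, 9, 9, 9, 10, 9, 9, 8, 9, 10, 9, 9, 10, 9, 10, 9, 9, 9])

# Two-tailed matched-pairs Wilcoxon signed-rank test
stat, p_value = wilcoxon(before, after, alternative="two-sided")
```

Because every hypothetical “after” rating exceeds its “before” counterpart, the test statistic is 0 and the result is highly significant.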
PDSA cycle: round 1
Initial feedback was obtained by discussing our plans for the checklist with simulation services colleagues and experienced external SPs. This allowed us to clarify the initial structure of all 5 domains and to ensure that the actions of the SP during the simulation were agreed upon before commencing the SBE. The first few uses of the checklist were by SPs in adult point-of-care SBEs. The scenarios involving deteriorating patients allowed us to reflect on the physical boundaries we set between SPs and learners, because many SBEs involve vital signs being taken, and putting on monitoring equipment can feel invasive. Interventions include putting on oxygen masks, which carry the risk of oxygen actually being delivered to an SP. It was important that the checklist reflected these risks and that the pre-brief allowed the SP and facilitator to agree on physical and emotional boundaries. The boundaries were then explained to learners with clear examples of acceptable and unacceptable interactions, to prevent safety from being compromised.
PDSA cycle: round 2
An emotionally challenging simulation allowed us to highlight the use of a safe word and empower the SP to pause or stop the SBE at any time, as discussed in the pre-brief. Within this domain, we also provided an opportunity to ensure the SPs were aware of the scenario well in advance and could opt out if they felt the SBE might trigger any emotional or physical distress. With these key changes and at the end of the first month of using the checklist, responses to feedback showed that 100% of checklist users felt safe throughout the simulation. This feedback provided a foundation to allow us to introduce the checklist to pediatric SBEs.
PDSA cycle: round 3
The checklist was now used by a variety of SPs both within and external to the simulation services team. Roles included simulated patients, parents, and relatives in adult and pediatric SBEs. Feedback involved seeking clarity on terminology, which prompted the addition of a glossary on the back of the checklist. This glossary explained key terms used on the checklist and during SBEs, making the checklist more accessible to new and inexperienced SPs. In addition, it is imperative for SPs to understand their role as a patient, relative, or professional; stating this specifically in the “preparation” domain allows the SP to fully appreciate their role in advance and to prepare accordingly. The increasing number of users warranted the addition of a QR code on the back of the checklist, which any SP could scan to fill in the feedback form.
PDSA cycle: round 4
At the end of the second month, the entire UHBW simulation services team was familiar with the checklist and comfortable using it as an SOP. The checklist was edited to carry UHBW logos and was now felt to be practical, robust, and compatible enough for wider use, with few changes made from this point onwards. Through a regional SP network, the checklist was introduced to other NHS trusts within the South-West region and a local academic institution, with plans for regional uptake. The checklist was discussed with these regional partners, who can change the logo and QR code to help implement the checklist as an SOP and collect their own feedback from SPs.
Data analysis
Feedback responses demonstrated an overall safety of 100% throughout the simulation events (Dataset 2). Over the 4-month period, feedback was collated from 21 SPs who used the checklist and completed the feedback form. Thirteen were patients, 1 was a confederate nurse, and the remainder portrayed various relatives. The majority (17/21) had previous experience of the role of an SP.
The mean rating for being confident of safety prior to using the checklist was 6.8/10, and after using the checklist this increased to a mean confidence in safety of 9.2/10. The Wilcoxon signed-rank test showed that a statistically significant improvement in confidence in safety occurred through using the checklist (Z=-3.5162, P<0.0005).
Key results
We achieved our primary objective of providing safe simulations, as evidenced by 100% of SPs feeling safe throughout the simulations. Furthermore, we demonstrated a significant increase in the confidence of the SPs in staying safe through the SBE, by using the checklist as an SOP after multiple PDSA cycles. These findings suggest that our checklist is a key component to providing SPs with safety and confidence during SBE and should therefore be used as an SOP.
These results demonstrating safety are likely due to an experienced SP cohort, expert facilitators within the simulation services, and a checklist incorporating integral standards of good simulation practice. The fact that we had enough confidence in our checklist to implement it as an SOP in both adult and pediatric point-of-care simulations shows the value of our quality improvement methodology. The structure of the PDSA cycle kept us to small-scale changes whose effects we could observe. Using predictions based on previous SBE allowed for rational, experience-based refinements to the checklist. Each analysis involved a thorough evaluation of and reflection on feedback, concluding with the implementation of improvements if deemed beneficial. This allowed us to move on to the next cycle and helped predict the next changes to apply. Adhering to this structure as a scientific process allowed us to use the data collected effectively to decide on the most appropriate next step.
The general benefits of using the PDSA cycle are well documented [10], and in our case each cycle increased our confidence in the developing checklist. We constantly learned, reflected, and adapted the SOP to tailor the checklist to the aims of the project (primarily, safe simulations), and from this we developed a universal checklist compatible with all types of SBE within the trust and beyond. Some small-scale changes did not obviously improve the checklist, but the structured PDSA approach allowed us to learn from these changes and carry out informed actions aiming for improvements in further cycles.
Both the checklist and the project have several limitations, which must be mentioned. Firstly, although the checklist aims to reduce the risk of unsafe actions as much as possible, it cannot guarantee safety. When immersed in a realistic scenario, a learner may do something unsafe despite a thorough and well-informed pre-brief and vigilant facilitator and SP. It is also difficult to analyze the effect of small, frequent improvements on an individual basis. For example, changes may not be relevant to the following simulation and thus it may not be apparent whether a change was successful. Safety was always reported as 100%, which is itself subjective. We therefore were unable to determine if SPs would have felt safe regardless, as we had no control group performing SBE without the checklist. Nonetheless, we have demonstrated safety while using this checklist in both experienced and inexperienced SPs.
Future directions
In addition to implementing the checklist as an SOP throughout regional trusts, there is an opportunity to use it in performance-based clinical examinations of both undergraduates and postgraduates. In this environment, the use of SPs has been shown to improve clinical competence and skill acquisition [11,12]. Another area of interest is the recent creation of regional courses for those wanting to become both SPs and facilitators. It is hoped that these courses can introduce the checklist to provide familiarity with using it in scenarios and to ensure a safe, immersive, and effective simulation from the beginning for those new to SBE.
This project supports the use of our checklist as an SOP to improve the confidence and safety of SPs. Ongoing development and refinement are crucial to continue improving the checklist for use in a wider region, within a variety of simulation-based educative events. With this, we hope that a new generation of SPs will be well prepared to confidently contribute to safe and effective simulation.

Authors’ contributions

All the work was done by Matthew Bradley.

Conflict of interest

No potential conflict of interest relevant to this article was reported.



Data availability

Data files are available from Harvard Dataverse:

Dataset 1. Raw data from checklist feedback.


Dataset 2. Response data from the simulated participants, including questionnaire.


Supplementary materials are available from Harvard Dataverse:
Supplement 1. Simulated participant checklist.
Supplement 2. Feedback questionnaire completed by simulated participants.
Supplement 3. Audio recording of the abstract.
Fig. 1.
The plan–do–study–act (PDSA) cycle used throughout the project. SBE, simulation-based education; SP, simulated participant.
Table 1.
Terminologies used in simulation-based education [3-5]
Term | Definition | Roles
Actor | A person who has probably not experienced the situation before, but brings skill to the role they portray | In lay terms, this refers to anyone playing a role
Simulated participant | A universal term describing all human role players within a simulation | May include patients, family members, visitors, and other members of the healthcare team
Standardized patient | A person who learns and follows a script rigidly during the scenario | In theory, multiple standardized patients can repeatedly play the role with the same response during a simulation
Simulated patient | A trained person who is part of the educational team, and is provided with information to play the role of a patient, relative, or other people within a simulation | A core script may be followed with a degree of improvisation; when repeated, exchanges may differ slightly each time
  • 1. Arisha A, Rashwan W. Modeling of healthcare systems: past, current and future trends. Proceedings of the 2016 Winter Simulation Conference (WSC); 2016 Dec 11-14; Washington, DC, USA. Piscataway (NJ): IEEE; 2016. p. 1523-1534.
  • 2. Cowperthwait A. NLN/Jeffries simulation framework for simulated participant methodology. Clin Simul Nurs 2020;42:12-21.
  • 3. Gantt LT, Young HM. Healthcare simulation: a guide for operations specialists. Hoboken (NJ): John Wiley & Sons Inc.; 2016. p. 170.
  • 4. Meerdink M, Khan J. Comparison of the use of manikins and simulated patients in a multidisciplinary in situ medical simulation program for healthcare professionals in the United Kingdom. J Educ Eval Health Prof 2021;18:8.
  • 5. Lewis KL, Bohnert CA, Gammon WL, Holzer H, Lyman L, Smith C, Thompson TM, Wallace A, Gliva-McConvey G. The Association of Standardized Patient Educators (ASPE) standards of best practice (SOBP). Adv Simul (Lond) 2017;2:10.
  • 6. Purva M, Baxendale B, Scales E. Simulation-based education in healthcare: standards framework and guidance [Internet]. Lichfield: Association for Simulated Practice in Healthcare; 2016 [cited 2022 May 20]. Available from:
  • 7. Langley GJ, Moen RD, Nolan KM, Nolan TW, Norman CL, Provost LP. The improvement guide: a practical approach to enhancing organizational performance. 2nd ed. San Francisco (CA): Jossey-Bass; 2009.
  • 8. Shrank W. The Center for Medicare and Medicaid Innovation’s blueprint for rapid-cycle evaluation of new care and payment models. Health Aff (Millwood) 2013;32:807-812.
  • 9. Faul F, Erdfelder E, Lang AG, Buchner A. G*Power 3: a flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behav Res Methods 2007;39:175-191.
  • 10. Leis JA, Shojania KG. A primer on PDSA: executing plan-do-study-act cycles in practice, not just in name. BMJ Qual Saf 2017;26:572-577.
  • 11. Williams B, Song JJ. Are simulated patients effective in facilitating development of clinical competence for healthcare students?: a scoping review. Adv Simul (Lond) 2016;1:6.
  • 12. Quail M, Brundage SB, Spitalnick J, Allen PJ, Beilby J. Student self-reported communication skills, knowledge and confidence across standardised patient, virtual and traditional clinical learning environments. BMC Med Educ 2016;16:73.
