Partnering Simulation with Human Factors: Shift Change Safety Checks
Time: Friday, April 16, 2:20pm - 2:40pm EDT
Location: Education and Simulation
Description: Simulation and human factors overlap significantly in their primary goals: improving safety, satisfaction, and efficiency. Both fields use related methods to evaluate systems and products in controlled environments, both to proactively reduce or eliminate the risk of harming end users and to recreate issues identified in other settings. However, many hospitals struggle to effectively partner simulation with human factors.
The Children’s Hospital Colorado simulation team historically focused on interprofessional patient crisis management training (e.g., code simulations). The team recently shifted to conducting proactive risk assessments and in-situ simulations through a partnership with the patient safety team, which includes a Human Factors Engineer. This partnership promoted collaboration in operational readiness, improved event investigations, and identified opportunities for proactive mitigations. One project that successfully incorporated simulation and patient safety—specifically human factors—was an in-situ study of nursing safety checks.
Case Study: Shift Change Safety Checks
Nursing safety checks are intended to ensure all life-critical equipment and supplies in the room are set up correctly at shift change in case of emergency. If there is a patient emergency and steps are not completed, this could lead to a serious safety event—significant patient harm or death. Through observation, we identified a risk of patient harm due to steps being partially completed or missed entirely.
We introduced a checklist as a cognitive aid to address issues with shift change safety checks’ step completion. The design of the checklist aligned with principles from the literature (e.g., limited words per step) and provided patient-specific information to reduce overreliance on memory. For example, the checklist provided the patient’s picture, name, medical record number, and date of birth to assist with full patient identification. The checklist reduced mental math by providing a calculation of +/-10% of the patient’s weight to check the emergency drug code sheet. Additionally, the checklist automated inclusion/exclusion of specific steps (e.g., check ventilators) and associated patient information (e.g., ventilator settings) based on orders in the electronic health record.
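As an illustration of the automation described above, the following is a minimal sketch of patient-specific checklist generation. The field names, order structure, and helper functions are hypothetical, not the hospital's actual implementation; only the ±10% weight tolerance and the EHR-driven step inclusion come from the description.

```python
# Hypothetical sketch of patient-specific checklist generation.
# Field names ("mrn", "orders", etc.) are illustrative assumptions.

def weight_range(weight_kg: float) -> tuple[float, float]:
    """Return the +/-10% band used to verify the emergency drug code sheet."""
    return round(weight_kg * 0.9, 1), round(weight_kg * 1.1, 1)

def build_checklist(patient: dict) -> list[str]:
    """Assemble checklist steps from (hypothetical) EHR patient data."""
    low, high = weight_range(patient["weight_kg"])
    steps = [
        f"Confirm identity: {patient['name']} (MRN {patient['mrn']}, DOB {patient['dob']})",
        f"Code sheet weight within {low}-{high} kg",
    ]
    # Include equipment-specific steps only when a matching order
    # exists in the electronic health record.
    if "ventilator" in patient["orders"]:
        steps.append(f"Check ventilator settings: {patient['orders']['ventilator']}")
    return steps

patient = {
    "name": "Jane Doe", "mrn": "123456", "dob": "2018-05-01",
    "weight_kg": 20.0,
    "orders": {"ventilator": "PC 18/5, rate 20"},
}
for step in build_checklist(patient):
    print(step)  # a 20.0 kg patient yields a 18.0-22.0 kg band
```

Precomputing the weight band and pulling equipment steps from active orders removes the mental math and recall burdens that the prose identifies as failure points.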
Checklist usage significantly improved step completion. We conducted four different trials in inpatient acute care units and an intensive care unit. In 145 observations of safety checks (938 total steps), fully completed steps increased from 59% to 83%, partially completed steps decreased from 27% to 15%, and missed steps dropped from 14% to 1.9%. However, many of the checklists spilled onto a second page due to design limitations, which heavily reduced satisfaction with the intervention. Therefore, we proposed reducing the checklist content to ensure all checklists print on a single page.
Human Factors and Simulation Study
We decided to partner human factors with simulation to evaluate the shortened checklist prior to implementation in patient care areas. Notably, the reduction of content on the checklists could potentially decrease safety check performance, thus increasing risk for patient safety events. The major benefit of using a simulation study is the ability to proactively evaluate changes in a tool without risk of harming patients, including setting up errors in the room to compare quality of checks across nurses.
The simulation study occurred in inpatient rooms using real supplies and equipment with a mannequin to best capture actual processes (e.g., correct orientation of patient bed to equipment/supplies such as the emergency drug code sheet and airway supplies). Participants were 30 inpatient nurses from two units who completed two safety checks: once each with the shortened and longer checklists. One error was introduced in each safety check. Checklist type and errors were counterbalanced. Participants were instructed to perform safety checks as they normally would.
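The counterbalancing described above can be sketched as follows. This is a hypothetical assignment scheme consistent with the design (two checks per participant, checklist order and introduced error balanced across participants), not the study's actual randomization procedure; the error labels are placeholders.

```python
# Hypothetical counterbalancing sketch: each participant completes two
# safety checks, and both checklist order and the introduced error are
# balanced across participants so neither is confounded with trial order.
from itertools import product

checklist_orders = [("shortened", "longer"), ("longer", "shortened")]
errors = ["error_A", "error_B"]  # placeholder error types

# Four conditions; cycling through them balances order and error pairing.
conditions = list(product(checklist_orders, errors))

def assign(participant_id: int) -> list[tuple[str, str]]:
    """Return the (checklist, introduced error) pair for each of two checks."""
    order, first_error = conditions[participant_id % len(conditions)]
    # The remaining error type is introduced during the second check.
    second_error = errors[1 - errors.index(first_error)]
    return [(order[0], first_error), (order[1], second_error)]

# e.g., participant 0: shortened checklist with error_A, then longer with error_B
print(assign(0))
```

Cycling participants through the four conditions ensures that, across the sample, each checklist is used equally often first and second and is paired equally often with each error type.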
The Human Factors Engineer provided patient handoff to the participant during a simulated shift change, monitored participants in the room for step completion and error detection, administered surveys, and provided briefings. The Simulation Educator oversaw room preparation while ensuring in-situ safety measures.
There were no differences in thoroughness of step completion (e.g., fully completed vs. missed steps) between the shortened and longer checklists. Interestingly, more errors were caught with the shortened checklist (19 of 30) than the longer checklist (12 of 30). We implemented the shortened checklist. Given the imperfect identification of errors, we concurrently implemented systems-level interventions to improve error detection beyond what a checklist could accomplish.
Simulation-Specific Study Considerations
Psychological safety—ensuring participants both feel safe and are safe—is a basic premise of simulation. Simulations typically avoid the use of deception; however, deception was necessary in this study to compare performance between the two checklists. In briefings we explicitly stated that the study’s purpose was to evaluate the tool rather than the participants’ performance. We placed a “simulation in progress” sign to improve awareness for non-participating staff. Participants were allowed to exit the study at any time without consequence.
Realism for the study was essential for ecological validity, even though this added cost (using actual rather than expired supplies). Simulation in clinical environments carries risk of harm if simulated or expired supplies/medication/equipment are inadvertently introduced into a patient care space (e.g., Raemer, Hannenberg, & Mullen, 2014). Many participants commented on how the in-situ setup facilitated accurate safety check processes over use of a formal simulation lab.
Recruitment and scheduling for our study proved challenging. We recruited on-shift nurses, which required both willingness and staffing resources to participate. Participation was optional, and unit leadership provided coverage.
Impact of Partnering Human Factors and Simulation
Overall, this study evaluated two cognitive aids (shortened checklist, longer checklist) without risk for harming patients. Use of the in-situ environment posed challenges, yet enabled high quality and realistic data collection. The partnership between human factors and simulation challenged both groups to collaborate on methods used in their respective fields and implementation considerations, such as error introduction and effective use of briefings. The shift change safety checklists project provided valuable insights into a patient safety risk while serving as a template for future work to evaluate and improve safety, satisfaction, and efficiency. Our system plans to continue collaborating due to the success of this project.