By: Alaina Herrington
This is the second part of a four-part series on the Evaluation of Educators Who Teach Simulation Pedagogy. Read part one of the series.
The INACSL Standards of Best Practice℠ (2016) recognize the need for ongoing training for simulation facilitators. In addition, the Society for Simulation in Healthcare accreditation standards (2016) require programs to ensure the ongoing development and competence of their simulation educators. Still, only a small percentage of simulation centers have an educator evaluation process in place. Part of the difficulty in implementing such a process is that educators are often fearful of receiving feedback, a fear that may be driven by a lack of training and experience in receiving it. Following is a discussion of my experience creating evaluation processes at a community college and at a large academic medical center to ensure simulation standards are being met.
In my previous role as founding simulation director of a community college simulation center, I built the department from the ground up. I created educator simulation training materials and watched many of the educators evolve into simulation experts over several years. I worked with administrators and the simulation committee to create procedures under which educators were evaluated on their simulation skills at least annually.
To begin the evaluation process, I gave oral feedback to each educator after their debriefings. I also asked educators to complete the Center for Medical Simulation (n.d.) Debriefing Assessment for Simulation in Healthcare (DASH) self-assessment tool to become familiar with the rubric. This gradual adoption process allowed the educators to become comfortable with being evaluated. Each educator's initial self-assessment was followed the next semester by an evaluation from an educator with multiple years of simulation experience. Using the DASH tool, most evaluators rated their peers a 7 (extremely effective and outstanding). We then conducted a peer feedback workshop, and evaluators learned the benefits of honest feedback.
Peer reviews became a semi-annual process recorded in an electronic database that allowed administrators to monitor educators' performance. (We selected a web app written in Django backed by a MySQL database; if you don't have IT assistance, Excel or Google Forms would work.) Educators had to score an average of at least 5 on each DASH element, and simulation educators who did not achieve this score were required to complete a remediation process.
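For centers building something similar, the sketch below shows one way such records could be modeled in Django. The model names, the JSON score format, and the threshold check are illustrative assumptions, not our center's actual schema.

```python
# Minimal sketch of Django models for tracking peer DASH evaluations.
# Model names, fields, and the score format are assumptions for illustration.
from django.db import models

class Educator(models.Model):
    name = models.CharField(max_length=100)

    def element_averages(self):
        """Average score per DASH element across all of this educator's reviews."""
        totals, counts = {}, {}
        for review in self.evaluations.all():
            for element, score in review.element_scores.items():
                totals[element] = totals.get(element, 0) + score
                counts[element] = counts.get(element, 0) + 1
        return {e: totals[e] / counts[e] for e in totals}

    def needs_remediation(self, threshold=5):
        """True if any element's average falls below the required score of 5."""
        return any(avg < threshold for avg in self.element_averages().values())

class DashEvaluation(models.Model):
    educator = models.ForeignKey(
        Educator, on_delete=models.CASCADE, related_name="evaluations")
    evaluated_on = models.DateField(auto_now_add=True)
    # One 1-7 score per DASH element, e.g. {"element_1": 6, "element_2": 5, ...}
    element_scores = models.JSONField()
```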
The remediation process included observing two debriefings led by an experienced simulation educator, followed by co-debriefing two other simulations. In addition, as part of their annual employee evaluation, all faculty members who conducted simulations were required to provide course materials, learner outcomes, student satisfaction surveys, and other records of simulation teaching performance to their directors or chairs.
With this elaborate evaluation process in place, there were still areas needing improvement. To meet these learning needs, I worked with the skills lab manager, Rebecca Cockrell, to create the Hinds Community College Simulation Rubric (Herrington & Cockrell, 2017). The rubric was used to kick off a simulation competition, complete with a paid day off for the winning simulation team. The competition was designed to highlight how simulation educators can incorporate the INACSL Standards of Best Practice℠ across the curriculum. Faculty were given the opportunity to partner in teams and implement a simulation in the classroom, skills lab, or simulation center. To engage faculty, the competition was announced at the beginning of football season at a spontaneous faculty pep rally. The goal was for faculty to “Beat the Challengers” in implementing simulation best practices.
At the large academic medical center where I now work, simulation educators from all over the world bring diverse points of view on simulation education. To assess their understanding, our simulation staff use the Facilitator Competency Rubric when meeting with new educators to evaluate their simulation knowledge (Leighton, Mudra, & Gilbert, 2018). If weak areas are identified, the educator is assigned online learning modules in those areas to ensure that best practices are implemented. (The online learning modules we use were developed through a partnership involving the University of Mississippi Medical Center, Indiana University, and Franciscan Health. Similar modules can be used, including Simulation Innovation Resource Center [SIRC] courses.)
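As a rough illustration, the mapping from identified weak areas to assigned modules can be as simple as a lookup table. The domain names, module titles, and cutoff below are hypothetical placeholders, not the actual rubric categories or our module catalog.

```python
# Hypothetical sketch of assigning learning modules from rubric results.
# Domain names, module titles, and the cutoff are illustrative assumptions.
MODULES_BY_DOMAIN = {
    "prebriefing": "Module: Designing Effective Prebriefings",
    "facilitation": "Module: Facilitating Across the Simulation",
    "debriefing": "Module: Structured Debriefing Methods",
}

def assign_modules(rubric_scores, threshold=3):
    """Return the modules for any domain scored below the threshold."""
    return [module for domain, module in MODULES_BY_DOMAIN.items()
            if rubric_scores.get(domain, 0) < threshold]

# Example: an educator strong in prebriefing but weak in debriefing.
print(assign_modules({"prebriefing": 4, "facilitation": 3, "debriefing": 2}))
# -> ['Module: Structured Debriefing Methods']
```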
After new educators complete the assigned modules, simulation staff observe the first two simulations they facilitate to verify competency. Each educator is then evaluated annually using the DASH tool. Beginning in 2019, the simulation center will provide simulation champions with a digital credential recognizing their simulation competency achievements. Wikipedia describes a digital credential as “the digital equivalent of paper-based credentials. A digital credential is a proof of qualification, competence, or clearance that is attached to a person.”
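Digital credentials like these are often issued as Open Badges; the sketch below shows roughly what a hosted badge assertion might contain, loosely following the Open Badges 2.0 format. All URLs, identities, and the badge itself are hypothetical placeholders, not our center's actual credential.

```python
# Sketch of a hosted digital credential, loosely following Open Badges 2.0.
# Every URL, identity, and date here is a hypothetical placeholder.
credential = {
    "@context": "https://w3id.org/openbadges/v2",
    "type": "Assertion",
    "id": "https://example.edu/badges/assertions/123",          # placeholder URL
    "recipient": {
        "type": "email",
        "hashed": False,
        "identity": "educator@example.edu",                     # placeholder
    },
    "badge": "https://example.edu/badges/simulation-champion",  # BadgeClass URL
    "issuedOn": "2019-01-15T00:00:00Z",
    "verification": {"type": "hosted"},
}
```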
The challenges simulation centers face in implementing onboarding and continual evaluation practices can be formidable. However, simulation staff can work with simulation stakeholders and organizational administrators to develop evaluation procedures similar to the ones described here, ensuring simulation educators uphold simulation standards.
References
Center for Medical Simulation. (n.d.). Debriefing Assessment for Simulation in Healthcare (DASH).
Herrington, A., & Cockrell, R. (2017). Hinds Community College Simulation Rubric.
INACSL Standards Committee. (2016). Standards of Best Practice: Simulation℠. Clinical Simulation in Nursing, 12(Suppl), S1-S50.
Leighton, K., Mudra, V., & Gilbert, G. E. (2018). Development and psychometric evaluation of the Facilitator Competency Rubric. Nursing Education Perspectives, 39(6), E3-E9. https://doi.org/10.1097/01.NEP.0000000000000409
Society for Simulation in Healthcare Committee for Accreditation of Healthcare Simulation Programs. (2016, May). Core standards and measurement criteria.