Simulation-based education (SBE) literature emphasizes debriefing frameworks, with little discussion of how SBE competencies are developed. Introduced in 2005 by the Royal College of Physicians and Surgeons of Canada, Entrustable Professional Activities (EPAs) offer a robust curriculum development and assessment process for workplace-based assessments. There is a paucity of literature on EPAs related to simulation and on how simulation faculty move from novice to independent practice. The objective of this curricular innovation project was to develop standardized EPAs and milestones to assess the independence of simulation faculty by the end of mentorship. Using a modified Delphi technique, the team identified expert faculty to rate the level of importance of each EPA and milestone. Five EPAs were identified: Technology; Scenario Design; Simulation Facilitation; Prebriefing; and Debriefing. EPAs provide a structured framework for tracking progress, targeting areas for formative feedback and offering opportunities to advance and transform faculty development for simulation programs.
What this essay adds
• There is a paucity of literature on the mentorship of simulation faculty as they move from novice to independent practice.
• Given the current gap in simulation faculty development and mentorship, there is a need for a standardized formative assessment approach that requires structured, observation-based assessment of all domains of simulation competence, including technology, scenario design, simulation facilitation, prebriefing and debriefing.
• Building on the competency-based medical education approach, established by the Royal College of Physicians and Surgeons of Canada, the Provincial Simulation team developed five Entrustable Professional Activities (EPAs) and associated milestones, which offer a robust curriculum development and assessment process for simulation faculty development.
• Using a modified Delphi technique, the team identified expert interprofessional faculty from rural and urban centres across Alberta to rate the level of importance for each EPA and milestone, facilitating development of a valid and reliable Entrustable Professional Activities: Faculty Assessment for Simulation Tool (EPA-FAST).
• The EPA-FAST is a highly replicable tool that provides a clear structured framework for the systematic formative assessment of faculty towards safe independent practice. It can be generalized to other simulation programs, and it provides a significant advancement to the field of simulation through standardizing mentorship and faculty development programs.
The simulation-based education (SBE) literature emphasizes debriefing frameworks and methods to maintain the quality of simulation facilitators, with little discussion detailing how faculty develop SBE competencies over time [1–3]. Despite its importance, simulation faculty development concentrates primarily on foundational skills, such as debriefing [4–6], and neglects to describe a trajectory through which simulation faculty develop these skills from novice to independent practice.
Currently, there are several approaches to faculty development for simulation facilitators. One common and effective approach is peer coaching [7]. Peer coaching can include teaching specific to (i) psychological safety, (ii) frameworks, (iii) method/strategy, (iv) content, (v) learner-centredness, (vi) co-facilitation, (vii) time management, (viii) difficult situations, (ix) debriefing adjuncts and (x) individual style and experience [7]. Alternatively, mentorship as an approach to faculty development creates targeted, learner-centred opportunities that promote the development and sustainment of expert SBE skills, knowledge, attitudes and behaviours [8]. Priorities of mentorship programs for simulation faculty development include creating a safe learning environment, fostering a nurturing relationship, and encouraging and modelling deliberate self-reflection with feedback. The emphasis is also on promoting ample opportunities to facilitate and sustain debriefing and facilitation skills, and on supporting healthcare facilitators who juggle multiple responsibilities [9].
It has been recognized that a structured, tiered approach to faculty development, mentorship and certification ensures quality instruction and includes observation, didactic and interactive experiential learning, practice, expert feedback and mentoring [3]. Introduced in 2005 by the Royal College of Physicians and Surgeons of Canada as part of competency-based medical education (CBME) [10], Entrustable Professional Activities (EPAs) offer a robust curriculum development and assessment process for faculty development and workplace-based assessments through a continuum of knowledge acquisition to application and proficiency [11,12].
While there is emerging evidence on the development and application of EPAs and associated milestones for medical residents and health professional education [13–17], there is a paucity of literature on EPAs specifically for faculty development across a healthcare simulation career. EPAs are defined as reliable, ‘observable tasks’ that simulation facilitators are ‘trusted’ or expected to be able to perform independently by the end of mentorship [14,18]. A milestone is a specific observable marker of an individual’s ability along a developmental continuum (i.e. as they progress from beginner tasks to more complex tasks and towards independent practice) [19–21]. EPAs and milestones focus on the appropriate expectations that the mentor trusts the simulation faculty to perform safely and independently, and help identify achievements and targeted areas for improvement within the workplace environment [18].
Just as clinicians need EPAs to develop and demonstrate competence, so must simulation facilitators have entrustable skills, knowledge and attitudes; passion alone is no longer adequate to achieve simulation excellence [19,22]. EPAs can be useful in assessing readiness to practice, but entrustability cannot be determined by a single simulation event, coaching or mentorship session [20]. Further, there are several applications of EPAs within CBME, including both undergraduate and graduate studies [19,21], and beyond medical education; for example, Keating et al. described the use of EPAs to ensure nurse practitioners’ readiness using SBE [20].
An identified gap for the simulation community has been the lack of standardization of the core competencies required to reliably mentor faculty towards best practice, as well as of defining and monitoring essential competency progression over time. Currently proposed frameworks for the competencies of simulation facilitators include topics on simulation curriculum, educational theory, assessment, debriefing, simulation research, simulation operations and administration [12]. Thomas and Kellgren applied Benner’s novice-to-expert model to simulation faculty development as a conceptual framework for simulation faculty, yet there is no standard approach in the literature describing how simulation faculty develop competencies and skills from novice to expert, independent practice over time [23].
There are also few valid and reliable evaluation tools to formatively assess simulation faculty, outside of the Debriefing Assessment for Simulation in Healthcare (DASH) [24,25], which focuses primarily on debriefing skills alone and not on formative and summative assessment of the skills of simulation faculty across the continuum of their career. Similarly, the Facilitator Competency Rubric (FCR) was developed for formative and summative evaluation of the competency of simulation facilitators, with scores that guide and prioritize faculty development but evaluate faculty at only one point in time [26]. The FCR includes components of preparation, prebriefing, facilitation, debriefing and evaluation. Each component has a rating that differentiates between those who are competent, those who need help (beginner/advanced beginner) and those who can provide that help (proficient/expert). The FCR is targeted at facilitators in academic undergraduate nursing simulation settings, not specifically at simulation faculty providing continuing education within the healthcare environment [26].
In alignment with CBME, the Provincial Simulation program in Alberta, Canada, addressed this identified gap by developing a novel set of EPAs and milestones specifically targeting formative assessment of competencies for SBE. While mapping of EPAs and milestones has traditionally been used for residency training [27], this novel curricular development of EPAs for simulation faculty training illustrates an education innovation to advance standard competencies in SBE, which to the authors’ knowledge has not been done by other simulation programs globally. Applying EPAs to simulation faculty development can serve as a framework across the spectrum of health science education and in a variety of education domains to achieve higher levels of proficiency and mastery within the workplace [27]. It has been recognized that EPA application can go beyond CBME for physicians or healthcare professionals: EPAs can be used as an agenda for further development and research across all levels of the educational continuum and implemented across disciplines and professions for continuing professional development and certification [11,28,29]. Harnessing EPAs and milestones for formative assessment of simulation faculty is an opportunity for significant advancement in transforming and standardizing faculty development and mentorship for simulation programs globally.
The goal of this curricular innovation evaluation paper is to describe the use of a modified Delphi technique to develop standardized EPAs and milestones that a simulation faculty member is trusted to perform independently by the end of a faculty development mentorship program.
In 2017, the Provincial Simulation program completed a needs assessment of independent simulation faculty and champions across Alberta to gain a better understanding of the current state of faculty development needs and to explore gaps in SBE mentorship design, tools, resources and the lack of standardization of expected competencies. Prior to this needs assessment, there had been no formal inventory over the previous 10 years of simulation faculty’s continuing education needs, upskilling opportunities, education resources, mentorship, peer feedback, evaluation of outcomes and certification.
The needs assessment results highlighted a mismatch in resources, delivery and formal assessment of simulation faculty. The Provincial Simulation program used various tools and approaches to faculty development and mentorship that, without provincial standardization, remained siloed across sites based on geographic location. The findings necessitated a review of current processes to align with the requirements of national simulation accreditation standards, which include the domains of governance, infrastructure, education and healthcare systems.
Simulation accreditation was recognized as an opportunity to standardize the simulation curriculum, as well as to integrate formative assessment and evaluation of education approaches to faculty development and mentorship. The process of applying for national accreditation allowed the program to take stock of how it was measuring its capacity, growth and training beyond the initial novice courses in simulation, and to initiate future planning for maintaining and upskilling its existing faculty.
To identify priority areas for future planning of the program, the provincial program implemented a systematic inquiry, applying a SWOT (Strengths, Weaknesses, Opportunities and Threats) analysis [30] aligned with the national simulation accreditation standards (see Table 1).
Table 1. SWOT analysis aligned with the national simulation accreditation standards

| Accreditation Standard Requirement | Strengths | Weaknesses | Opportunities | Threats |
|---|---|---|---|---|
| Infrastructure: There is a process in place to perform regular peer assessments and feedback on the performance of the instructor. | Assessment tools exist in the simulation literature (i.e. DASH¹, OSAD², peer debrief, plus-delta). The tools are used intermittently for faculty/peer and self-evaluation. | Although expert faculty mentors engage in formative peer assessment, no formal tracking of performance is in place. | A formalized working group is in place to develop standardized curriculum and tools. | Capacity of the team is impacted by competing demands of frontline simulation, projects and ongoing expansion of simulation throughout the province. |
| Education: Minimal expectation includes scenario development, learning objectives, facilitation and debriefing (>1 hr workshop). Encourage progression of learning through continued training, observation, co-facilitation and feedback. | The current Workshop in Simulation Education (WISE 1³) focuses on debriefing skills, facilitation and scenario development. A defined mentorship program was adopted with some members of the simulation team. | No formalized or structured feedback approach in the curriculum. The vast geography of the province has historically hindered training and development. | Opportunity to develop competency with the new curriculum. Ability to expand instructional design to include podcasts and webinars, and to leverage virtual options to overcome geographical limitations. | The Provincial Simulation team is unstandardized in its current approach to mentorship as well as in the delivery of content of the existing WISE 1 course. |
| Curriculum Evaluation: There is a quality review process in place whereby curriculum evaluation data, for individuals or groups, are used to help modify and improve the curriculum and the delivery of courses/sessions to ensure that all educational objectives continue to be met adequately. | The WISE 1 evaluation tool allowed for collation and dissemination of course-level feedback among simulation faculty. | Data from the WISE 1 evaluation were not consistently collected, measurable or observable to inform faculty development and mentorship. No knowledge management system (KMS) in place to support collation, theming and dissemination of evaluation data. | Opportunity to develop program evaluation and formative assessment tools alongside the new faculty development curriculum and mentorship. Opportunity to evaluate new faculty in mentorship as part of ongoing formative assessment, peer feedback and program quality review. | Cost to hire an Education Lead to manage faculty development and mentorship provincially, the capacity of current expert faculty/consultants to mentor new faculty, and resources to procure a KMS with health authority fiscal restraints in place. |

¹ DASH: Debriefing Assessment for Simulation in Healthcare.
² OSAD: Objective Structured Assessment of Debriefing.
³ WISE: Workshop in Simulation Education (foundational 2-day course for simulation faculty).
CBME focuses on the use of milestones and EPAs to provide structure for teaching, learning and assessment [31]. An EPA is an essential task of a discipline (profession, specialty or subspecialty) that an individual can be trusted to perform without direct supervision in a given healthcare context, once sufficient competence has been demonstrated [28,29,32]. Medical curricula have moved through many iterations, from time-based models to competency-based models and, most recently, to the addition of EPAs [11].
Building on the EPA approach from CBME [11], the Provincial team developed an Entrustable Professional Activities: Faculty Assessment for Simulation Tool (EPA-FAST) for new simulation faculty starting mentorship. This EPA-FAST focuses on the trusted tasks of the discipline and the appropriate expectations that simulation faculty can perform safely and independently while also tracking achievements and targeted areas for improvement. Within each EPA is a series of milestones, or specific observable tasks, that require sign-off as faculty advance in mentorship.
As part of the curriculum mapping exercise, the Provincial EPA Faculty group took into consideration EPAs and milestones for faculty development that aligned with the competencies from the new Provincial Faculty Development Curriculum. Through curriculum mapping, they identified gaps, which led to modification of milestones and ensured alignment with Operational Expectations and Procedures, the Strategic Plan and national simulation accreditation standards. Further, existing tools and simulation curriculum standards in the literature were considered in the development of the EPAs and milestones, including an existing internal mentorship document, the Harvard DASH [24,25], the International Nursing Association for Clinical Simulation and Learning (INACSL) Standards [33], the Royal College of Physicians and Surgeons of Canada (RCPSC) [34] and the Canadian Patient Safety Institute (CPSI) [35].
The modified Delphi method is a group consensus strategy that systematically uses literature review, opinion of stakeholders and the judgement of experts within a field to reach agreement [36]. The goal of the modified Delphi in our curricular innovation project was to decrease the number of EPAs and specific milestones and to improve the clarity of the language so that each EPA/milestone would resonate with groups of experts across a range of disciplines, clinical areas and levels of expertise. We chose a core group of experts considered important and knowledgeable in the field of SBE to assist us with the consensus strategy.
An expert is defined as one who is knowledgeable about the subject of SBE and capable of representing the views of his or her peers [37]. Several Delphi studies recommend using 10–20 carefully selected expert respondents, enough to provide a range of opinions but also few enough for the research team to be able to summarize and integrate those opinions [37].
The modified Delphi review was completed by the EPA Faculty group as well as 20 Simulation Experts with diverse experience in provincial simulation programs. In addition to providing tracked changes and feedback, experts rated the questions below:
1) Does this EPA and associated milestones resonate with you as key observable tasks of the discipline required for a simulation faculty to practice independently? (yes/no)
2) Do you see ways to improve the strength of the language? If so, please re-write, add comments or suggest combining with another EPA/milestone(s).
3) Using a 4-point scale: extremely important (3), very important (2), moderately important (1), not important (0), how important is this EPA and associated milestone(s) for a simulation faculty to practice independently?
In total, three Delphi rounds were completed including a first-round review by EPA Faculty group and second- and third-round review by simulation experts.
Driven by the accreditation standards and the provincial governance model, there was an identified need to increase diversity in the simulation experts’ group by varying years of experience, professional roles and geographic representation. This was important both for membership of the EPA Faculty group and for the external Expert Simulation Faculty. The team was representative of multiprofessional rural, urban and academic experience.
The EPA Faculty group which completed round 1 of the modified Delphi included nine members, inclusive of a Medical Director (n = 1), Research Scientist (n = 1), Education Coordinator (n = 1), Technical Consultant (n = 1), Simulation Lead (n = 1) and Expert Faculty Mentors/Consultants (n = 4) from both rural and urban centres to ensure comprehensive representation of disciplines and expertise.
The Expert Simulation Faculty for the Delphi review included stakeholders employed by the provincial health authority, with interprofessional representation (n = 4) and diverse experience in simulation (between 5 and 15 years) across academic, rural and urban settings. Figure 1 provides an overview of the demographics of the 20 Expert Simulation Faculty in rounds 2 and 3 of the Delphi method.
The final 5 EPAs and 31 associated milestones identified after the completion of three rounds of modified Delphi were: (1) Technology, (2) Scenario Design and Fidelity-Realism, (3) Simulation Facilitation (Considerations for Session Planning and Implementation), (4) Prebriefing and (5) Debriefing. See Supplementary material for the EPA-FAST.
The Expert Simulation Faculty for rounds 2 and 3 of the modified Delphi were also asked to rate, on a 4-point scale, how important each of the five EPAs and associated milestone(s) was for a simulation faculty to practice independently: extremely important (3), very important (2), moderately important (1), not important (0). The average ratings for rounds 2 and 3 are summarized in Figure 2.
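As an illustration of how such ratings can be collated, the short sketch below averages each EPA's importance ratings across experts. It is a minimal, hypothetical example (the expert identifiers, EPA subset and scores are invented) and not the analysis code used in this project.

```python
# Minimal sketch: averaging expert importance ratings per EPA.
# Expert identifiers, EPAs and scores below are illustrative only.
from collections import defaultdict
from statistics import mean

# Each tuple: (expert_id, epa_name, importance_rating on the 4-point scale)
responses = [
    ("expert_01", "Prebriefing", 3),
    ("expert_01", "Debriefing", 3),
    ("expert_02", "Prebriefing", 2),
    ("expert_02", "Technology", 1),
]

# Group ratings by EPA
ratings_by_epa = defaultdict(list)
for expert_id, epa, rating in responses:
    ratings_by_epa[epa].append(rating)

# Report the average rating and number of responses for each EPA
for epa, ratings in sorted(ratings_by_epa.items()):
    print(f"{epa}: mean importance = {mean(ratings):.2f} (n = {len(ratings)})")
```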
The following sections describe the key findings after each of the rounds of modified Delphi.
In the first round of the modified Delphi, the milestones were separated into levels of skills: Beginner, Novice, Advanced and Expert. This categorization, however, was found to be complex and required a grounded understanding of what was deemed to be beginner vs. novice or expert. Research-Scholarship, Patient Safety and Teamwork and Culture were initially included as EPAs but then were removed as it was difficult to align with clearly observable tasks.
In the second round of the modified Delphi, the Logistics EPA was removed, and operational-based checklists were created for the provincial simulation program. These were recognized as being specific to the individual program and therefore less generalizable across institutions outside of Alberta.
Several experts were confused by the phrasing, ‘safety competencies’ in the Scenario Design EPA and questions also arose with the use of the Promoting Excellence and Reflective Learning in Simulation (PEARLS) framework as the exclusive debriefing tool [4]. Several experts cited that scenario development was not essential in their work where pre-existing curriculum is most often used. Some experts gave the ‘Technology’ EPA a low rating and several milestones were deemed unnecessary. Concerns were also raised regarding the narrow focus of the ‘Setting the Stage’ EPA, with suggestions to instead explore milestones for facilitation and post-session practices.
The changes made in response to the second round of the modified Delphi included a title change of the Setting the Stage EPA to Simulation Facilitation and Implementation to better encompass pre/during/post simulation facilitation. Further, the use of standard nomenclature aligning with the simulation program’s Operational Expectations and Healthcare Simulation Dictionary [38] led to generalized rewording of these milestones.
In round 3 of the modified Delphi, further changes were made to the Technology EPA and milestones, for example simplifying the troubleshooting language. Twelve milestones were merged into five, and specific technical or clinical language was removed to improve generalizability. Confusion with the term ‘embedded participant’ was noted by experts; therefore, a definition of an embedded participant was added. Although several experts wanted to include additional milestones for procedural task trainer skills, the group decided not to include a procedural-based inventory, as this was a request specific to one group for residency training. Following the third round of the modified Delphi, the EPA Faculty group reviewed the results for consensus. Any items that did not achieve agreement were dropped or revised for clarity. The final analysis of the three iterative rounds of the modified Delphi revealed stability between successive rounds. Consensus was achieved between the Expert Simulation Faculty and the EPA Faculty group on all items by the third round, which led to the finalization of the EPA-FAST. Figure 3 highlights the evolution of the number of track changes, milestones and EPAs from the initial round by the EPA Faculty group to the third round of the Delphi. In summary, we started with 9 EPAs and 144 milestones, and by round 3 of the modified Delphi the experts agreed on 5 EPAs and 31 milestones, with a total of 228 track changes from the original document.
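To illustrate the kind of between-round comparison that informs stability and consensus judgements, the sketch below compares the round 2 and round 3 average EPA ratings reported in Table 2 against a stability tolerance and an importance threshold. The numeric criteria are illustrative assumptions only, not the criteria applied by the EPA Faculty group, whose consensus review was conducted qualitatively.

```python
# Hypothetical sketch of a between-round stability and consensus check.
# Average ratings are taken from Table 2; the tolerance and threshold values
# are assumed for illustration and were not used in the study.
round2 = {"Technology": 3.0, "Setting the Stage": 3.3, "Scenario": 3.5,
          "Prebrief": 3.7, "Debrief": 3.8}
round3 = {"Technology": 3.05, "Setting the Stage": 3.8, "Scenario": 3.75,
          "Prebrief": 3.95, "Debrief": 3.95}

STABILITY_TOLERANCE = 0.5   # assumed maximum acceptable shift between rounds
CONSENSUS_THRESHOLD = 3.0   # assumed minimum average importance to retain an item

for item, r3 in round3.items():
    r2 = round2[item]
    shift = r3 - r2
    stable = abs(shift) <= STABILITY_TOLERANCE
    retained = r3 >= CONSENSUS_THRESHOLD
    print(f"{item}: round 2 = {r2:.2f}, round 3 = {r3:.2f}, "
          f"shift = {shift:+.2f}, stable = {stable}, retained = {retained}")
```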
Table 2 describes the specific track changes of EPAs and milestones through each evolution of rounds 1–3 of the modified Delphi.
Table 2. Changes to EPAs, milestones and taxonomy through rounds 1–3 of the modified Delphi

| | Round 1 | Round 2 | Round 3 |
|---|---|---|---|
| EPAs | Total # EPAs pre-round: 9. Total # EPAs post-round: 6. Initial EPAs aligned with the language and domains of the Provincial Simulation Program Faculty Development. Three EPAs (Research-Scholarship, Patient Safety and Teamwork/Culture) were removed due to difficulty finding observable tasks; these concepts were already embedded in other EPAs. An overall assessment rating for each EPA is included in the tool. | Total # EPAs pre-round: 6. Total # EPAs post-round: 5. The Logistics EPA was removed, and operational-based checklists were created for the provincial simulation program. The title of the ‘Setting the Stage’ EPA was changed to ‘Simulation Facilitation and Implementation’ to better encompass simulation facilitation. Average EPA rating from experts (1 = not important to 4 = extremely important): Technology 3; Setting the Stage 3.3; Scenario 3.5; Prebrief 3.7; Debrief 3.8. | Total # EPAs: 5. No change post-round. The overall assessment rating for each EPA was removed; the rating scale is used as the overall indicator of performance. Average EPA rating from experts (1 = not important to 4 = extremely important): Technology 3.05; Setting the Stage 3.8; Scenario 3.75; Prebrief 3.95; Debrief 3.95. |
| Milestones | Total # milestones pre-round: 144. Total # milestones post-round: 70. Milestones were initially categorized into levels: Beginner, Novice, Advanced and Expert. This categorization was removed because of the inability to differentiate specific observable tasks across levels. Milestones were streamlined to include only observable tasks during a simulation session. | Total # milestones pre-round: 66. Total # milestones post-round: 42. Revisions were made to the technology milestones to ensure more generalizability across programs and institutions. Most EPAs and milestones resonated as important with experts: 98/100 answered yes, these are key observable tasks of the discipline required for a simulation faculty to practice independently. | Total # milestones pre-round: 42. Total # milestones post-round: 31. Technology milestones were adapted by removing specific language for CPR feedback. The suggestion of adding milestones for procedural task trainer skills was excluded because it is not generalizable to all simulation faculty. All EPAs and milestones resonated as important with experts, who answered yes that these are key observable tasks of the discipline required for a simulation faculty to practice independently. |
| Taxonomy and track changes | Total # track changes: 93. Language was modified to ensure it is generalizable outside the Provincial Simulation Program (e.g. removed ‘brave space’ as a reference to psychological safety and ‘follow the leader’ as a co-debrief style). | Total # track changes: 105. Standard nomenclature was changed to align with the provincial simulation program’s policies and the Healthcare Simulation Dictionary. The PEARLS debrief model, though noted to have a narrow focus, remained because it is the model used throughout the current faculty development curriculum. The word ‘simulationists’ was replaced with ‘simulation faculty’. | Total # track changes: 30. A definition of ‘embedded participant’ was included for clarity and alignment with current language in the literature. A reference link to the eSIM Program’s Operational Expectations (guidance documents) and Standard Scenario template was included. |
Despite its importance, simulation faculty development concentrates primarily on foundational skills, such as debriefing, and neglects to describe the trajectory through which simulation faculty develop these skills from novice to independent practice. While there is emerging evidence on the development and application of EPAs for medical residents and health professional education [13–15,28], there is a paucity of literature on EPAs specifically for faculty development across a healthcare simulation career. Further, according to Gardner et al., no formal demonstration of competency is required for simulation centre leaders or expert simulation faculty [22]. It has been recognized that, to be optimally successful, simulation faculty not only need knowledge and skills related to the delivery of educational curricula, but must also be skilled in areas beyond debriefing (e.g. technology) [22].
The development of our standardized EPA-FAST for simulation faculty builds on the work of Iqbal et al. [13], who proposed an EPA framework to serve as a roadmap for ‘longitudinal training and entrustment of small group facilitators’ in which learning activities are mapped against predetermined competencies, as well as on programmatic development for simulation faculty [12]. Yet despite the emerging need, there have been minimal evaluation tools to formatively assess simulation faculty [39]. Two tools commonly cited in the literature are the DASH [24,25] and the FCR [26]. The DASH [24,25] focuses primarily on debriefing skills and not on formative assessment of simulation faculty across the continuum of their career. Similarly, the FCR focuses on assessing competency based on levels (i.e. Beginner, Novice, Competent, Proficient, Expert) [26] and does not include trustable observable skills that are required for independence or assess observable behaviours over time. The FCR was initially targeted at facilitators in academic undergraduate nursing simulation labs. Further, the FCR does not include measurable milestones or observable tasks of the discipline that can be formatively assessed over time, which is a current gap for simulation faculty providing SBE to staff within a healthcare environment [26]. Our proposed EPA-FAST validates the competencies and concepts described by Leighton et al. in the FCR [26], while enhancing the assessment of readiness to practice beyond a 5-point Likert scale; in the FCR, predictors of competency are based on whether the simulation was facilitated on a particular day or time of week and on the fidelity of the simulation [26]. In contrast to the FCR, the EPA-FAST standardizes simulation faculty competencies for all new faculty and the assessment of those competencies, thereby promoting independence.
Our findings from this curricular innovation project describe the use of a modified Delphi technique to develop standardized EPAs and milestones that a simulation faculty member is trusted to perform independently by the end of a faculty development mentorship program. An unintended outcome of the modified Delphi was the identification of a mismatch in experts’ expectations of the skills required to be an independent simulation faculty. The evaluation of the current state of independent faculty revealed knowledge gaps, specifically around scenario design and technology. This was likely due to the advancement of the simulation expert faculty mentor’s role, specifically leveraging the expertise of the provincial simulation program in providing technology support for simulation sessions. This was predominantly noted in physician expert responses during the modified Delphi rounds. Further, faculty experts in programs with access to existing pre-designed curriculum and scenarios gave lower ratings for scenario design competency, as scenario design was a skill they had not developed. However, it was the decision of the EPA Faculty group to retain the technology and scenario design EPAs and milestones in the EPA-FAST, as faculty require an understanding of all domains of simulation to be considered independent in their practice. This ensures that all simulation faculty have a basic literacy in SBE competencies and mitigates barriers such as hierarchy, supporting long-term sustainability and the scale and spread of the simulation program.
Following the development of our EPA-FAST, the next logical step was to determine how to operationalize this process as new faculty complete the required faculty development courses. As an initial step, a faculty development flow map was developed to illustrate the steps to be followed as new faculty move through mentorship towards independence.
Given the importance of tracking and documenting new faculty as they move through the continuum of faculty development through mentorship, an electronic fillable form was created for each faculty member completing Faculty Development (FD) courses. This internal program-level tool is used to screen potential applicants to determine the breadth and scope of their simulation plans, in order to determine a detailed strategy for the facilitation of simulation sessions.
As new faculty enter mentorship (i.e. once the required foundational online and in-person simulation faculty development courses are completed), an initial meeting with expert faculty mentors is set up to outline the steps and mentorship plan. During this consultation, a needs assessment is completed with new faculty, highlighting learning objectives for future sessions and how these will be attained (i.e. through identifying their perceived and unperceived needs). It is during this stage that the EPA-FAST fillable tracking tool will be started for each new faculty member. Upon observing simulation sessions, milestones within each of the five EPAs (e.g. technology, prebriefing) will be referenced and signed off according to the date the observation took place. Session dates will be tracked, as well as the dates on which the specific observable milestones were achieved or are still in progress. The number of mentorship sessions required to sign off on all the EPAs and milestones will vary based on the individual’s experience and comfort, but it is estimated this would be a minimum of 3–6 sessions for new faculty.
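As a rough illustration of the kind of record such a fillable tracking tool captures, the sketch below models EPAs, milestone sign-off dates and mentorship sessions for one faculty member. The class names, milestone wording and dates are hypothetical and do not represent the program's actual form or schema.

```python
# Illustrative sketch only: one way a program might represent EPA-FAST tracking.
# Field names, EPA titles, milestone text and dates are assumptions for demonstration.
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class Milestone:
    description: str
    achieved_on: Optional[date] = None   # sign-off date; None while still in progress

@dataclass
class EPA:
    title: str
    milestones: list[Milestone] = field(default_factory=list)

    def is_complete(self) -> bool:
        # An EPA is complete once every milestone has a sign-off date
        return all(m.achieved_on is not None for m in self.milestones)

@dataclass
class FacultyRecord:
    name: str
    mentorship_sessions: list[date] = field(default_factory=list)
    epas: list[EPA] = field(default_factory=list)

    def ready_for_independence(self) -> bool:
        # All milestones across all EPAs signed off
        return all(epa.is_complete() for epa in self.epas)

# Example: signing off a (hypothetical) prebriefing milestone after an observed session
record = FacultyRecord(
    name="New Faculty Member",
    epas=[EPA("Prebriefing", [Milestone("Establishes psychological safety")])],
)
record.mentorship_sessions.append(date(2023, 5, 1))
record.epas[0].milestones[0].achieved_on = date(2023, 5, 1)
print(record.ready_for_independence())  # True once every milestone has a sign-off date
```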
The internal tracking form also contains a section to document follow-up conversations 3–6 months post-mentorship and EPA sign-off. The overarching benefit of developing a comprehensive document that can be utilized for each new faculty member is the ability to reference conversations from screening for courses through to post-EPA completion. As is sometimes the case, several expert faculty mentors may be part of a team mentoring one faculty member to independence. The ability to revisit previous conversations and reflect on learning objectives provides a comprehensive, continuous and sustainable mentorship process that can be shared easily among several expert faculty mentors. Further, to ensure standardization of this process, the Provincial Simulation Program intends to transition to an online learning management system that will track, monitor and centralize the EPA-FAST and each individual’s mentorship plan.
Finally, one approach a simulation program might consider to support the sustainability of EPAs and milestones is to create online Communities of Practice (CoP) for new simulation faculty graduates. It has been recognized that mentorship, alongside proactive planning, will assist faculty in developing and demonstrating the necessary knowledge, skills and behaviours for high-quality simulation facilitation [22]. Access to a CoP network of simulation mentors and peers [22] promotes the sociology of a simulation mentorship environment. The goal of the CoP is to promote deliberate practice and reflection on debriefing strategies or other facilitation domains. The focus is on sharing common simulation facilitation challenges and successes related to skills and knowledge such as difficult debriefings, co-debriefing or using PEARLS effectively [8].
While our proposed EPA-FAST targeted new simulation faculty, there is a plethora of opportunities for future faculty development, including recommendations for the development of advanced EPAs and milestones for simulation faculty in the domains of co-debriefing, peer debriefing, virtually facilitated simulations, systems integration simulation, operations, advanced simulation technology and research.
This curricular innovation project is subject to some identified limitations. The results of the modified Delphi were generated through simulation expert responses and assumptions within a Canadian healthcare system; therefore, care should be taken in generalizing these findings to other settings and contexts. Further, the project used a cross-sectional design and a convenience, non-probability sample, which may have resulted in sampling and selection bias in participants and in feedback on the EPAs and milestones. Experts may also have had some degree of recall bias, recalling only very positive or very negative experiences, potentially impacting their scoring of the EPAs and milestones. Further research is needed to validate the EPA-FAST across different contexts and healthcare systems.
Harnessing EPAs and milestones for formative assessment of simulation faculty is an opportunity for significant advancement in standardizing faculty development and mentorship for simulation programs globally. Currently, there is wide variation in how simulation faculty develop these skills across their career from novice to independent practice. The objective of this curricular innovation project was to use a modified Delphi technique to develop EPAs and milestones that a simulation faculty member is trusted to perform independently by the end of a faculty development mentorship program. Five EPAs and 31 milestones were identified through three rounds of modified Delphi: Technology; Scenario Design; Simulation Facilitation; Prebriefing; and Debriefing. The EPA-FAST provides a structured framework of clear expectations for assessing and tracking the progress of simulation faculty, targeting areas for improvement and formative feedback to facilitate independent and safe practice. While mapping of EPAs and milestones has traditionally been used for residency training, this novel curricular development of the EPA-FAST for simulation faculty training provides opportunities for significant advancement in championing new opportunities for faculty development and mentorship for simulation programs locally, nationally and internationally.
Supplementary data are available at The International Journal of Healthcare Simulation online.
This project could not have been accomplished without the leadership support from eSIM Provincial Program, Alberta Health Service. The authors would like to acknowledge the following individuals for their contributions to the EPA-FAST: Faculty Assessment for Simulation Tool: Alejandra Boscan, Mirette Dube, AnnaMaria Mundell, Chris Dyte, Danaiet Teame, Gord McNeil, Helen Catena, Irina Charania, James Huffman, John Kortbeek, Jon Duff, Kristin Fraser, Megan Rolleman, Nicholle Oomen, Ryan Iwasiw, Ryan Wilkie, Sue Barnes, Ken Brisbin, Jonathan Jaekel and Stuart Rose.
All authors contributed to manuscript conception and design. Material, preparation, data collection and analysis were performed by AK, CS, NT, TF, JS, CE, VG. The first draft of the manuscript was written by AK and all the authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.
None declared.
None declared.
None declared.
None declared.