Simulation-based education (SBE) literature emphasizes debriefing frameworks, with little discussion on developing SBE competencies. Introduced in 2005 and subsequently adopted by the Royal College of Physicians and Surgeons of Canada, Entrustable Professional Activities (EPAs) offer a robust curriculum development and assessment process for workplace-based assessment. There is a paucity of literature on EPAs related to simulation and on how simulation faculty move from novice to independent practice. The objective of this curricular innovation project was to develop standardized EPAs and milestones to assess the independence of simulation faculty by the end of mentorship. Using a modified Delphi technique, the team identified expert faculty to rate the level of importance of each EPA and milestone. Five EPAs were identified:
What this essay adds
There is a paucity of literature on the mentorship of simulation faculty as they move from novice to independent practice.
Given the current gap in simulation faculty development and mentorship, there is a need for a standardized formative assessment approach that supports structured, observation-based assessment of all domains of simulation competence.
Building on the competency-based medical education approach, established by the Royal College of Physicians and Surgeons of Canada, the Provincial Simulation team developed five Entrustable Professional Activities (EPAs) and associated milestones, which offer a robust curriculum development and assessment process for simulation faculty development.
Using a modified Delphi technique, the team identified expert interprofessional faculty from rural and urban centres across Alberta to rate the level of importance of each EPA and milestone, facilitating development of a valid and reliable formative assessment tool.
The EPA-FAST is a highly replicable tool that provides a clear structured framework for the systematic formative assessment of faculty towards safe independent practice. It can be generalized to other simulation programs, and it provides a significant advancement to the field of simulation through standardizing mentorship and faculty development programs.
The simulation-based education (SBE) literature emphasizes debriefing frameworks and methods to maintain the quality of simulation facilitators, with little discussion detailing how faculty develop SBE competencies over time [
Currently, there are several approaches to faculty development for simulation facilitators. One common and effective approach is peer coaching [
It has been recognized that a structured, tiered approach to faculty development, mentorship and certification ensures quality instruction and includes observation, didactics, interactive experiential learning, practice, expert feedback and mentoring [
While there is emerging evidence on the development and application of EPAs and associated milestones for medical residents and health professional education [
Just as clinicians need EPAs to develop and demonstrate competence, so must simulation facilitators have entrustable skills, knowledge and attitudes;
An identified gap for the simulation community has been the lack of standardization of the core competencies required to reliably mentor faculty towards best practice, as well as of defining and monitoring essential competency progression over time. Current proposed frameworks for simulation facilitator competencies include topics on simulation curriculum, educational theory, assessment, debriefing, simulation research, simulation operations and administration [
There are also few valid and reliable evaluation tools to formatively assess simulation faculty, outside of the Debriefing Assessment for Simulation in Healthcare (DASH) [
In alignment with CBME, the Provincial Simulation program in Alberta, Canada, addressed this identified gap by developing a novel set of EPAs and milestones, specifically targeting formative assessment of competencies for SBE. While mapping of EPAs and milestones has traditionally been used for residency training [
The goal of this curricular innovation evaluation paper is to describe the use of a modified Delphi technique to develop standardized EPAs and milestones that a simulation faculty member is trusted to independently perform by the end of a faculty development mentorship program.
In 2017, the Provincial Simulation program completed a needs assessment of independent simulation faculty and champions across Alberta, to gain a better understanding of the current state of faculty development needs and to explore gaps in SBE mentorship design, tools, resources and the lack of standardization of expected competencies. Prior to this needs assessment, there had been no formal inventory of simulation faculty's continuing education needs, upskilling opportunities, education resources, mentorship, peer feedback, evaluation of outcomes and certification over the preceding 10 years.
The needs assessment results highlighted a mismatch in resources, delivery and formal assessment of simulation faculty. The Provincial Simulation program used various tools and approaches to faculty development and mentorship that, without provincial standardization, remained siloed across sites based on their geographic location. The findings prompted a review of current processes to align with the requirements of national simulation accreditation standards, which include the domains of governance, infrastructure, education and healthcare systems.
Simulation accreditation was recognized as an opportunity to standardize the simulation curriculum, as well as to integrate formative assessment and evaluation of education approaches into faculty development and mentorship. The process of applying for national accreditation allowed the program to take stock of how it was measuring its capacity, growth and training beyond the initial novice courses in simulation, and to initiate future planning for maintaining and upskilling its existing faculty.
To identify priority areas for future planning of the program, the provincial program implemented a systematic inquiry, applying a SWOT [
SWOT (Strengths, Weaknesses, Opportunities and Threats) analysis based on national simulation accreditation standards
| Accreditation Standard Requirement | Strength | Weakness | Opportunities | Threats |
|---|---|---|---|---|
|  | Assessment tools exist in simulation literature (i.e., DASH). | Although expert faculty mentors engage in formative peer assessment, no formal tracking of performance is in place. | A formalized working group is in place to develop standardized curriculum and tools. | Capacity of the team impacted due to competing demands of frontline simulation, projects and ongoing expansion of simulation throughout the province. |
|  | Current Workshop in Simulation Education (WISE 1). | No formalized or structured feedback approach in curriculum. | Opportunity to develop competency with new curriculum. | The Provincial Simulation team is unstandardized in its current approach to mentorship as well as the delivery of content of the existing WISE 1 course. |
|  | WISE 1 evaluation tool allowed for collation and dissemination of course-level feedback among simulation faculty. | Data from WISE 1 evaluation was not consistently collected, measurable, or observable to inform faculty development and mentorship. | Opportunity to develop program evaluation formative assessment tools with new faculty development curriculum and mentorship. | Cost to hire an Education Lead to manage faculty development and mentorship provincially, the capacity of current expert faculty/consultants to mentor new faculty, and resources to procure KMS with health authority fiscal restraints in place. |

DASH: Debriefing Assessment for Simulation in Healthcare
OSAD and DASH: Objective Structured Assessment of Debriefing and Debriefing Assessment for Simulation in Healthcare
WISE: Workshop in Simulation Education (foundational 2-day course for simulation faculty)
CBME focuses on the use of milestones and EPAs to provide structure for teaching, learning and assessment [
Building on the EPA approach from CBME [
As part of the curriculum mapping exercise, the Provincial EPA Faculty group took into consideration EPAs and milestones for faculty development that aligned with the competencies from the new Provincial Faculty Development Curriculum. Through curriculum mapping, they identified gaps, which led to modification of milestones and ensured alignment with Operational Expectations and Procedures, the Strategic Plan and National Simulation Accreditation standards. Further, existing tools and simulation curriculum standards in the literature were considered in the development of the EPAs and milestones, including an existing internal mentorship document, the Harvard DASH [
The modified Delphi method is a group consensus strategy that systematically uses literature review, opinion of stakeholders and the judgement of experts within a field to reach agreement [
An expert is defined as one who is knowledgeable about the subject of SBE and capable of representing the views of his or her peers [
The modified Delphi review was completed by the EPA Faculty group as well as 20 Simulation Experts with diverse experience in provincial simulation programs. In addition to providing tracked changes and feedback, experts rated the questions below:
Do this EPA and its associated milestones resonate with you as key competencies for a simulation faculty member?
Do you see ways to improve the strength of the language? If so, please re-write, add comments or make suggestions to combine with another EPA/milestone(s).
Using a 4-point scale (extremely important = 3, very important = 2, moderately important = 1, not important = 0), how important is this EPA and associated milestone(s) for a simulation faculty to practice independently?
In total, three Delphi rounds were completed, including a first-round review by the EPA Faculty group and second- and third-round reviews by simulation experts.
Driven by the accreditation standards and provincial governance model, there was an identified need to increase the diversity of the simulation experts' group by varying years of experience, professional backgrounds and geographic representation. This was important for membership of both the EPA Faculty group and the external Expert Simulation Faculty. The team was representative of multiprofessional rural, urban and academic experience.
The EPA Faculty group, which completed round 1 of the modified Delphi, included nine members, inclusive of a Medical Director (
The Expert Simulation Faculty for the Delphi review included stakeholders employed by the Provincial health authority, with interprofessional representations (
Expert faculty representation by professional experience domains.
The final 5 EPAs and 31 associated milestones identified after the completion of three rounds of modified Delphi were: (1)
The Expert Simulation Faculty for rounds 2 and 3 of the modified Delphi were also asked to rate, on a 4-point scale, how important each of the five EPAs and associated milestone(s) was for a simulation faculty to practice independently: extremely important (3), very important (2), moderately important (1), not important (0). The average ratings for rounds 2 and 3 are summarized in
Average EPA rating based on expert faculty response in Delphi rounds.
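For illustration only, the per-EPA averaging of expert ratings described above can be sketched as follows; the EPA names and rating values here are hypothetical placeholders, not the study's data:

```python
from statistics import mean

# Hypothetical expert ratings on the 4-point importance scale:
# extremely important (3), very important (2), moderately important (1),
# not important (0). EPA names and values are illustrative only.
round_ratings = {
    "Prebriefing": [3, 3, 2, 3, 2],
    "Technology": [2, 1, 2, 3, 2],
    "Scenario design": [2, 2, 1, 2, 3],
}

# Average rating per EPA for one Delphi round, rounded to two decimals.
averages = {epa: round(mean(r), 2) for epa, r in round_ratings.items()}
for epa, avg in averages.items():
    print(f"{epa}: {avg}")
```

Comparing such averages between rounds 2 and 3 shows whether expert consensus on an EPA's importance is stabilizing.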
The following sections describe the key findings after each of the rounds of modified Delphi.
In the first round of the modified Delphi, the milestones were separated into levels of skills: Beginner, Novice, Advanced and Expert. This categorization, however, was found to be complex and required a grounded understanding of what was deemed to be beginner vs. novice or expert.
In the second round of the modified Delphi, the
Several experts were confused by the phrasing, ‘safety competencies’ in the
The changes made in response to the second round of the modified Delphi included a title change of the
In round 3 of the modified Delphi, further changes were adopted in the
EPA and milestones evolution.
Changes to EPA milestones across three rounds of modified Delphi
| Round 1 | Round 2 | Round 3 |
|---|---|---|
Despite its importance, simulation faculty development concentrates primarily on foundational skills, such as debriefing, and neglects to describe the trajectory through which simulation faculty develop these skills from novice to independent practice. While there is emerging evidence on the development and application of EPAs for medical residents and health professional education [
The development of our standardized EPA-FAST for simulation faculty builds on the work by Iqbal et al. [
Our findings from this curricular innovation project describe the use of a modified Delphi technique to develop standardized EPAs and milestones that a simulation faculty member is trusted to independently perform by the end of a faculty development mentorship program. An unintended outcome of the modified Delphi was the identification of a mismatch in experts' expectations of the skills required to be an independent faculty member. The evaluation of the current state of independent faculty yielded knowledge gaps, specifically around scenario design and technology. This was likely due to the advancement of the simulation expert faculty mentor's role, specifically leveraging the expertise of the provincial simulation program in providing technology support for simulation sessions. This was predominantly noted in physician expert responses in the modified Delphi rounds. Further, faculty experts in programs with access to existing pre-designed curriculum and scenarios gave lower ratings for scenario design competency, as scenario design was a skill they had not developed. However, it was the decision of the EPA Faculty group to retain the EPAs and milestones on technology and scenario design in the EPA-FAST, as faculty do require an understanding of all domains of simulation to be considered independent in their practice. This ensures that all simulation faculty have a basic literacy in SBE competencies, mitigates barriers such as hierarchy, and supports long-term sustainability, ensuring the scale and spread of the simulation program.
Following the development of our EPA-FAST, the next logical step was to determine how to operationalize this process as new faculty complete the required faculty development courses. As an initial step, a faculty development flow map was developed to illustrate the steps to be followed as new faculty move through mentorship towards independence.
Given the importance of tracking and documenting new faculty as they move through the continuum of faculty development through mentorship, an electronic fillable form was created for each faculty member completing Faculty Development (FD) courses. The internal program-level tool is used to screen potential applicants to determine the breadth and scope of their simulation plans in order to determine a detailed strategy for the facilitation of simulation sessions.
As new faculty enter mentorship (i.e. once the required foundational online and in-person simulation faculty development courses are completed), an initial meeting with expert faculty mentors is set up to outline the steps and mentorship plan. During this consultation, a needs assessment is completed with new faculty, highlighting learning objectives for future sessions and how these will be attained (i.e. through identifying their perceived and unperceived needs). It is during this stage that the EPA-FAST fillable tracking tool will be started for each new faculty member. Upon observing simulation sessions, milestones within each of the five EPAs (e.g. technology, prebriefing, etc.) will be referenced and signed off according to the date the observation took place. Session dates will be tracked, as will the dates that specific observable milestones were achieved or remain in progress. The number of mentorship sessions required to sign off on all the EPAs and milestones will vary based on the individual's experience and comfort, but it is estimated this would be a minimum of 3–6 sessions for new faculty.
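As a purely illustrative sketch, a per-faculty tracking record like the one described above might be modelled as follows; the class and field names are assumptions for illustration, not the program's actual fillable-form schema:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class MilestoneRecord:
    """One observable milestone within an EPA (names are hypothetical)."""
    epa: str                            # e.g. "Prebriefing"
    milestone: str
    achieved_on: Optional[date] = None  # None while still in progress

@dataclass
class FacultyMentorshipRecord:
    """Tracking record for one new faculty member moving through mentorship."""
    faculty_name: str
    session_dates: list = field(default_factory=list)
    milestones: list = field(default_factory=list)

    def sign_off(self, epa: str, milestone: str, on: date) -> None:
        """Mark a milestone as observed and achieved on a given session date."""
        for m in self.milestones:
            if m.epa == epa and m.milestone == milestone:
                m.achieved_on = on
                return
        self.milestones.append(MilestoneRecord(epa, milestone, on))

    def in_progress(self) -> list:
        """Milestones referenced during sessions but not yet signed off."""
        return [m for m in self.milestones if m.achieved_on is None]
```

A mentor would add milestones as they are referenced during observed sessions and call `sign_off` once each is demonstrated, so the record always shows what remains before independence.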
The internal tracking form also contains a section to document follow-up conversations 3–6 months post-mentorship and EPA sign-off. The overarching benefit of developing a comprehensive document that can be utilized for each new faculty member is the ability to reference conversations from screening for courses through to post-EPA completion. As is sometimes the case, several expert faculty mentors may be part of a team mentoring one faculty member to independence. Having the ability to revisit previous conversations and reflect on learning objectives provides a comprehensive, continuous and sustainable mentorship process that can be shared easily among several expert faculty mentors. Further, to ensure standardization of this process, the Provincial Simulation Program intends to transition to an online learning management system that will track, monitor and centralize the location of the EPA-FAST and each individual's mentorship plan.
Finally, one approach a simulation program might consider in supporting the sustainability of EPAs and milestones is to create online Communities of Practice (CoP) for new simulation faculty graduates. It has been recognized that mentorship, alongside proactive planning, will assist faculty with developing and demonstrating the necessary knowledge, skills and behaviours for high-quality simulation facilitation [
While our proposed EPA-FAST targeted new simulation faculty, there are a plethora of opportunities for future faculty development and recommendations for the development of advanced EPAs and milestones for simulation faculty in the domains of Co-Debriefing, Peer Debriefing, Virtually Facilitated Simulations, System Integration Simulation, operations, as well as advanced simulation technology and research.
This curricular innovation project is subject to some identified limitations. The results of the modified Delphi were generated through simulation expert responses and assumptions within a Canadian healthcare system; therefore, care should be taken in generalizing these findings to other settings and contexts. Further, the curricular innovation project used a cross-sectional design and a convenience non-probability sample, which may have resulted in sampling and selection bias in participants and in feedback on the EPAs and milestones. Also, experts may have had some degree of recall bias, recalling only very positive or very negative experiences, potentially impacting their scoring of the EPAs and milestones. Further research is needed to validate the EPA-FAST across different contexts and healthcare systems.
Harnessing EPAs and milestones for the formative assessment of simulation faculty is an opportunity for significant advancement in standardizing faculty development and mentorship for simulation programs globally. Currently, there is wide variation in how simulation faculty develop these skills across their careers from novice to independent practice. The objective of this curricular innovation project was to use a modified Delphi technique to develop EPAs and milestones that a simulation faculty member is trusted to independently perform by the end of a faculty development mentorship program. Five EPAs and 31 milestones were identified through three rounds of modified Delphi:
Supplementary data are available at The International Journal of Healthcare Simulation online.
Supplementary PDF file supplied by authors.
This project could not have been accomplished without the leadership support from eSIM Provincial Program, Alberta Health Service. The authors would like to acknowledge the following individuals for their contributions to the EPA-FAST:
All authors contributed to manuscript conception and design. Material preparation, data collection and analysis were performed by AK, CS, NT, TF, JS, CE, VG. The first draft of the manuscript was written by AK, and all authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.
None declared.
None declared.
None declared.
None declared.