International Journal of Healthcare Simulation
Simplifying complexity science principles: developing healthcare faculty for using simulation as an educational method

DOI:10.54531/qwkd2435, Pages: 1-12
Article Type: Original Research
Abstract

Introduction

Professional development in simulation-based education (SBE) is a complex process. Its individual components have overlapping boundaries and relationships, making it suitable for analysis as a Complex Adaptive System (CAS). This complexity is multiplied in low-resource settings that are unfamiliar with simulation and is compounded by hierarchical culture and linguistic diversity. This study aimed to conduct a professional development course in SBE, guided by CAS principles, for faculty in a Pakistani medical college.

Methods

A simulation educator team (six on-site, three online) from Pakistan and North America constructed and facilitated a 6-hour-long hybrid course following CAS principles in three phases. The planning phase consisted of a needs assessment and the inclusion of stakeholders in decision-making. The implementation phase involved remote facilitators joining the in-person team via Zoom and demonstrating evidence-based practices in case design, pre-briefing, facilitation and debriefing. In the evaluation phase, participants completed an immediate Post-workshop Survey and a Follow-up Survey after 4 weeks.

Results

Twenty-three faculty members from basic medical (n = 9) and clinical sciences (n = 14) participated in the course and completed the evaluations. All five outcomes intended for this program, i.e. level of acceptance of simulation, simulation knowledge, self-efficacy, application of simulation in the learners’ settings and performance of the workshop facilitators, were met. The unexpected positive outcomes were the smooth delivery of the program from an administrative perspective and the learners’ enthusiastic response to simulation.

Discussion

We suggest using CAS as a framework for professional development programs to train faculty in simulation. A CAS framework can help the organizers to integrate systems thinking into educational interventions.


What does this study add?

    Complex Adaptive Systems (CAS) can be utilized in developing a simulation workshop among diverse stakeholders

    Utilizing a complexity science framework facilitates the planning, implementation and evaluation of simulation teaching workshops

    CAS enhances the adaptability of faculty development initiatives in changing circumstances commonly encountered in medical education

    Long-term support is needed for the medical faculty after introducing a new teaching technique

    This short intervention can be used as a road map for developing courses in low-resource set-ups and has the potential to be scaled up into a long-term intervention.

Introduction

Simulation-based education (SBE) is routinely used in high-income countries, where its need, prevalence and impact across the health professions are well documented. A growing body of literature identifies the need for, and importance of, training faculty and staff in using simulation techniques as an educational modality [1,2]. Despite regular use in high-resource settings, healthcare simulation is far less prevalent as an established instructional method in underdeveloped and resource-constrained countries and communities, owing to barriers such as scarcity of resources and differences in healthcare education cultures [3]. Professional development courses for faculty education in using simulation are even rarer in such settings [4]. Moreover, healthcare education is not linear but complex [5,6]. This manuscript captures the construction and facilitation of the first SBE course for a medical college faculty in Pakistan using complexity science principles.

What is complexity science?

Complexity science studies the dynamics, conditions and consequences of interactions within a complex system [7]. Establishing a faculty development course for using healthcare simulation as an instructional modality is complex, with the ‘whole’ being greater than the sum of the individual components. There are several interrelationships among the individual components, with fuzzy (indistinct) and overlapping boundaries [5,8]. Because learning is complex and adaptive, educational systems represent complex adaptive systems, an area governed by complexity science [6]. Complex Adaptive Systems (CAS), of which medical education is an example, are characterized by diversity, interaction and interdependency of elements, the nesting of systems within other systems and self-organization [6,9,10]. The impact of faculty education cannot be fully captured using a linear approach because of the several moving, interconnected pieces and nested systems involved [6]. This prompted us to use a CAS approach to develop, implement and evaluate a training course for healthcare educators [8]. The complexity science approach helped us consider how the program’s components, such as course design, participants, instructors and outcomes, interact with one another [11].

We found several applications of complexity science in the literature [12]. However, we did not find its application to introductory-level faculty training in the use of simulation in a low-resource setting. Medical education, complex as it is, became even more complicated in this case because a newer educational method, ‘imported’ from the West, was introduced by a hybrid team of simulation educators, part on-ground and part online, joining directly from the United States and Canada.

The rationale for using a complexity science framework

This paper uses the phrase ‘Program Cycle’ to refer to all project phases [13]. Complexity-Aware Monitoring (CAM) is one method of monitoring a CAS [13]. The following criteria made this program cycle a candidate for the CAM framework [13]:

    1) Uncertain cause-and-effect relationships among the stakeholders and various elements/factors, such as medical faculty, learning a new instructional methodology, the course educators, the hybrid learning environment, the content itself and the teaching method of delivering the content.

    2) Perspective of diverse stakeholders: institutional leadership of all parties involved, individual learners (faculty in this case) regarding their mindset and aptitude towards simulation, on-ground (local) simulation educators and online simulation educators.

    3) Contextual factors, such as professionals from multiple medical backgrounds using a new method ‘imported’ from the western world.

    4) New (unintended) opportunities or needs while preparing and conducting the course.

    5) Unpredictable pace of change in faculty’s behaviour regarding using simulation in their teaching. Not all faculty learners would internalize this acquired knowledge to the extent of having it integrated into their teaching practices.

Study aim

The study aimed to construct and deliver, using CAS theory, a professional development program for a Pakistani medical college faculty on using simulation according to the Healthcare Simulation Standards of Best Practice [1]. We used the CAM framework [13] to answer the following question: How can complexity theory/CAS be applied to the planning, implementation and evaluation of a healthcare simulation professional development program for medical educators with no prior knowledge and training?

Methods

This is a descriptive study of a healthcare faculty training outreach program underpinned by complexity science principles [8,13], using descriptive statistical and thematic content analyses [14,15].

Course description

This was a professional development course for healthcare faculty using SBE. It was conducted in a hybrid learning environment with an online and in-person simulation team in August 2022 in one Pakistani medical college. We structured and conducted a 6-hour professional development course using CAS [8,13] to introduce the faculty to healthcare simulation. Our team consisted of educators, facilitators and technicians with on-site (in-person in Pakistan, n = 6) and remote simulation facilitators (from North America, n = 3, via Zoom [16]).

To understand and accommodate the inherent complexity of healthcare education, with its known and unknown interrelationships of multiple factors and their impact on learning, we adopted the three-phase approach from Edwards et al. [8]. We divided the course into planning, implementation and evaluation phases (see Table 1) and applied the CAM framework to monitor the progress of each phase [13]. This helped us visualize the program’s impact and how it worked, consistent with the literature [11].

Table 1:
Evidence-guided actions during the program
Evidence-guided actions [8,13] | Examples from the program cycle
1. Planning Phase
1.1. Identifying stakeholders and their interests - Development of planning team, who identified stakeholders: Educators (local and international), learners (host institute faculty), host and organizing institutes with its leadership, support staff (technical and administrative), current and future undergraduate medical students at the host institute and local community
1.2. Communication among the planning team - Weekly meetings, e-mails, WhatsApp group communication, a visit to the site, and meetings with the on-ground organizing and host institutes
1.3. Anticipate the needs - Conducted needs assessment with host institute representative; structured the program within one 6-hour-long day;
- the desire and need for more simulation practice were expressed, which led to customization of the course down the road
1.4. Prioritize stakeholder interests, and establish expectations & timelines - Divided learners into small groups with similar professional backgrounds; constructed simulation scenarios applicable to their own settings.
Expectation: Delivering the program in a shared physical space and limited time; delivering the program according to the host institute’s time zone
1.5. Finding & managing interrelationships, perspectives and boundaries - Grouped learners together with similar interests, considering the unspoken distinction between clinical and non-clinical medical sciences; managed the time-zone difference by having a pre-recorded session for a panel discussion
1.6. Assess organizational climate and make implicit to explicit organizational assumptions and choices - The host institute desired to provide professional development courses; however, they were unsure about the approach and direction of these courses. We explored the options and provided them with our described professional development program.
- Groups were intentionally composed of both experienced and novice faculty learners, which created a positive learning environment
1.7. Conflict resolution methods - Employed conflict management strategies from the beginning:
- regular meetings, consensus building, need to have vs. nice to have, now vs. later
- Mitigated concerns arising from internal bias regarding simulation being foreign, expensive and difficult by respectful conversations and showing examples of low-cost and do-it-yourself (DIY) homemade task trainers
1.8. Synchronize monitoring with pace of change - Conducted regular team meetings; made list of tasks and their delegation to appropriate personnel; monitored the preparation progress at the organizing and host institute and managed the change accordingly
2. Implementation
2.1. Implementation team establishment with needed skill set - Formation of local and international educators’ teams based on their skills, including nursing faculty and technical counterparts
2.2. Check in with the stakeholders and manage their expectations - Held open dialogues with faculty learners during the session; managed the technical issues during the session; held open and frequent communications with host institute; highlighted host institute leadership during the program
2.3. Awareness of the alternate causes and contributing factors - Bias arising from the mindset that simulation is costly and not possible in low-resource settings was mitigated by using homemade low-cost task trainers; a few learners were exposed to simulation prior to coming to this session which made it easier for them to follow the content;
- We understood that a lack of engagement or participation could be due to technical issues
2.4. Managing interrelationships, perspectives and boundaries - Invited senior leadership from host institute to experience the workshop proceedings and final discussion session to make the environment conducive for future organizational support
2.5. Synchronize monitoring with the pace of change - Adjusting the pace and context of content delivery according to faculty learners’ level from group to group during round robins; modified the content with mutual agreement after the lunch and prayer break due to insufficient time
3. Evaluation
3.1. Checking with the learners and educators - Assessing the need of each group during the session and modifying content according to learners’ needs while staying in line with the core curriculum
3.2. Review of the progress - For faculty learners: Post-Workshop Survey, Follow-up Survey after 4 weeks
- Educator team: Meetings before, the day of, and after the program
3.3. Awareness of intended and unintended results - Asked questions regarding intended outcomes, open-ended questions for unintended outcomes in the surveys.
One unintended outcome was that some participants overcame their lack of resources by using homemade solutions.
3.4. Awareness of non-linear change - Some faculty learners were more receptive to adopting this new methodology than others
3.5. Managing interrelationships, perspectives and boundaries - Respected the boundaries set by the host institute regarding contacting the learners through the liaison, abiding by their media release and marketing policies, and the learners’ evaluations
3.6. Synchronize monitoring with the pace of change - Held conversation with host institute for further assessment of behavioural impact and further professional development

1. Planning Phase

Following the guidelines [8,13], we determined the stakeholders, their implicit and explicit interests, and the communication channels. We met regularly with stakeholders in the planning phase, resolved disagreements and planned the course. Communication methods included weekly standing meetings and frequent exchanges through e-mail and an instant messaging app, WhatsApp [17]. Consensus building through open consultation proved helpful and was part of regular team meetings. Limited monetary resources were overcome by recruiting volunteer simulation educators to teach, using local faculty to role-play as simulated participants and adjusting the schedules of healthcare faculty members so they could partake in the course.

We conducted a needs assessment through meetings with the host institute liaison. Based on the needs assessment, we recruited team members, constructed the curriculum and determined the delivery methods. While developing the curriculum, we sought interrelationships among several factors and their overlapping boundaries that could affect the content, delivery and learning. Key factors included cultural diversity in social, educational and healthcare norms in Pakistan and logistical issues such as time-zone differences and the lack of simulation equipment that is readily available in the western world. We revised the program as needed, considering the fuzzy (indistinct) boundaries [5] of the interacting factors. We constructed and disseminated a participant guide, including the workshop details, reference material and outline of the day (see Supplementary Appendix 1). Following guidelines, we conducted several online dry runs to troubleshoot the technology, connectivity and overall workflow [18]. A final in-person dry run was conducted on-site a day before the workshop to check the internet, audiovisual support, room organization and tech support availability.

2. Implementation Phase

We arranged a full team huddle half an hour before the session as part of continuous monitoring [13] and following best practices [1]. We became aware of another strong interrelated factor, the institutional hierarchy, which led us to include an opening statement from the host institute’s local leaders. During implementation, remote facilitators collaborated with the on-ground facilitators using Zoom [16] according to a pre-set agenda for the day. This accommodated physician educators from diverse medical fields and levels of experience in medical education who had no prior formal training in simulation (see Supplementary Appendix 1 for logistical details).

We planned the day as requested, with only an hour of didactic discussion followed by role-playing. Educational activities included in-person and online discussions of best practices in case design, pre-briefing, facilitation and debriefing, and demonstration of two simulations. We divided the participants into four groups and rotated them through the stations in a round-robin style. These stations were arranged in a spacious room where participants gained baseline knowledge alongside hands-on training in scenario development, pre-briefing, facilitation and debriefing.

3. Evaluation Phase

For ease of structuring and implementing this program, we have explained the program cycle linearly in three consecutive phases of planning, implementation and evaluation. In reality, this was a non-linear and cyclical process requiring continuous monitoring. As part of that monitoring, we assessed progress even during session delivery through brief intermittent facilitator conversations in a Zoom Breakout Room [16] and on WhatsApp [17], which made us aware of intended and unintended results.

For evaluation, participants completed course feedback immediately after the workshop and 4 weeks afterward. Following CAM principles, we also monitored the whole program across the planning and implementation phases [13] (see Supplementary Appendices 2 and 3 for the evaluation surveys).

Application of CAM

We applied the three key CAM principles to monitor the progress of the implementation process and the learning outcomes: 1) attending to performance monitoring’s three blind spots: anticipating a broader range of outcomes, expecting alternative causes and obtaining the results of individual variables; 2) synchronizing monitoring with the pace of change; and 3) considering interrelationships, perspectives and boundaries.

1) Attending to performance monitoring’s three blind spots: Following the cyclical nature of performance monitoring [13], we not only focused on achieving the intended outcomes but also monitored the three blind spots across the phases of the program cycle.

    Anticipating a Broader Range of Outcomes: We were cognizant of broader outcomes, whether intended, unintended, positive or negative. This led us to identify unintended but positive outcomes (e.g. knowledge about communicating in a digital environment) as well as unanticipated and even negative outcomes (e.g. running two cases consecutively, without a buffer debriefing or even a few minutes’ break in between, because of time constraints, leading to cognitive overload for the faculty). Another example was that the faculty learners equated simulation-based learning with problem-based learning (PBL). We, as course facilitators, spent time clarifying the concepts that differentiate simulation-based learning from problem-based learning. Anticipating such outcomes helped us mitigate them, and we addressed this during the end-of-session debrief.

    Expecting Alternative Causes: Acknowledging that other factors could contribute to outcomes was important, as it allowed better resource allocation and prevention of fixation errors. For example, a learner apparently missing the session might simply have had connectivity problems. Some learners understood the content better than others because they had previous exposure to some form of simulation.

    Obtaining the Results of Individual Variables: Individual factors or their interactions can lead to non-linear change. Keeping an eye out for individual variables (e.g. differing technical affinities and technology acceptance thresholds, personal interactions among the learners and vastly different professional fields) helped us watch for unclear, messy and inconsistent results. The online facilitators communicated constantly with the on-ground team for this purpose.

2) Synchronizing monitoring with the pace of change: We needed to synchronize the monitoring process with the change happening with, or due to, this course, because the change was neither uniform nor evenly distributed across time or geographical location.

Effective management at the planning and implementation stages depended on timely information and early detection of issues that might hinder learning down the road. For example, during the planning phase, we needed to identify technological problems beforehand, since a substantial portion of our instruction was delivered through live videoconferencing. During the implementation phase, synchronized monitoring was crucial. The best way to gauge the learners’ pace was to ask about previous experiences with simulation and to assess engagement. This was achieved by having co-facilitators in person in the sessions who could monitor engagement, check the learning management system’s statistics and distribute a questionnaire about previous experience. Another observation was that two of the three distant facilitators could ensure engagement and clarify the learners’ personal context: despite being separated in time and space, they shared the participants’ cultural and linguistic backgrounds and still felt connected to them.

3) Considering Interrelationships, Perspectives and Boundaries: Three central guiding systems of CAM are interrelationships, perspectives and boundaries, which work synergistically in a complex system [13] (see Figure 1). We monitored the program cycle by focusing on the following questions [13]:

Figure 1: Interrelationships and boundaries in the Complexity-Aware Monitoring system in a faculty development course to use healthcare simulation

What are the:

    Interrelationships among different perspectives,

    Variations in boundaries, and

    Priorities for us in the perspectives and interrelationships, and why?

As Figure 1 shows, understanding the interrelationships of the various stakeholders during the planning/structuring phase, and continuous monitoring during the implementation phase, were important. Although the factors appear to have clear boundaries in the figure, there are no clear boundaries in the real world, because the individual factors are interrelated; they are therefore represented with two-way arrows. Figure 1 depicts the cyclical nature of the process in a linear way. For example, beginning from the top of the figure, in designing the course grounded in the Healthcare Simulation Standards of Best Practice [1], we had to consider the country’s culture of society, healthcare delivery and healthcare education. This nested system of socio-cultural diversity included the acceptance of simulation as an instructional methodology. We struggled to convey the message that simulation could be used without jeopardizing precious limited resources and would improve healthcare education and patient care.

Data collection and analysis

We collected data in August 2022 and September 2022 via two online surveys using Google Forms [19]: one immediately after the workshop (the Post-workshop Survey) and one after 4 weeks (the Follow-up Survey). Following recommendations from a systems science perspective [11], we collected quantitative and qualitative data. In both surveys, we obtained learners’ agreement with statements on a Likert scale of 1 through 5, where 1 was ‘strongly disagree’ and 5 was ‘strongly agree’. Using basic descriptive statistics, AK, SM, JF and MB quantitatively analysed and interpreted the data generated from the agreement statements. We also included open-ended questions in both surveys to gather learners’ insights and used the descriptive thematic qualitative method [15,20] to analyse the answers. Following best practices [15,20], JF and MB repeatedly reviewed the qualitative data to immerse themselves in it, applied a systematic and consistent approach to analysis and performed inductive coding independently [15]. They met frequently to compare themes and resolve conflicts [15].
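
For illustration, the sketch below shows one way the Likert agreement tallies behind Table 2 could be computed. It is an assumed Python example, not the authors’ actual analysis script, and the response data in it are hypothetical.

```python
# Illustrative sketch only: tabulating 5-point Likert responses into the
# "n (%)" format used in Table 2. Hypothetical data, not the study dataset.
from collections import Counter

LIKERT_LABELS = {
    1: "Strongly disagree",
    2: "Disagree",
    3: "Neutral",
    4: "Agree",
    5: "Strongly agree",
}

def summarize_item(responses):
    """Return 'n (%)' for each Likert level of one survey item."""
    counts = Counter(responses)
    total = len(responses)
    return {
        label: f"{counts.get(level, 0)} ({100 * counts.get(level, 0) / total:.2f}%)"
        for level, label in LIKERT_LABELS.items()
    }

# Hypothetical responses for one Post-workshop item (n = 23):
# 15 'strongly agree' and 8 'agree' approximate the split reported for
# 'Understanding the purpose of workshop' in Table 2.
example_item = [5] * 15 + [4] * 8
print(summarize_item(example_item))
```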

Ethical consideration

The study was found exempt with the number HHN_IRB_2022_09_003 by the Institutional Review Board of the Indus Hospital and Health Network [21]. The research was conducted in established or commonly accepted educational settings that involved normal educational practices.

The data were collected on a Google form [19] and saved in a passcode-protected institutional account [21] to protect the confidentiality of the learners. Only MB, JF and AK had access to the data. AK de-identified and processed the data.

Results

Demographics

A total of 23 faculty participated in the simulation workshop from a single medical college in a large metropolitan city in Pakistan. Among them, 9 (39%) were from basic medical sciences, and 14 (61%) were from clinical medical sciences. Basic medical sciences included anatomy, physiology, pathology and forensic medicine. The clinical medical sciences included medicine, pediatrics, obstetrics, gynecology, otolaryngology, ophthalmology, neurology and psychiatry.

Quantitative analysis

The quantitative data were obtained through two surveys: the first immediately after completion of the workshop (the Post-Workshop Survey) and the second after 4 weeks (the Follow-up Survey). The response rate was 100% (n = 23) for the Post-Workshop Survey, while 70% (n = 16) completed the Follow-up Survey. The surveys were sent via e-mail to all participants. The data were password-protected; only MB and JF had access to them and de-identified them for processing and analysis.

Both the immediate Post-Workshop and Follow-up surveys focused on the five intended outcomes: level of acceptance, knowledge gain, self-efficacy, application of simulation knowledge and facilitator performance (see Table 2).

Table 2:
Comparison of the Post-Workshop Survey (n = 23) and the Follow-up Survey (n = 16)
Outcome | Item | Post-workshop: Neutral* n (%) | Agreed n (%) | Strongly agreed n (%) | Follow-up: Neutral* n (%) | Agreed n (%) | Strongly agreed n (%)
1. Level of acceptance | Understanding the purpose of workshop | 0 | 8 (34.78) | 15 (65.2) | 0 | 0 | 16 (100)
1. Level of acceptance | Willingness to incorporate simulation | 1 (4.34) | 16 (69.56) | 6 (26.08) | 0 | 15 (93.75) | 1 (6.25)
2. Knowledge gain | Increase in sim knowledge | 0 | 10 (43.47) | 13 (56.52) | 0 | 7 (43.75) | 9 (56.25)
2. Knowledge gain | Adequacy of content | 0 | 18 (78.26) | 5 (21.73) | 0 | 11 (68.75) | 5 (31.25)
3. Self-efficacy | Confidence with gained knowledge | 2 (8.6) | 10 (43.47) | 11 (47.82) | 0 | 5 (31.25) | 11 (68.75)
4. Application of simulation knowledge | Willingness to incorporate OR incorporated in their curricula | 0 | 15 (65.21) | 8 (34.7) | 0 | 9 (56.25) | 7 (43.75)
5. Facilitators’ knowledge | Delivery of content | 2 (8.6) | 18 (78.26) | 3 (13.04) | 0 | 3 (18.75) | 13 (81.25)
5. Facilitators’ knowledge | Preparedness of the facilitator | 2 (8.6) | 18 (78.26) | 3 (13.04) | 0 | 4 (25) | 12 (75)
5. Facilitators’ knowledge | Learners’ engagement/impact | 1 (4.34) | 12 (52.17) | 10 (43.47) | 0 | 3 (18.75) | 13 (81.25)

*No respondent selected the first two levels of the Likert scale (Strongly Disagreed and Disagreed); these columns are therefore omitted from the results presentation.

We examined the impact at Kirkpatrick levels 1 and 2 [22] in the first survey and at level 3 in the second. We considered assessing the faculty’s acceptance of simulation essential because this was their first exposure to simulation training (see Table 2). All faculty (100%) desired similar workshops: 37.5% hoped to focus on debriefing, 25% each on facilitation and on scenario development, and 12.5% on interprofessional education. Moreover, 75% of participants were willing to be contacted for a 30- to 45-minute interview to understand the simulation practices they aimed to incorporate into their teaching at their institute. Sixty-nine per cent of the participants reported that the scenarios presented during the workshop helped them gain knowledge because the scenarios were relevant to their workplace settings. Seventy per cent of participants strongly agreed, and 30% agreed, that they had an opportunity to participate actively in the debriefing practice, which helped their confidence. Additionally, 44% of participants had applied their simulation knowledge by incorporating simulation into their curricula at the time of the Follow-up Survey. The remaining participants, 9 (56%), were either in the planning phase or had not had a chance to implement it by the 4-week mark of the Follow-up Survey. Even after 4 weeks, when asked in the Follow-up Survey, the faculty learners agreed or strongly agreed that the facilitators were knowledgeable.

Qualitative analysis

We added open-ended questions to explore unintended outcomes regarding the program and participants’ learning, the application of simulation-based principles into their settings, the barriers and facilitators for implementing SBE in their settings and suggestions for future improvements (see Table 3).

Table 3:
Themes generated from open-ended questions during the Follow-up Workshop Survey
Qualitative questions Themes generated
What was the most positive aspect of the simulation experience? 1. Applicability to their settings: ‘Reproducible curriculum’, ‘We got the idea how simulation can be incorporated’ and ‘new experience which motivated me’
2. Facilitation of the workshop: ‘Very attentive and loyal’, ‘enthusiastic’ and ‘facilitators guiding at each station’
Did you apply the concepts of simulation-based education in your settings, please describe? 1. Application of Simulation: ‘In clinical skill lab’, ‘In procedural skill and clinical laboratory procedure skill’ and ‘During the OSPE’
What are the obstacles to the implementation of simulation-based education in your institute? 1. Lack of awareness for undergraduate basic sciences education: ‘Not much clear how we can apply in basic medical sciences’ and ‘Overburdened curriculum of MBBS’
2. Lack of infrastructure: ‘Lack of technical and functional experts’, ‘lack of simulation culture’ and ‘lack of space’
What are the facilitatory factors in the implementation of simulation-based education in your institute? 1. Simulation skill lab and trained faculty: ‘Proper skills lab and trained faculty’ and ‘Availability of skills lab & experienced faculty’
2. Leadership support: ‘Management is very supportive’ and ‘Our institution head (principal) is the most helpful facilitatory factor’
What are your suggestions for future improvement of the simulation workshop? 1. Longer duration of time: ‘Should be [a] two days session at least’ and ‘It should be a 2-day workshop with more simulation scenarios and better settings’
2. Institutional Support: ‘Institutional leadership support for simulation-based activities could be improved’ and ‘It should be repeated on a regular basis’
3. Well-organized course: ‘It was excellent the way it was conducted’ and ‘Course was well-organized’
4. Gaps identified: ‘... it should be in a better setting’, ‘we were unable to identify who is a student and who is playing the role of simulated patient and had difficulty in recognizing the simulated participants from each other’ and ‘Try to make the simulation more realistic’
5. Frequent training needed: ‘It should be repeated on a regular basis’ and ‘Frequent training workshops of simulation-based education’

    1. Positive Experience: The participants thought the workshop was relatable and fitted well into their settings. One participant verbalized, ‘it is a reproducible curriculum for all trainees and gives instant performance feedback’, while another faculty member said, ‘we got the idea how simulation can be incorporated in our curriculum for students’ learning’; another comment was, ‘it was a new experience which motivated me to increase my knowledge, skills, and attitudes toward Simulation methodology’.

    The participants shared that the facilitators were motivating and created a positive learning environment. Some of the comments were, ‘very attentive and loyal participation of facilitators’, ‘each facilitator was enthusiastic’, ‘it’s safe for learning any kind of skill’ and ‘getting to experience the simulation activity’.

    2. Application of Simulation Concepts: Less than half of the participants had been able to apply simulation in their settings for skills training, as exhibited by these statements: ‘procedural skill and clinical laboratory procedure’ and ‘this skill can be used in pathology to assess student knowledge’. Interestingly, one participant used simulation for assessment: ‘during the OSPE; we have non-teaching staff members who are doing the simulated role’.

    As verbalized by the participants, some had intentions and were planning to implement it: ‘not yet, but we plan to apply SBE for our postgraduates’ and ‘I am planning to incorporate it in the nephrology curriculum for graduate and postgraduate as well’.

    3. Obstacles in Implementation: The participants verbalized that SBE was not perceived as a pedagogy for undergraduate basic sciences education and was seen as applicable only to the clinical sciences or postgraduate medical education. Some of the thoughts participants shared were: ‘Not much clear how we can apply in basic science’ and ‘Overburdened curriculum of MBBS’.

    Many participants expressed that they lacked the resources, time and faculty to implement SBE in their settings. Comments alluding to this included, ‘lack of technical and functional experts, lack of smart team, and lack of simulation culture’, ‘lack of faculty, lack of space within the department, logistics’ and ‘... lack of resources [as] this is a new college, so we have a long way to go’.

    4. Facilitatory Factors in Implementation: Many participants shared that having resources enabled them to conduct simulation-based activities in their settings. Some of the feelings shared were: ‘proper skills lab and trained faculty’ and ‘availability of skills lab & experienced faculty’.

    Many participants believed leadership and management buy-in was key to promoting SBE: ‘Management is very supportive in arranging resources’, ‘Our institution head, the principal, is the most helpful facilitatory factor, always encouraging the faculty in doing novel skills and strategies for the better education system’ and ‘keenness and financial support’.

    5. Future Directions for Improvement: Participants expressed their satisfaction with the workshop and suggested increasing the duration of similar sessions. They also expressed the desire for more support from their institution for this initiative. They identified a few gaps, such as a ‘better setting’ in terms of the lack of a dedicated physical simulation space, issues with ‘lack of realism’ associated with the resource-restricted environment, the ‘performance of the simulated participants’ and ‘difficulty in recognizing the simulated participants’ from each other. They also expressed the desire for ‘more practice’ in similar workshops.

Discussion

Using CAS as the underlying framework for our faculty development program enabled us to identify challenges early and mitigate them promptly. Although medical education is cyclical in nature and contains various fractals or subsystems, for the ease of the readership we discuss several aspects of its planning, implementation and evaluation phases as an exemplar, in a linear fashion.

Planning phase

Our program cycle highlighted several crucial factors to be considered during the planning phase in order to achieve the intended outcomes, such as identifying stakeholders, managing elements and their interrelationships, establishing open communication and synchronizing the monitoring process with the pace of change. Recognizing elements and their interrelationships is essential, in addition to identifying stakeholders and managing their competing and non-competing interests (see Table 1) [8,11]. Considering these factors in planning provides a roadmap to a comprehensive program evaluation not limited to conventional end-of-program assessment [11,23]. Additionally, understanding the interrelationships of stakeholders and elements provides the context of the environment in which the learner operates in the real world [23]. This can affect the learning process in both positive and negative ways and can even impede behavioural change [5,23].

Regular open communication is vital for establishing trust, ensuring smoother program delivery [8] and mitigating cultural differences [24]. Open and respectful communication allows stakeholders to learn about each other’s cultures [24]. A learning environment built on cultural sensitivity also expresses instructors’ awareness of, and respect for, learners’ culture and environment [24]. Moreover, understanding that change does not happen uniformly in all phases is crucial for a faculty development program, prompting the organizers to plan multiple internal checkpoints through regular and open communication. These internal checkpoints allowed us to notice change promptly and react swiftly when needed [8].

Implementation phase

The significant factors that affected the implementation of our program cycle included structuring the implementation team and being cognizant of the team’s confounding factors, interrelationships and perspectives. Assembling an implementation team with the needed skill set was our first step, following the 12 tips [8]; the team comprised several simulation educators from abroad, simulation champions from Pakistan, the liaison faculty and the technical support team from the host institute. Before selecting team members, several brainstorming sessions, careful planning and a full-dress rehearsal are necessary to anticipate the needs arising during implementation [8,25]. Acknowledging that unidentified or unpredicted factors are at play during implementation will assist in effective decision-making [13].

For us, managing stakeholders’ interrelationships and expectations during the implementation phase of the program cycle helped avoid unanticipated stress and undue duress [24]. It promoted learning interactions between the faculty learners and the facilitators [13]. We leveraged the positive impact of leadership inclusion by inviting the host college leadership to our session, to promote learning and to signal hierarchical buy-in and culture change towards professional development [26]. We managed inter-faculty relationships by acknowledging the institutional culture, with its unspoken tension between clinical and non-clinical faculty, a common occurrence in the medical college setting of Pakistan. We therefore constructed deliberate, non-random learner subgroups to support learning through psychological safety in their immediate learning environment, as supported by evidence [24,27]. We grouped together the undergraduate basic medical sciences (anatomy, physiology, pathology, biochemistry and forensic medicine) and clinical medical sciences (medicine, surgery, obstetrics, and such) faculty learners and directed conversations towards the practical application of simulation-based practices in their respective curricula.

We found that knowing the learners’ previous assumptions and internal biases was essential for a successful session: in the opening dialogue we discovered our faculty learners’ misconception that simulation was limited to life support training and acute care settings. We therefore modified our teaching approach and demonstrated multiple applications of simulation using the same trauma case for the basic and clinical medical sciences, highlighting how one scenario could be used to teach the physiology of hemorrhagic shock, the anatomy of large bone fractures and vascular injury, acute care and surgical practice. This also helped to soften the distinction between the basic and clinical medical sciences and showed the faculty learners that they could work together on the same clinical scenario by recontextualizing the case according to the needs of their own learners. Knowing that the same scenario could be reused for multiple training sessions was also reassuring for the faculty, who had expressed their inability to write multiple scenarios because of limited training and resources.

Evaluation phase

In this non-linear, nested program, continuous monitoring during the process, evaluation of the intended and unintended outcomes and the impact of professional development as a factor of change were critically important. Checking in with the stakeholders is part of continued monitoring to keep abreast of the ever-changing variables [8,11]. Continuous feedback loops within a fractal, a nested subsystem, cause that subsystem to change with each feedback cycle, increasing its adaptability while keeping the overall pattern the same [8]. Continuous feedback allows for swift adaptability, vital for efficient program implementation and resource allocation, leading to better outcomes [8]. Although evaluating a professional development program is complicated because of the several reciprocal relationships between program components and outcomes [13,25], deploying various assessment methods and evaluating the program longitudinally help capture a relatively accurate picture [11,13].

The Post-workshop Survey and Follow-up Survey allowed us to gauge the faculty’s progress at two different points in time, which gave us a glimpse of the factors of change through our program. Professional development as a change factor can be viewed at two levels [11]: the individual level and the systems level. One of our intended outcomes was the application of simulation in the faculty learners’ respective professional fields after obtaining the knowledge, which was our indicator of change at both the individual and systems levels. The change factor affects a medical teacher’s personal and professional development through the internalization of tacit knowledge obtained during such a professional development session [11]. Socialization [28] happens after a professional development session when faculty learners start to apply their knowledge in their settings, thus affecting the teaching culture by starting new professional relationships, in the light of new knowledge, with their peers and workplaces [11]. During the socialization phase, the faculty members might encounter unique challenges requiring ongoing support. This coincides with the concept of a differing pace of change within a program cycle [8,13]. These barriers are more pronounced in lower-middle-income countries (LMICs), owing to resistance to new educational methods, limited resources and a lack of institutional support and growth mindset [29]. The participants showed high motivation to use simulation in their respective curricula, which may partly be explained by the so-called ‘honeymoon effect’. For these reasons, longitudinal follow-up is needed to further explore behaviour change and sustainability.

Strengths: One strength of the study design was the use of strategies to lower simulation costs, such as asking educators to volunteer their time and expertise and using faculty as simulated patients. Including educators from diverse professional settings and areas of expertise, and leveraging distance learning in the hybrid environment, was another strength of the program. A further strength was bringing faculty learners from diverse backgrounds into the same session to learn together [30].

Limitations: Our study design had a few limitations, including a low response rate for the Follow-up Survey, as providing healthcare services took precedence during severe flooding in Pakistan. For the same reason, we could not follow up with the learners after a few months to assess the impact on their behaviour. Reliance on volunteered time is a risk to the long-term sustainability of a program: many educators gave their free or extra time, which we were able to harness for this project, but this can have unintended consequences for a program’s longevity. Another limitation was the learners’ fixed mindset that they could only do SBE with high-technology equipment and related resources. We tried to mitigate this thought process by exhibiting several ways to use low-cost simulation solutions.

Conclusion

Using principles of complexity science enabled us to understand and cater to the unique challenges of low-resource settings and provided an exemplar for educators in similar settings. Constructing and delivering a professional development program by systematically following the CAS principle allowed us to anticipate and mitigate the challenges and leverage the opportunities.

Impact and recommendation

We found positive intended and unintended impacts and no negative impact. We therefore recommend using complexity science principles for simulation-based educational design to train medical personnel. These preliminary findings need further exploration. We recommend applying the CAS principles repeatedly, and in multisite simulation training programs, to validate the findings and to explore any negative impact. This short intervention can be used as a road map for developing courses in low-resource set-ups and has the potential to be scaled up into a long-term intervention.

Supplementary material

Supplementary data are available at The International Journal of Healthcare Simulation online.

Declarations

Acknowledgements

The authors would like to thank Dr. Sabahat Fatima and Dr. Urooj Adnan, the emergency medicine residents at Indus Hospital and Health Network, and Natan Tahir and Rachel Johnson, the nursing faculty at Indus Hospital and Healthcare Network, for their unrelenting support and enthusiasm during the delivery of this workshop.

Authors’ contributions

MB, SA, AK, LJR and JF participated in this paper’s conceptualization, planning and design. All authors contributed to data collection. AK and SM performed the quantitative analysis; MB and JF conducted the qualitative analysis. All authors contributed to the writing of the manuscript, followed the instructions for authors, and have read and approved the manuscript.

Funding

No funding was available for this paper.

Availability of data and materials

Supplementary appendices are attached and referenced appropriately in the body of the paper.

Ethics approval and consent to participate

The study was found exempt with the number HHN_IRB_2022_09_003 by the Institutional Review Board of the Indus Hospital & Health Network (https://indushospital.org.pk/). The research was conducted in established or commonly accepted educational settings that involved normal educational practices.

Competing interests

The authors declare no conflict of interest.

References

1. Watts PI, Rossler K, Bowler F, Miller C, Charnetski M, Decker S, et al. Onward and upward: introducing the healthcare simulation standards of best Practice™. Clinical Simulation in Nursing. 2021;58:14.

2. Hallmark B, Brown M, Peterson DT, Fey M, Decker S, Wells-Beede E, et al. Healthcare simulation standards of best Practice™ professional development. Clinical Simulation in Nursing. 2021;58:58.

3. Puri L, Das J, Pai M, Agrawal P, Fitzgerald JE, Kelley E, et al. Enhancing quality of medical care in low-income and middle-income countries through simulation-based initiatives: recommendations of the Simnovate Global Health Domain Group. BMJ Simulation & Technology Enhanced Learning. 2017;3:S15.

4. Seethamraju RR, Stone KP, Shepherd M. Evolution of a simulation faculty development program in a low-resource setting. Simulation in Healthcare. 2022;17(1):e122-e127.

5. Frye AW, Hemmer PA. Program evaluation models and related theories: AMEE guide no. 67. Medical Teacher. 2012;34(5):e288-e299.

6. Mennin SP. Health professions education: complexity, teaching, and learning. In: Sturmberg JP, Martin CM, editors. Handbook of systems and complexity in health. New York, NY: Springer. 2013. p. 755-766.

7. Mennin S. Complexity and health professions education: a basic glossary. Journal of Evaluation in Clinical Practice. 2010;16(4):838-840.

8. Edwards RA, Venugopal S, Navedo D, Ramani S. Addressing needs of diverse stakeholders: twelve tips for leaders of health professions education programs. Medical Teacher. 2019;41(1):17-23.

9. Jorm C, Roberts C. Using complexity theory to guide medical school evaluations. Academic Medicine. 2018;93(3):399-405.

10. Schoo A, Kumar K. The clinical educator and complexity: a review. The Clinical Teacher. 2018;15(4):287-293.

11. Fernandez N, Audétat MC. Faculty development program evaluation: a need to embrace complexity. Advances in Medical Education and Practice. 2019;10:191.

12. Cristancho S, Field E, Lingard L. What is the state of complexity science in medical education research? Medical Education. 2019;53(1):95-104.

13. U.S. Agency for International Development (USAID). Program cycle discussion note: complexity-aware monitoring version 3. Bureau for Policy, Planning and Learning. 2021. Available from: https://usaidlearninglab.org/sites/default/files/resource/files/dn_-_complexity-aware_monitoring_final2021_1.pdf [Accessed June 2022].

14. Sandelowski M. Whatever happened to qualitative description? Research in Nursing & Health. 2000;23(4):334-340.

15. Doyle L, McCabe C, Keogh B, Brady A, McCann M. An overview of the qualitative descriptive design within nursing research. Journal of Research in Nursing. 2020;25(5):443-455.

16. Zoom.us. Version 5.11.1 (8356). 2022. Available from: https://zoom.us/ [Accessed 4 August 2022].

17. Whatsapp.com. V.2.22.24.81. 2022. Available from: https://www.whatsapp.com/.

18. Watts PI, McDermott DS, Alinier G, Charnetski M, Ludlow J, Horsley E, et al. Healthcare simulation standards of best Practice™ simulation design. Clinical Simulation in Nursing. 2021;58:14-21.

19. Google Forms. Get insights quickly, with Google Forms. Available from: https://www.google.com/forms/about/ [Accessed 15 July 2022].

20. Elo S, Kyngäs H. The qualitative content analysis process. Journal of Advanced Nursing. 2008;62(1):107-115.

21. Indus Hospital & Health Network. 2022. Available from: https://indushospital.org.pk/ [Accessed 22 September 2022].

22. Kirkpatrick JD, Kirkpatrick WK. Kirkpatrick’s four levels of training evaluation. Association for Talent Development. 2016. Available from: https://www.google.com/books/edition/Kirkpatrick_s_Four_Levels_of_Training_Ev/mo--DAAAQBAJ?hl=en&gbpv=0

23. Haji F, Morin MP, Parker K. Rethinking programme evaluation in health professions education: beyond ‘did it work?’ Medical Education. 2013;47(4):342-351.

24. Mortaz Hejri S, Vyas R, Burdick WP, Steinert Y. Understanding and embracing culture in international faculty development. Perspectives on Medical Education. 2023;12(1):111.

25. Boustani M, Alder CA, Solid CA. Agile implementation: a blueprint for implementing evidence-based healthcare solutions. Journal of the American Geriatrics Society. 2018;66(7):1372-1376.

26. Nembhard IM, Edmondson AC. Making it safe: the effects of leader inclusiveness and professional status on psychological safety and improvement efforts in health care teams. Journal of Organizational Behavior: The International Journal of Industrial, Occupational and Organizational Psychology and Behavior. 2006;27(7):941-966.

27. Rudolph JW, Raemer DB, Simon R. Establishing a safe container for learning in simulation: the role of the presimulation briefing. Simulation in Healthcare. 2014;9(6):339-349.

28. Cruess RL, Cruess SR, Boudreau JD, Snell L, Steinert Y. Reframing medical education to support professional identity formation. Academic Medicine. 2014;89(11):1446-1451.

29. Anwar MI, Humayun A. Faculty development – looking through different lenses. Pakistan Armed Forces Medical Journal. 2015;65(1):110-117.

30. Ong CC, Foo YY, Chiu FY, Nestel D. ‘It’s going to change the way we train’: qualitative evaluation of a transformative faculty development workshop. Perspectives on Medical Education. 2021;25:17.
Supplementary materials
  • Supplementary-material_S1.pdf
  • Supplementary-material_S2.pdf
  • Supplementary-material_S3.pdf