Simulation-based medical education (SBME) is an evolving method of teaching cardiac examination skills to healthcare learners. There has been debate about how effective this teaching modality is and whether high-fidelity methods are more effective than low-fidelity methods. This systematic review aimed to assess the effectiveness of high-fidelity SBME in teaching cardiac auscultation compared with no intervention or another active teaching intervention (low-fidelity SBME), using evidence from randomized controlled trials (RCTs).
Literature searches were performed on MEDLINE, EMBASE, PsycINFO and CINAHL. RCTs that compared the effectiveness of high-fidelity simulation against no intervention, or high-fidelity simulation against low-fidelity simulation, in teaching cardiac auscultation to healthcare learners were included. Outcomes were knowledge, skills and satisfaction relating to cardiac auscultation education. Data were analyzed using Review Manager 5.3 software.
Seventeen RCTs (n = 1055) were included. Twelve RCTs (n = 692) compared high-fidelity simulation with no intervention. The pooled effect sizes for knowledge and skills were 1.39 (95% confidence interval [CI], 0.39–2.38; p = 0.006; I² = 92%) and −0.28 (95% CI, −1.49 to 0.93; p = 0.65; I² = 94%), respectively. Five RCTs (n = 363) compared high-fidelity simulation with low-fidelity simulation. The pooled effect sizes for knowledge and skills were −0.73 (95% CI, −1.99 to 0.53; p = 0.26; I² = 86%) and 0.32 (95% CI, −0.75 to 1.39; p = 0.56; I² = 89%), respectively.
This review’s findings suggest that high-fidelity SBME is an effective teaching method for cardiac auscultation education. Interestingly, there was no significant difference in knowledge or skills among learners when comparing high-fidelity simulation with low-fidelity simulation. Further research is needed to establish the effectiveness of different forms of SBME as educational interventions.
What is already known on this topic
• Numerous studies have suggested that simulation-based medical education (SBME) is significantly more effective than bedside or lecture-based teaching in improving healthcare learners’ knowledge, skills and satisfaction relating to cardiac auscultation teaching.
• With the impact Covid-19 has had on experiential learning, the opportunity for SBME to be used in students’ learning is greater than ever.
• However, there has been no systematic review of the existing randomized controlled trials to provide the highest level of evidence in this field.
• Studies have also disagreed over whether high-fidelity SBME is more effective than low-fidelity SBME.
What this study adds
• Our results provide the highest level of evidence in this area, suggesting that high-fidelity SBME may not be significantly more effective than other active low-fidelity SBME teaching interventions, despite being more expensive.
Cardiac auscultation is a key clinical skill for the diagnosis of patients with various heart diseases and is both reliable and cost-efficient [1,2]. Conversely, poor cardiac auscultation may lead to the dismissal of important pathology (false negatives) or overdiagnosis (false positives), with downstream consequences of unnecessary referrals and investigations [3]. Although cardiac auscultation is clearly important, several studies have reported poor cardiac auscultation competence among healthcare learners [4,5]. The Covid-19 pandemic has compounded this, with fewer clinical learning experiences available to students. Clinicians and departments are under unprecedented pressure, and time to teach students has been compromised, with timetabled teaching periodically suspended.
Simulation-based medical education (SBME) was introduced in medical schools to improve learners’ competence and clinical experiences [6–9]. Simulation refers to the artificial representation of a real-world process, allowing the trainer to control the learning environment to achieve educational goals [10]. Several studies show that the use of SBME in health professional education has a positive impact on learners’ satisfaction and self-confidence [11–15]. SBME can also provide planned exposure to scenarios and auscultatory abnormalities of various cardiac pathologies that are difficult for students to encounter on clinical placements [16].
There has been progression from low-fidelity SBME [17], such as recorded audio files or multimedia CD-ROMs, to high-fidelity SBME [18] with computerized manikins. These are more realistic and give learners a controlled, safe learning environment in which to make and correct mistakes without affecting patient safety [19]. High-fidelity (authentic) learning theory also supports the rationale for using SBME, as authentic experiences increase learners’ motivation to engage with a variety of clinical experiences [20].
Two systematic reviews [21,22] favour high-fidelity SBME in medical education compared with traditional teaching or no intervention. However, these reviews drew on relatively low-level evidence, being limited to cohort and single-group studies. Thus, the effectiveness of SBME in cardiac auscultation training remains controversial. Several studies have also shown no significant differences in the effectiveness of high-fidelity SBME versus low-fidelity SBME [23,24].
This systematic review aims to address the gap in the literature regarding the effectiveness of SBME in cardiac auscultation training for health professional education within randomized controlled trials.
This systematic review and meta-analysis were performed as per the Preferred Reporting Items for Systematic Reviews and Meta-analysis (PRISMA) statement guidance [25].
The objective was to synthesize the highest level of available evidence regarding the effectiveness of SBME in cardiac auscultation within healthcare education.
The research questions were: Does the use of high-fidelity SBME for training healthcare professionals improve learners’ knowledge, skill performance, attitudes and satisfaction in cardiac auscultation training? And, if so, how does this compare to other active low-fidelity SBME teaching interventions?
Inclusion criteria were randomized controlled trials (RCTs) investigating the use of high-fidelity SBME modalities, defined as humanoid simulators able to generate a heart sound, to teach cardiac auscultation to health professionals at any stage of their training. The comparator was either a low-fidelity modality, defined as any form of simulation other than such humanoid simulators (for example, a set of headphones playing recorded audio files or multimedia CD-ROMs), or no intervention, defined as a form of teaching not utilizing SBME. Learning outcomes included learners’ knowledge acquisition (written test scores, identification of murmurs), skills acquisition (examination skills, Objective Structured Clinical Examination [OSCE] performance) and satisfaction [26] (subjective self-reported attitude towards the teaching technique).
Table 1 describes the inclusion and exclusion criteria using a Participants, Intervention, Comparisons and Outcomes format.
| | Inclusion Criteria | Exclusion Criteria |
|---|---|---|
| Study design | Randomized controlled trials | All other study types |
| Participants | All healthcare students and professionals in the training of cardiac auscultation; all healthcare students learning cardiac auscultation | Healthcare students or professionals in training and learning about other cardiology topics such as cardiopulmonary resuscitation or interventional cardiology |
| Intervention | A high-fidelity cardiopulmonary simulator able to generate a heart sound (e.g. Harvey®, SAM® and Nasco®) | Other forms of teaching interventions |
| Comparisons | No intervention (e.g. continued usual teaching, no change to curriculum); low-fidelity SBME comparison (e.g. computer-generated sounds only) | |
| Outcomes | Knowledge acquisition (written test scores, identification of murmurs); skills acquisition (examination skills, OSCE performance); satisfaction (subjective self-reported attitude towards teaching technique) | |
OSCE = Objective Structured Clinical Examination
MEDLINE, EMBASE, CINAHL, Scopus, Web of Science and PsycINFO were searched independently by two authors (CO and AM) with no start date restriction. The last date of the search was 25 February 2022. Terms used for learners included ‘education medical’, ‘education nursing’, ‘physician associates’, ‘medical students’ and ‘nursing students’. Terms used for intervention included ‘simulator’, ‘manikins’, ‘Harvey’ and ‘simulation’. Topics included ‘cardiac auscultation’, ‘heart sounds’, ‘heart murmurs’ and ‘cardiac examination’. Terms for outcomes included ‘skills’, ‘satisfaction’ and ‘knowledge’. These were decided with advice from an experienced research librarian.
No language restriction was applied. Search criteria were limited to humans. Furthermore, the bibliographies of the primary articles and the reference lists of all included studies were manually searched and reviewed for additional relevant studies. A manual search of the abstract databases of international conferences was also performed, including the Association for Medical Education, Scottish National Medical Education, Association for Simulation Practice in Healthcare (ASPIH), Developing Excellence in Medical Education and International Association for Medical Education conferences.
All identified studies’ titles, abstracts and full texts were screened independently by two authors (CO and AM) for eligibility, and studies that met the inclusion criteria were included, in accordance with the Cochrane Handbook for Systematic Reviews of Interventions [27].
Data were extracted independently by two authors (CO and AM) and then cross-checked; any discrepancies were resolved through further discussion involving the senior author (CB). There were no non-English articles that required translation. Data entered into an Excel sheet included the number of participants and their level of training, detailed randomization methods, the purpose and aim of the study, the clinical topic area, details of intervention and comparison group allocation, outcome measurement methods and the results.
In this review, studies were classified into two groups: High-Fidelity SBME versus No Intervention comparison (two groups randomized where one group uses SBME and the other has usual or traditional teaching) or High-Fidelity SBME versus Low-Fidelity SBME (two or more groups randomized where each group is using a different form of SBME).
Corresponding authors were contacted if data were missing. Data were analyzed using Review Manager Version 5.3 (Cochrane Collaboration, Oxford, UK). Results were expressed as standardized mean differences (SMDs) with standard deviations (SDs). For continuous outcomes, the SMD was chosen because knowledge and skills were measured on differing scales across the RCTs. A p-value < 0.05 (95% confidence interval [CI]) was considered statistically significant. Heterogeneity was measured using the I² statistic. A random-effects model was used throughout for meta-analysis.
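For readers less familiar with these methods, the calculations follow the standard inverse-variance approach; the formulation below is a general sketch rather than a restatement of this review’s exact computations (Review Manager computes the SMD as Hedges’ adjusted g and, for random-effects models, estimates the between-study variance with the DerSimonian–Laird method):

$$\mathrm{SMD}_i=\frac{\bar{x}_{i,\mathrm{SBME}}-\bar{x}_{i,\mathrm{control}}}{s_{i,\mathrm{pooled}}},\qquad s_{i,\mathrm{pooled}}=\sqrt{\frac{(n_{i,1}-1)\,s_{i,1}^{2}+(n_{i,2}-1)\,s_{i,2}^{2}}{n_{i,1}+n_{i,2}-2}}$$

$$\hat{\theta}_{\mathrm{pooled}}=\frac{\sum_{i} w_{i}\,\mathrm{SMD}_i}{\sum_{i} w_{i}},\qquad w_{i}=\frac{1}{\mathrm{SE}(\mathrm{SMD}_i)^{2}+\hat{\tau}^{2}},$$

where $\hat{\tau}^{2}$ is the estimated between-study variance. Under a fixed-effect model $\hat{\tau}^{2}=0$; the random-effects model used here allows the true effect to vary between studies.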
The literature search, summarized in the PRISMA diagram (Figure 1), initially generated n = 1025 studies; full-article review was performed for n = 44 studies, of which n = 17 met the inclusion criteria and were included in this review. Table 2 presents a summary of the included studies.
| Study (Author, year) | Participants | Intervention vs. Comparison | Outcomes and Assessment | MS* |
|---|---|---|---|---|
| No Intervention comparison RCTs (12) | | | | |
| Birdane 2012 | 130 Year-5 medical students | Nasco auscultation trainer and Smartscope™ (2 h) vs. routine theoretical and practical internship training | Knowledge: diagnosing real patients’ heart sounds on the ward, including MR, MS, PS, AR and VSD | 11.5 |
| Catumbela 2017 | 117 Year-3 medical students | Nasco auscultation trainer and Smartscope™ (6 h) vs. traditional ward-based teaching | Knowledge: identifying murmurs in real patients on a cardiology ward | 11.5 |
| Gauthier 2017 | 32 Year-1 medical students | Harvey® CPS (1 h) vs. standardized patient teaching | Skills + satisfaction: OSCE performance of cardiac examination + satisfaction questionnaire | 11.5 |
| Kronschnabl 2021 | 70 Year-3 medical students | CPS ‘K’ (75 min) vs. no supplemental teaching | Skills: cardiac examination of a volunteer | 14.5 |
| Martinez 2012 | 32 Year-5 medical students + 18 medical residents | SAM® CPS (2 h) vs. routine training | Knowledge: identifying 8 simulated heart sounds | 11.5 |
| Oddone 1993 | 56 medical residents | Harvey® CPS (8 h) incorporated into curriculum vs. no supplemental teaching | Skills: cardiac examination of 3 real patients | 12.5 |
| Penta 1973 | 30 medical students | CPS (4.5 h) incorporated into curriculum vs. no supplemental teaching | Skills: cardiac examination of 6 simulated scenarios | 11.5 |
| Scherer 2007 | 23 nurse practitioner students | MedSim® Eagle SM (0.5 h) vs. seminar and case study teaching | Knowledge: short answer question test | 11.5 |
| Sverdrup 2010 | 49 Year-3 medical students | CardioSim® Auscultation System (4 h) vs. bedside teaching | Skills: OSCE performance | 11.5 |
| Tiffen 2011 | 29 nursing students | VitalSIMKelly CPS (1 h) incorporated into curriculum vs. no supplemental teaching | Knowledge + satisfaction: written knowledge test + satisfaction questionnaire | 14.5 |
| Tuzer 2016 | 52 Year-4 nursing students | High-fidelity simulator vs. standardized patient teaching | Knowledge + skills: knowledge test + OSCE performance | 12.5 |
| Vural Dogru 2020 | 72 Year-1 nursing students | High-fidelity simulator vs. ‘traditional teaching method’ | Knowledge + skills: written knowledge test + OSCE performance | 14.5 |
| SBME comparison RCTs (5) | | | | |
| Champagne 1989 | 37 nurse practitioners | Heart Sim II [Atlantic Medical Systems Inc.] (0.5 h) with vs. without palpation | Knowledge: identifying 20 heart sounds | 15.5 |
| Chen 2015 | 60 nursing students | Infant BabySIM + Child PediaSIM [Medical Education Technologies Inc.] (0.5 h) vs. heart sounds only | Knowledge + skills: identifying 20 heart sounds + Likert scale | 11.5 |
| De Giovanni 2009 | 37 Year-3 medical students | Harvey® CPS (3 h) vs. CD of recorded sounds | Skills: cardiac examination in 5 real patients | 14.5 |
| Fraser 2011 | 86 Year-1 medical students | Harvey® CPS (2 h) MR teaching vs. SBME without murmur teaching | Skills: cardiac examination in MR patients | 11.5 |
| Friederichs 2014 | 143 pre-clinical medical students | Life/form Auscultation Trainer and Smartscope (Nasco) (0.5 h) vs. hybrid models | Satisfaction: satisfaction questionnaire | 11 |
*MS = MERSQI Score [28]
AR – aortic regurgitation, MR – mitral regurgitation, MS – mitral stenosis, OSCE – Objective Structured Clinical Examination, PS – pulmonary stenosis, VSD – ventricular septal defect
Twelve RCTs compared high-fidelity SBME with no intervention and five RCTs compared high-fidelity SBME with low-fidelity SBME.
Several different high-fidelity simulators were used in these studies. The most commonly used simulator interventions were:

• Harvey® CPS (four RCTs)

• Nasco auscultation trainer and Smartscope™ (three RCTs)
The following simulators were all used in one RCT each:
• CardioSIM® Auscultation System [24]
• CPS ‘K’ [35]
• Heart Sim II® [Atlantic Medical Systems Inc.] [36]
• Infant Baby SIM and Child PediaSIM [Medical Education Technologies Inc.] [37]
• MedSim® Eagle SIM [38]
• SAM® [Cardionics Inc., Texas, United States] [39]
• VitalSIMKelly [40]
Three RCTs did not specify which specific heart sound simulator they used [41–43].
Six RCTs assessed knowledge outcomes [33,37–40,42] in 307 learners (n = 157 in the SBME group vs. n = 150 in the no-intervention group). Meta-analysis showed a statistically significant difference in knowledge acquisition favouring the high-fidelity SBME group, with a pooled effect size of 1.39 (95% CI, 0.39–2.38; p = 0.006; I² = 92%; Figure 2).
Five RCTs assessed skills outcomes [24,30,35,36,41] in 236 learners (n = 125 in the SBME group vs. n = 111 in the no-intervention group) and were included in the meta-analysis. There was no statistically significant difference between the groups (−0.28; 95% CI, −1.49 to 0.93; p = 0.65; I² = 94%; Figure 2).
Four RCTs [31,32,41,43] were not included in the meta-analysis because they did not report standard deviations or reported outcomes only as numbers and percentages. Birdane et al. [32] reported that learners who underwent SBME demonstrated significantly greater accuracy in identifying a range of important clinical murmurs than learners who did not. Oddone et al. [31] reported that SBME was effective in improving learners’ ability to diagnose mitral stenosis compared with no supplemental teaching (15% improvement vs. 0% improvement). Penta and Kofman [41] reported that time spent with a high-fidelity SBME modality correlated positively with greater skills acquisition amongst learners. Vural Dogru and Zengin Aydin [43] found that median scores for students’ knowledge (p = 0.001) and skills (p < 0.001) were significantly better when taught by a high-fidelity simulator than with the traditional teaching method.
Two RCTs [30,40] assessed satisfaction outcomes in 61 learners (n = 31 in the SBME group vs. n = 30 in the no-intervention group). Both RCTs found that satisfaction with cardiac auscultation teaching was positive and higher in the SBME group than in the no-intervention group. No standard deviations were reported, so meta-analysis could not be performed.
Two RCTs [36,37] assessed knowledge outcomes, comparing high-fidelity SBME with low-fidelity SBME in a total of 81 learners (n = 38 in the high-fidelity SBME group vs. n = 43 in the low-fidelity SBME group). Meta-analysis showed no significant difference between the two groups, with a pooled effect size that favoured low-fidelity SBME (−0.73; 95% CI, −1.99 to 0.53; p = 0.26; I² = 86%; Figure 3).
Three RCTs assessed skills outcomes [23,29,37] in 135 learners (n = 64 in the high-fidelity SBME group vs. n = 71 in the low-fidelity SBME group). There was no significant difference between the two groups (0.32; 95% CI, −0.75 to 1.39; p = 0.56; I² = 89%; Figure 3).
One RCT compared two forms of SBME with satisfaction as an outcome. Friederichs et al. [34] compared hybrid SBME (hybrid models involve a human volunteer electronically outfitted with hardware and software to produce pathological heart sounds) with the use of manikins only. Learners reported greater satisfaction with the simulator in the hybrid SBME group than in the high-fidelity SBME-only group (83% vs. 64%). The reported overall benefit scores on a student questionnaire were not significantly different (88% vs. 87%). As only one RCT reported a satisfaction outcome for this comparison, meta-analysis was not possible.
No studies were excluded on the basis of methodological heterogeneity. There was a high degree (I² > 75%) of statistical heterogeneity in both assessed outcomes (knowledge and skills); therefore, a random-effects model was used.
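For reference, the I² statistic quoted above has a standard definition (the conventional formula, not anything specific to this review): it expresses the percentage of total variability across studies that is attributable to between-study heterogeneity rather than chance,

$$I^{2}=\max\!\left(0,\;\frac{Q-(k-1)}{Q}\right)\times 100\%,$$

where $Q$ is Cochran’s heterogeneity statistic and $k$ is the number of pooled studies. Values above 75% are conventionally interpreted as substantial heterogeneity, consistent with the use of a random-effects model here.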
The risk of bias was assessed using the Cochrane Collaboration’s RoB tool [27]. Most RCTs demonstrated good blinding of the outcome assessors and a low risk of selective reporting bias. However, reporting of random sequence generation, allocation concealment and blinding of participants was generally poor (Figure 4a and b).
The results of this review provide evidence to support high-fidelity SBME as an effective instructional approach for cardiac auscultation teaching compared with no intervention. However, its effectiveness appears comparable with that of low-fidelity SBME.
Knowledge acquisition amongst learners was significantly better with high-fidelity SBME compared with no intervention or usual teaching. These results agree with two other systematic reviews which suggest that SBME training increases clinical knowledge [21,22].
Butter et al. [44] compared third-year students who had been trained with high-fidelity SBME against untrained fourth-year students. The high-fidelity SBME-trained group demonstrated significantly greater knowledge of simulated heart sounds (93% vs. 73.9%; p < 0.001) and heart sounds in real patients (81.8% vs. 75.1%, p = 0.003). In a single-group study, Perlini et al. [45] compared final-year medical students before and after SBME-based training. Their work found significant improvement in the students’ knowledge after SBME-based training (72% vs. 11% baseline; p < 0.001).
There was insufficient evidence to support SBME over no intervention in skills acquisition, which also agrees with McKinney et al. [46], possibly because of the low number of RCTs in this field. Skill improvement may also require more time to reach a significant difference [41]. RCTs in this review measured skills with OSCE performance, whereas knowledge was tested with written test scores. The skills required to do well in an OSCE setting are more complex than knowledge of heart sounds alone: learners need to be trained to interact with a patient, communicate and show respectful professionalism while examining the correct anatomical areas and identifying any heart sounds. Time spent with the simulators ranged from 0.5 to 4.5 hours; more practice may be required to show a difference between SBME and usual teaching for skills transfer. In contrast, a test score requires only knowledge of the murmur, which is a considerable advantage for SBME over usual teaching. Once learners hear a murmur on a high-fidelity simulator, they are more likely to retain knowledge of it, especially if they have not had the opportunity to hear it in opportunistic bedside teaching [10,16].
Kern et al. [47] found that there was a significant improvement in cardiac auscultation technique (85% vs. 71%; p = 0.003) within an academic year in which students had SBME teaching compared with previous academic years in which SBME was not taught. Interestingly, the students had only 50 minutes of contact with the simulator. However, such cohort studies have the added advantage that they assess students in summative examinations contributing to their overall degree. Therefore, the students’ exam performance may be more reliable, as they have greater motivation to demonstrate the full potential of their learning [48]. RCT designs cannot ethically test students in a summative examination because the randomized groups are not treated equally. Therefore, RCTs often recruit students on a voluntary basis, and the outcome measurements (OSCEs/tests) are formative.
In this review, participants’ satisfaction was assessed in only a few RCTs, but the findings suggest that learners are highly satisfied with high-fidelity SBME compared with standard methods or no intervention. Participant-orientated satisfaction is an important aspect of any educational intervention, as learner engagement is key to achieving curricular outcomes through different educational approaches. Satisfaction may partly reflect a novelty factor if students have not used a simulator before; equally, increased enjoyment can improve learning [26].
Gauthier et al. [30] reported student feedback that SBME offered superior clinical findings compared with the use of standardized patients. Notably, 68.8% of students who had SBME training requested a combination of SBME and standardized patients in future learning. Several studies remind the reader that SBME should supplement clinical teaching with real or volunteer patients rather than aim to replace it.
Scherer et al. [39] reported student feedback that SBME allowed them to problem solve in a critical scenario without the stress of treating a real patient. Many stated that the experience helped them gain more confidence in decision making in clinical practice.
Our review shows no significant difference between the effectiveness of high-fidelity SBME and low-fidelity SBME. This could be crucial information for health education directors, as high-fidelity SBME is a more expensive teaching method than low-fidelity SBME.
Investigating this further could help training institutions save money by choosing the cheaper option of low-fidelity SBME to achieve similar learning outcomes. An additional financial benefit of SBME may be the costs avoided through students making fewer errors in clinical practice. Benefits such as these can be monetized and therefore included in a cost-effectiveness analysis. Benefits that are more difficult to quantify include more patient-centred care, increased empathy and an increased knowledge of relevant skills in the clinical environment [49].
The available data suggest that there is no significant advantage with high-fidelity SBME. Given cost and staff availability constraints, using SBME strategically is important for undergraduate and postgraduate health professional course directors.
The analyzed data in this review provide practical suggestions for educators. Firstly, integration of SBME into the curriculum may not be essential, contrary to the conclusions of existing literature [4]. In addition, the time spent with simulators may be important in the transfer of the skills learnt.
Friederichs et al. [34] reported that feedback from tutors, simulated patients and students was all positive with hybrid simulation, i.e. SBME and volunteer patient combined. Students reported a higher level of attention and seriousness during the class that was taught with the hybrid model than in the group taught with auscultation manikins. The standardized patients also responded positively and ‘felt accepted and respected’. This positive verbal feedback was reflected by satisfaction questionnaires.
Although it would be useful to compare the effectiveness of low-fidelity SBME against no intervention, there is a gap in the literature regarding this comparison. The only included RCT relevant to this comparison was Chen et al. [37], who found that low-fidelity SBME significantly improved knowledge compared with no intervention but did not significantly improve skills in cardiac auscultation. However, this study included a small number of participants.
There is evidently only a small number of high-quality RCTs investigating the effectiveness of SBME in cardiac auscultation teaching – a topic that should be of high interest to healthcare educators and learners. This has been highlighted by the Covid-19 pandemic, which resulted in increased pressure on senior clinicians and fewer elective presentations for students to gain exposure to. SBME is an effective modality for teaching students clinical skills, including cardiac auscultation, that do not take precedence in medical emergencies such as respiratory failure and cardiac arrest [50]. Ideally, future RCTs would be multi-institutional, have large sample sizes and allow more powerful statistical analysis.
Future studies should focus on comparing key instructional design features between simulations, or on comparing different types of SBME with other educational modalities, using rigorous and reproducible outcome measures. Assessing diagnostic skills in real clinical practice is the desired outcome to elucidate best practices for this expensive resource.
This systematic review and meta-analysis assimilates the available literature regarding the effectiveness of high-fidelity SBME in cardiac auscultation training for healthcare professionals within RCTs. The highest level of evidence (Level 1) is obtained from a systematic review of RCTs because it allows comparison of an intervention group with a non-intervention group representative of the population under investigation [27]. It also avoids the bias inherent in observational and non-randomized clinical trials. This review was conducted in accordance with PRISMA guidelines [25]. The search strategy was comprehensive and thorough, including multiple databases and manual searches of relevant conference proceedings. In addition, no language restriction was applied, allowing the inclusion of all available trials worldwide. The studies reviewed were conducted in different countries (Angola, Canada, Chile, Germany, Norway, Turkey, the United Kingdom and the United States); therefore, the findings may be generalizable. The authors of all included studies were contacted for any missing data. Data were methodically extracted using a custom-designed form, and meta-analysis was undertaken only where appropriate to generate summaries. The risk of bias was assessed using the standard Cochrane Collaboration risk of bias assessment tool, which helped facilitate estimation of the true effectiveness of the interventions [27].
This review is primarily limited by the low number of published RCTs. In addition, the RCTs identified had small sample sizes; therefore, the statistical power of our meta-analyses was limited. To mitigate this, all available conference abstracts and unpublished RCTs were included, and authors were contacted to obtain any missing data. Heterogeneity between studies was high, as they examined different populations, so a random-effects model was used.
This systematic review and meta-analysis showed that high-fidelity SBME is an effective teaching method for cardiac auscultation education. Compared with no intervention, SBME significantly increased learners’ knowledge and was associated with greater satisfaction, although the evidence for improved skills acquisition was inconclusive. Interestingly, there was no significant difference in knowledge or skills among learners when comparing high-fidelity simulation with low-fidelity simulation. Further high-quality research is needed to establish the effectiveness of different forms of SBME as educational interventions to enhance the teaching of cardiac examination for different healthcare learners.
Dr Craig Osborne – development of the idea, literature search, study screening, data collection, data analysis, quality-of-study assessment, writing up and submission.
Dr Craig Brown – discussion of the project idea, review and editing of the manuscript.
Dr Alyaa Mostafa – creation and development of the project idea, double-checking of the literature search and data collection, review of the analysis, editing and final approval of the manuscript.
None.
None.
None declared.
None declared.