"Never doubt that a small group of thoughtful, committed citizens can change the world. Indeed, it is the only thing that ever has."

Margaret Mead
Original article
peer-reviewed

What Can We Learn About Debriefing From Other High-Risk/High-Stakes Industries?



Abstract

Objectives: To explore the debriefing methodologies utilised in high-risk/high-stakes industries with the intention of informing healthcare practice.

Methods: A sequential mixed methods approach was adopted to explore the use of feedback/debriefing methodologies in ten different industries. A first-phase standardised survey focused on the timing of debriefing post-event, debriefing techniques used, duration of debrief provision, technology used, and follow-up provision. A second-phase observational study at one site (aerobatic team) explored debriefing in detail with a standardised tool and a semi-structured interview technique.

Results: This study highlighted variability in debriefing practices, style, and duration amongst ten high-risk/high-stakes industries. The second-phase observational study identified a highly effective method of self-directed, video-based auto-debriefing.

Conclusions: Debriefing after critical or routine events is common practice in a range of high-risk/high-stakes industries. A key theme was the recognition of the importance of debriefing to promote reflection to increase performance and mitigate risk. Further in-depth qualitative analysis of the adoption of these debriefing techniques into healthcare and simulated practice is warranted.

Introduction

Simulation-augmented education is now recognised as an educational tool that provides opportunities to improve professionals' performance and enhance patient safety [1]. A number of debriefing techniques are available, each incorporating to a differing degree three basic phases: description, analysis, and application to future events. A literature review of high-fidelity healthcare simulation indicated that feedback (debriefing) is the most important feature of simulation-based education [2]. To date, research has yet to determine the most effective debriefing styles and components to impact upon knowledge, skills, attitudes, and behaviours. Moreover, there is no evidence that one approach is optimal for all of those facets, as opposed to a composite debriefing approach.

Video-facilitated debriefing has been examined in recent experimental studies [2-7], which have reported inconsistent results. Grant and colleagues [2] concluded that video-facilitated simulation feedback is potentially useful in increasing desirable clinical behaviours in a simulated environment. Healthcare debriefing research [3-5] has previously focused on performance measurement rather than exploring the complexity or transferability of the learning experience; for example, in relation to human factors, situational awareness, error recognition, patient safety, or preparing the learner for repetitive reflective practice.

Additionally, structured debriefing after ‘real-life’ events/incidents is not common practice in the clinical healthcare environment throughout the United Kingdom (UK). In contrast, a host of other UK industries have for decades used debriefing after critical or routine events to learn from experience and improve performance [8].

It is evident that the concept of an individual or team having to gather information, process, analyse, decide, and act (or not) in a complex, continually changing environment [9] is not unique to the healthcare domain. Moreover, the paradigm of reflecting in, on, and after action [10] is the cornerstone of experiential learning, which is itself the construct at the heart of simulation education and is not exclusive to the healthcare arena. The concept of trying to achieve a desired outcome by repeating an event again and again, each time analysing how to improve, can be pictured easily in humanity’s early attempts to fly, or, earlier still, to make tools. Viewed through an evolutionary biology lens, this time frame could be extended markedly further.

This dynamic equilibrium of repeatedly experiencing events and attempting to make sense of how to improve was punctuated, and indeed rapidly accelerated, when the processes of how we make sense and learn, and how we facilitate that learning in others, came under the educational scrutiny and research prowess of pioneers such as Dewey [11], Schön [10], Kolb [12], and others. If one returns to the more commonly used terms of feedback, performance analysis, or coaching to capture what we may consider debriefing, there is perhaps much we can learn from decades of post-experience analysis by other industries that strive to improve performance, mitigate risk, and ultimately, improve safety. The purpose of this study was to explore debriefing activities in high-risk industries with the intention of informing healthcare practice.

Materials & Methods

A sequential mixed methods approach was adopted to explore the use of feedback and debriefing methodologies in ten different industries. The aims of the study were two-fold: 1) to identify the types of feedback or debriefing methodologies utilised for post-event analysis and performance development within high-risk industries, and 2) to explore the use of feedback/debriefing methodologies for post-event analysis and development at one site.
The first phase of the study involved surveying the current feedback or debriefing strategies utilised for post-event analysis and performance development. The second phase explored one industry in detail. Ethical permission was sought from the Royal Manchester Children’s Hospital Research & Development Department and deemed not necessary by the Committee.

Phase one

Ten industries that employ debriefing techniques to improve the performance of their workforce were purposively selected and invited to participate in the study. An individual who had participated in debriefing (for example, a trainer, coach, instructor, or officer) was identified in each industry. All volunteers agreed to participate in a structured telephone interview. The interview schedule focused on four aspects of feedback/debriefing to ascertain the strategies typically employed at each of the 10 sites (Aim 1). The four aspects discussed were:

1) Timing of the debriefing session post-event (for example, flight, performance, operation, mission),
2) Debriefing techniques used,
3) Duration of the debriefing event,
4) Follow-up provision.

All interviews were transcribed verbatim and analysed using thematic network analysis [14].

Phase two

One professional aerobatic display team voluntarily agreed to participate in this study. An observational study was used to explore in detail the feedback/debriefing methodologies employed for post-event analysis and development in one high-risk industry that had participated in Phase One (Aim 2).

Data collection included:

1) flight observation,
2) digital images, 
3) interviewer rating of the debrief using the rater version of the DASH tool [15],
4) a semi-structured interview.

Interview participants were blinded to the DASH score at the time. Interview transcripts were analysed using thematic analysis. Field notes and photographs were recorded with permission, allowing triangulation of all data. All data were stored in accordance with the Medical Research Council’s Good Research Practice guidance [16].

The DASH tool [15] rates the following elements:

1) How the stage is set to enable an engaging learning environment,
2) How the debriefer maintains an engaging context for learning,
3) Structure of the debriefing,
4) How the debriefer provokes interesting and engaging discussions and fosters reflective practice,
5) Whether performance gaps have been identified,
6) Whether performance gaps have been closed.

Each element was then rated using the following seven-point Likert scale:

1 - Extremely Ineffective/Abysmal
2 - Consistently Ineffective/Very Poor
3 - Mostly Ineffective/Poor
4 - Somewhat Effective/Average
5 - Mostly Effective/Good
6 - Consistently Effective/Very Good
7 - Extremely Effective/Outstanding.

Results

Phase one

One participant from each of the 10 high-risk industries agreed to participate in the structured telephone interview. The interviews lasted between 10 and 20 minutes. Results of the thematic network analysis [14] of the interview transcripts are summarised in Table 1.

Table 1: Feedback/debriefing strategies reported by the 10 high-risk/high-stakes industries.

Industry | Timing of the Debrief Session | Debriefing Technique | Duration of Debrief | Follow-up Provision
Airline | Immediate post-flight | Reflective cycle | 5-10 minutes | Verbal follow-up
International rugby team | Immediate post-game | Didactic, key points addressed | 2-5 minutes | In-depth review 48 hrs later, verbal, and review of video
Olympic cycling team | After 24 hours | Cognitive interviews | 30-60 minutes | Verbal & written follow-up
Premier League professional football | At 24-48 hours | Reflective cycle | 15-20 minutes with video | Verbal reinforcement
Nuclear industry | At greater than 24 hours | Structured in-depth analysis | Variable, minutes to many hours | Verbal and written
Mountain rescue team | Immediate | Didactic, key points addressed | 10 minutes | No follow-up
Army regiment | Immediate after action | Focused directly on behaviours (including self-scoring) | 30 minutes | Verbal
Fire service training | Immediate and at 24 hours post-event | Reflective, stop and start, all key points addressed | 30-60 minutes | Written
Police service | Immediate | Reflective cycle | 30 minutes | Verbal and written
Aerobatic team | Immediate | Guided auto-debriefing | 30-60 minutes | Verbal, written, video and practice with models

Phase two

The members of the aerobatic flight team agreed to participate in the observational study. The team members initially undertook a live aerobatic flight display. Debriefing occurred immediately after each flight, in a dedicated area, and followed a standard reflective cycle methodology facilitated by the squadron leader.

The debriefing sessions lasted between 30 and 60 minutes (considerably longer than the flights themselves). The immediate post-flight debrief was observed and scored by one investigator, with nine years of debriefing experience, using the DASH tool [15]. The scoring of this debrief is presented in Table 2.

Table 2: DASH ratings of the observed post-flight debrief.

Debriefing Assessment for Simulation in Healthcare (DASH)© Element | Rating score (1 - Extremely Ineffective/Abysmal to 7 - Extremely Effective/Outstanding)
1. Sets the stage to enable an engaging learning environment | 7 - Extremely effective/outstanding
2. Maintains an engaging context for learning | 7 - Extremely effective/outstanding
3. Structures debriefing in an organized way | 7 - Extremely effective/outstanding
4. Provokes interesting and engaging discussions and fosters reflective practice | 7 - Extremely effective/outstanding
5. Identifies performance gaps | 7 - Extremely effective/outstanding
6. Helps close performance gaps | 7 - Extremely effective/outstanding

The debrief experience scored extremely effective/outstanding on all six elements of the DASH Tool, from setting the stage for an engaging learning environment to helping close performance gaps. The process has been summarised according to the key components of the DASH tool [15]. The debriefing process has been summarised pictorially in Figure 1 from the concrete flight experience through to pre-flight active experimentation.

No formal debrief training was reported; the organisation adopted a systematic 'grandfathering' approach, requiring at least two years as a team member before an individual facilitated the debrief. The importance of the facilitator's skill in considering emotions and guiding reflection on action through effective auto-debriefing was evident.

Setting the stage to enable an engaging learning environment:

The debriefer set the stage for an engaging learning environment: the setting was a relaxed lounge, and the facilitator (team leader) was positioned off-centre and projected a clear emphasis on advocacy, inquiry, and the promotion of post-event auto-debriefing.

Maintains an engaging context for learning:

The debriefing occurred immediately post-flight, in a specific area, and followed a standard reflective cycle methodology facilitated by the squadron leader. There was a clear emphasis on advocacy, inquiry, and the promotion of post-event auto-debriefing.

Structure of the debrief:

The use of video replay enabled the pilots to review the entire flight, featuring complex manoeuvres (Figure 1, Image 2). 

Provoking interesting and engaging discussions and fostering reflective practice:

There was clear evidence of constructive alignment of the debrief to explicit team learning objectives, particularly with reference to spatial positioning and reference points of the display manoeuvres. 

Performance gap identification:

Each pilot was then encouraged to verbalise positive and negative aspects of their performance in alignment with the specific learning objectives (Figure 1, Image 3).

Closing the performance gaps:

Following full flight video replay, the post-debrief follow-up featured further face-to-face discussions, the use of models/blackboards to reference positions, and actions to allow closure of performance gaps (Figure 1, Image 4).

One limitation of this study is the use of a single observer to rate the debriefing performance. In addition, although the observer had experience of adaptations of the DASH tool (acknowledged on the DASH website) [15], this observer had received no formal training in the use of this tool.

Discussion

This study highlighted variability in debriefing practices, style, and duration amongst these 10 high-risk/high-stakes industries. Many authors have identified effective facilitated debriefing sessions as a key step in promoting deep insight and reflection to enhance learning and performance in healthcare [2-9]. Additionally, the literature suggests that debriefing is a social process in which an understanding of emotion and thought processing is a vital element. Current healthcare debriefing literature acknowledges the importance of reflection in increasing the retention of knowledge, skills, and attitudes [2-9]. This study highlighted that high-risk/high-stakes debriefs are frequently undertaken immediately post-event, with some industries utilising more structured analysis supported by video technology. Both the sports teams and the aerobatic display team utilised video technology to focus learning and develop future strategies or performance improvements. There has been a reported increase in the use of supportive video technology in healthcare debriefing, but its effectiveness and optimal timing are yet to be determined [3].

It is important to emphasise that the debriefing methodologies described by the high-risk/high-stakes industries in Phase One of this study were based on live events. Whilst debriefing in healthcare is accepted as the most important element of simulated learning [2-3, 8], it is not yet commonplace in the clinical healthcare environment. Thus, it is possible that healthcare could learn from research into other high-risk/high-stakes industries to improve debriefing in the clinical healthcare environment as well as in simulated practice.

This observational study identified a highly effective method of self-directed, video-based auto-debriefing. The debrief methodology was rated extremely effective/outstanding (score of 7/7) in each of the six components of the DASH tool [15]. Daily familiarisation, experienced debriefers, standardised procedures, and constructively aligned learning objectives contributed strongly to effective auto-debriefing. The adoption of this debriefing style, built upon a safe culture of self-reflection, with team members exploring their own performance gaps with minimal facilitation, may be significantly beneficial in simulated learning environments. Moreover, one could postulate that a drive to debrief, with the goal of high-quality self-reflection and closure of performance gaps, is required in real-life clinical settings, in addition to simulation, to enhance patient care.

Conclusions

This study has explored debriefing activities in high-risk/high-stakes industries with the intention of informing healthcare practice. The sequential mixed methods study has indicated that debriefing after critical or routine events is common practice in a range of high-risk/high-stakes industries, sports, and civil aviation. Debriefing in these industries is undertaken in order to learn from experiences and improve performance and safety. Variability in practices, style, and duration exists amongst these industries. Further examination of debrief activities in high-risk/high-stakes industries is required to determine how best practices from these arenas can be translated into improving patient safety within healthcare practice and simulated learning.


References

  1. Issenberg SB, McGaghie WC, Petrusa ER, Lee GD, Scalese RJ: Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005, 27:10-28.
  2. Grant JS, Moss J, Epps C, Watts P: Using video-facilitated feedback to improve student performance following high-fidelity simulation. Clinical Simulation in Nursing. 2010, 6:e177-e184.
  3. Byrne AJ, Sellen AJ, Jones JG, et al.: Effect of videotape feedback on anaesthetists' performance while managing simulated anaesthetic crises: A multicentre study. Anaesthesia. 2002, 57:176-179.
  4. Bond WF, Deitrick LM, Arnold DC, et al.: Using simulation to instruct emergency medicine residents in cognitive forcing strategies. Acad Med. 2004, 79:438-46.
  5. Savoldelli GL, Naik VN, Park J, Joo HS, Chow R, Hamstra SJ: Value of debriefing during simulated crisis management: oral versus video-assisted oral feedback. Anesthesiology. 2006, 105:279-85.
  6. Scherer LA, Chang MC, Meredith JW, Battistella FD: Videotape review leads to rapid and sustained learning. Am J Surg. 2003, 185:516-20.
  7. Schiavenato M: Reevaluating simulation in nursing education: Beyond the human patient simulator. J Nurs Educ. 2009, 48:388-94.
  8. Fanning RM, Gaba DM: The role of debriefing in simulation-based learning. Simul Healthc. 2007, 2:115-25. 10.1097/SIH.0b013e3180315539
  9. Schön DA: Knowing in action: The new scholarship requires a new epistemology. Change. 1995, Nov/Dec:27–34. Accessed: 15 April 2014: http://dsmgt310.faculty.ku.edu/SuppMaterial/SchonEpistofPractice.htm.
  10. Schön D: Educating the Reflective Practitioner. Jossey-Bass, San Francisco; 1987.
  11. Dewey J: How We Think. D C Heath, New York; 1933.
  12. Kolb DA: Experiential Learning. Prentice Hall, Englewood Cliffs, New Jersey; 1984.
  13. Ritchie J, Spencer L: Analyzing qualitative data. Qualitative data analysis for applied policy research. Bryman A, Burgess RG (ed): Routledge, London; 1994. 173-194.
  14. Brett-Fleegler M, Rudolph J, Eppich W, Monuteaux M, Fleegler E, Cheng A, Simon R: Debriefing assessment for simulation in healthcare: development and psychometric properties. Simul Healthc. 2012, 7:288-94. 10.1097/SIH.0b013e3182620228
  15. Debriefing Assessment For Simulation In Healthcare© (DASH©). Accessed: 14 April 2014: http://www.harvardmedsim.org/debriefing-assesment-simulation-healthcare.php.
  16. MRC ethics series Good research practice: Principles and guidelines. (2010/2011). http://www.mrc.ac.uk/news-events/publications/good-research-practice-principles-and-guidelines/.

Author Information

Ralph MacKinnon, Corresponding Author

Department of Paediatric Anaesthesia & North West and North Wales Paediatric Transport Service, Royal Manchester Children's Hospital, UK

For correspondence: ralph.mackinnon@cmft.nhs.uk

Suzanne Gough

Health Professions Department, Manchester Metropolitan University, UK


Ethics Statement and Conflict of Interest Disclosures

Conflicts of interest: The authors have declared that no conflicts of interest exist.


Article Information

Published: April 25, 2014

DOI: 10.7759/cureus.174

Cite this article as: MacKinnon R, Gough S (April 25, 2014) What Can We Learn About Debriefing From Other High-Risk/High-Stakes Industries?. Cureus 6(4): e174. doi:10.7759/cureus.174

Publication history

Received by Cureus: April 18, 2014
Peer review began: April 18, 2014
Peer review concluded: April 24, 2014
Published: April 25, 2014

Copyright

© Copyright 2014 MacKinnon et al. This is an open access article distributed under the terms of the Creative Commons Attribution License CC-BY 3.0, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
