The Science Behind Simulation

A newly promoted captain is en route with his crew to an apartment complex fire. The 911 caller had reported a large amount of smoke coming from one apartment and people trapped on the second floor. Suddenly, the captain realizes that he is going to be the incident commander (IC).

Nearing the scene, the captain sees a column of smoke and becomes incredibly anxious. He stammers over the radio, “Engine 3 is out and is command!”

Once the engine pulls up to the building, the captain exits the vehicle and tries to get his bearings. He walks around to the side of the apartment building and sees the first-arriving engine trying to place a ladder to rescue the victims on the balcony. He knows he has to get someone inside with an attack line–and fast.

The arriving units now start stacking up on the captain, ready for assignments. “Engine 6, you’ll be Division 2, along with Engine 2; Truck 4, you’ll be ventilation,” he says over the radio. He needs a second alarm, but he forgot to call for it earlier.

The fire is getting out of control, and interior units aren’t making progress. The ventilation group isn’t even on the roof yet. Victims have been rescued from the balcony, but there aren’t enough resources to search the other apartments and fight the fire.

The captain keeps the plan he started with, even though he knows he doesn’t have enough resources. He can’t think of anything else to do because he’s never had to handle anything like this before.

Then, a unit on the second floor calls a mayday.

The instructor pushes pause. This isn’t an actual fire; it’s a simulation. The only thing that’s real is the captain’s stress level. His hands are still shaking, and sweat is soaking through his shirt. His brain thought it was real, and the resulting physiological response hampered his ability to make decisions.

Understanding Decision Science
Traditional decision-making takes place in a static environment where decision-makers have vast amounts of information and resources at their disposal–personal and professional networks, communication technology and the Internet–and no time pressure, allowing them to evaluate multiple options.

Fireground ICs make decisions in a significantly different environment. They must recognize and appropriately respond to a highly dynamic environment, using their experience and knowledge to coordinate and assign an array of necessary tasks. Their decisions must occur with precision, despite the fact that they’re typically made with incomplete information, insufficient resources and under significant time constraints. In short, ICs can’t just push a “pause button.”

Research shows that while making job-related decisions at structural and/or wildland fire incidents, fireground ICs use naturalistic decision-making (NDM) processes based on their personal knowledge and past firefighting and incident command experience.1 Other studies concur, establishing that ICs use their firefighting experience to make rapid and highly effective decisions by recognizing and responding to situational cues during incident mitigation operations.2,3 One study found that fireground ICs begin the NDM process by completing an initial assessment of the emergency scene and then evaluating it for familiar patterns.4 Further, according to my own research (Kurt Hall), which will be addressed later, any recognized patterns are used to establish initial goals and then assign the tasks necessary to mitigate the incident, based upon past experience in similar situations.5

Following initial assessment and establishment of the action plan, the IC must continuously assess the environment and accomplished tasks to measure the success or failure of the current plan and anticipate possible required action(s).6 Continuously assessing the environment supports situational awareness, which in turn supports the NDM process employed by ICs. In fact, the loss of situational awareness can be catastrophic to individuals working in these environments because decision-makers may be slower to identify problems and cannot respond to them in an effective fashion.7

With all this in mind, it seems clear that fireground incident command training programs should focus on the incident command system, correct decision-making and situational awareness.

Simulation-Based Training Programs
Dr. Steve Kozlowski, a professor at Pennsylvania State University, has focused his research on the dynamic systems that exist within teams and organizations. His research shows that to have a successfully coordinated outcome, teams operating in the NDM environment must have strong adaptive capabilities to properly assess the situation, the team’s performance and the proper timing. He writes, “When one asks how these desired capabilities can be enhanced, the attention naturally turns to training. Yet traditional training systems are not well equipped to address these concerns.”8 In short, traditional classroom environments are not effective in teaching teams how to operate in the NDM environment.

It is therefore important to develop experiential learning environments. This type of learning happens during real-life experiences–but also when the team is placed in a simulated environment that’s realistic enough for the brain to temporarily suspend disbelief, and when the environment realistically changes based on decisions the team makes.

Enter computer gaming-based simulation systems. These systems allow us to place teams into environments where the virtual world is real enough for teams to interact with the simulated environment. If a team member hits a window with an axe, it opens; if they apply water to a fire, it’s extinguished; and if they forget to provide ventilation, then the entire team suffers through low visibility and high heat. The realism of the team environment is a key part of experiential learning, where each team member’s decisions affect the team as a whole. Such simulations put teams in real situations without exposing them to any actual danger.

Along with realistic simulated environments, feedback is key to successful experiential learning. To gain experience, every decision and action must have a consequence, and realistic simulators can provide a majority of this feedback. However, simulators will never show all the consequences of the real world, so proper instruction and critique are vital to the success of this training.

Our department, the Allen (Texas) Fire Department, uses simulation-based training that focuses on the basics of fireground organization. First, each team member learns how their actions affect the entire team. Those lessons are then transferred to the next session, which is incrementally more complicated, and this continues until the teams are operating on a simulated multi-family dwelling with significant fire, multiple victims and a mayday.

The instructors are tasked not only with taking notes for the critique, but also with providing real-time instruction to the team members. We find that students grasp the concepts more quickly when instructors coach them on the proper tactic. Students can then perform the action again, this time correctly–a more effective way of gaining experience than learning from mistakes alone.

Research Study
Despite a proven track record in industries like aviation, simulation hasn’t caught on as much in the fire service, partly because some leaders continue to doubt that it can be effective. Accordingly, for my doctoral dissertation, I (Kurt Hall) set out to test whether simulation was effective in training fire service personnel. I predicted that computer-based simulation could be used to create environments where participants gained knowledge and experience without having to face the hazards associated with live-fire incidents or training.

The research design is a standard pretest/post-test comparison-group design, well established in educational research. Two groups are first tested to establish baseline scores; one group then receives the training while the other does not. Both groups are retested, and the changes in their scores are compared to determine the effectiveness of the training.9
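As a rough illustration of that design’s logic–using made-up scores, not the study’s data–the comparison amounts to contrasting each group’s average score gain:

```python
from statistics import mean

def mean_gain(pre, post):
    """Average per-participant score change from pre-test to post-test."""
    return mean(po - pr for pr, po in zip(pre, post))

# Hypothetical benchmark scores (0-100 scale) for illustration only.
treatment_pre   = [55, 60, 52, 58]
treatment_post  = [68, 71, 66, 70]
comparison_pre  = [56, 59, 54, 57]
comparison_post = [57, 58, 55, 58]

treatment_effect = mean_gain(treatment_pre, treatment_post)    # 12.5
comparison_effect = mean_gain(comparison_pre, comparison_post)  # 0.5

# The training is credited with the difference in average gains;
# a real analysis would also test whether that difference is
# statistically significant (e.g., with a t-test).
print(treatment_effect, comparison_effect)
```

Because the untrained group is retested on the same instrument, its gain (if any) captures retest familiarity, leaving the remaining difference attributable to the training.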

The two fire departments used for the study–the Allen and McKinney fire departments–were similar in their size, equipment operated, services provided and personnel training program philosophies. The participants were experienced firefighting personnel who, by job description (driver/engineers, captains and battalion chiefs), were trained and served as fireground ICs. The 66 participants were divided into two groups–33 in the treatment group (Allen) and 33 in the comparison group (McKinney).

A single quantitative instrument served as both the pre-test and post-test measurement. NFPA 1561: Standard on Emergency Services Incident Management System and the National Incident Management System (NIMS) were used to create a score sheet measuring how participants performed against the standards. A point value was assigned to each benchmark, and participants received credit for each item they completed. The final score was the sum of the points, with 100 points possible.

Each participant from both the treatment (Allen) and comparison (McKinney) groups completed the pre-test using the computer-based simulator. Three fire service professionals, each having considerable fireground incident command experience, scored the pre-test. Each of the individual scores from the evaluators was averaged to obtain a single score for each participant.
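The scoring described above reduces to two steps: sum the point values of the benchmarks a participant completed, then average the three evaluators’ totals into one score. A minimal sketch, using invented benchmark names and point values (the real sheet was built from NFPA 1561 and NIMS):

```python
from statistics import mean

# Hypothetical benchmarks and point values, illustrative only;
# the study's actual sheet totaled 100 possible points.
BENCHMARKS = {
    "establishes command": 20,
    "completes 360 size-up": 20,
    "assigns divisions/groups": 20,
    "requests additional alarms": 20,
    "tracks accountability": 20,
}

def score_sheet(completed):
    """Sum the point values of the benchmarks a participant completed."""
    return sum(BENCHMARKS[item] for item in completed)

# Three evaluators score the same simulated run; their totals
# are averaged into a single score for the participant.
evaluator_totals = [
    score_sheet({"establishes command", "completes 360 size-up",
                 "assigns divisions/groups"}),
    score_sheet({"establishes command", "completes 360 size-up",
                 "assigns divisions/groups", "tracks accountability"}),
    score_sheet({"establishes command", "assigns divisions/groups",
                 "tracks accountability"}),
]
participant_score = mean(evaluator_totals)  # (60 + 80 + 60) / 3
```

Averaging across independent evaluators smooths out individual scoring judgment calls, which matters when benchmarks such as “completes 360 size-up” require interpretation.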

Then, over the course of 7 months, the treatment group (Allen) participated in a fireground incident command training program using a computer-based simulation program. The comparison group (McKinney) did not receive the training program.

After the treatment group (Allen) had completed the training program, all study participants from both groups completed a post-test using the same instrument, computer-based simulation and instrument scorers as were used in the pre-test. Both the pre- and post-test data were coded, tabulated and analyzed statistically.

The results: The participants in the treatment group (Allen) showed a statistically significant average score increase of 12.54 points, or approximately 22 percent, whereas the comparison group (McKinney) did not. Further, the statistical analysis established a correlation between the computer-based simulation training program and the increase in the treatment group’s (Allen) post-test scoring.

After seeing the results, it is clear to us that the simulation training program that the Allen Fire Department underwent made a significant difference in the participants’ ability to make sound decisions in a dynamic environment with significant time constraints and considerable stressors. In short, this study shows just how powerful simulation-based training could be for the fire service if more departments implemented these programs.

Simulation has already proven successful in the military, aviation and medical industries, so imagine how this training could impact fireground decision-making in real-world environments. After all, training officers around the world have long been frustrated with the difficulty of training firefighters. But we now have a tool that could dramatically change the way we train. Put simply, years of experience may no longer take years.

The authors have reported no conflicts of interest with the sponsor of this supplement. Their department uses FLAME-SIM software.


  1. Klein, G, Calderwood, R, Clinton-Cirocco, A. Rapid decision making on the fire ground. Army Research Institute for the Behavioral and Social Sciences. 1986.
  2. Klein, G, Calderwood, R. Decision models: some lessons from the field. IEEE Trans Syst Man Cybern. 1991;21(5):1018–1026.
  3. Lipshitz, R, Klein, G, Orasanu, J, et al. Taking stock in naturalistic decision making. J Behav Decis Mak. 2001;14(5):331–352.
  4. Klein, G. An overview of naturalistic decision making applications. In Naturalistic Decision Making. Zsambok, CE, Klein, G (Eds.). Lawrence Erlbaum: New Jersey. 1997.
  5. Hall, KA. The effect of computer-based simulation training on fire ground incident commander decision making. PhD dissertation. University of Texas at Dallas. 2010.
  6. Chapman, T, Nettelbeck, T, Welsh, M, et al. Investigating the construct validity associated with microworld research: a comparison of performance under different management structures across expert and non-expert naturalistic decision making groups. Aust J Psychol. 2006;58(1):40–47.
  7. Endsley, MR, Kiris, EO. The out-of-the-loop performance problem and level of control in automation. Hum Factors. 1995;37(2):381–394.
  8. Kozlowski, SWJ. Training and developing adaptive teams: theory, principles, and research. In Making Decisions Under Stress: Implications for Individual and Team Training. Cannon-Bowers, JA, Salas, E (Eds.). American Psychological Association: Washington, DC. 1998:115–154.
  9. Creswell, JW. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches, 3rd ed. Sage: Thousand Oaks, Calif., 2009.

