The National Institute for Computational Sciences

Earth, Wind, & Fire

A Computer Model Combination Could Unravel the Mysteries of Sudden Firestorms

The thought of a wildfire evokes images of firefighting crews, airplanes, and helicopters battling the blaze. But fire analysts and managers have other weapons in their arsenal to plan the attack before sending resources out. Among those are computer-model wildfire forecasts.

A scene from the Moore Branch fire in East Texas, which occurred in September 2000 and consumed 15,864 acres. Using advanced computing resources from the Extreme Science and Engineering Discovery Environment (XSEDE), researchers conducted a preliminary evaluation of a coupled fire–atmosphere computer model through idealized tests and an examination of data from the Moore Branch fire. Such a model, or approximation of real-world scenarios and conditions, could lead to the ability to accurately forecast sudden fire escalations that result from interactions between a wildfire and the atmosphere. [Image credit: Texas A&M Forest Service]

“Models are great tools. They provide more information to fire managers for decision support, and that’s what it all comes down to,” says Brad Smith, a wildland fire analyst with the Texas A&M Forest Service, which provides statewide training, equipment, and support to local firefighting services and governments.

For many years, Smith has been involved with teams strategizing to suppress some of the biggest wildfires across Texas and the nation, and so he’s familiar with the important questions that wildfire computer model forecasts can help answer: What do we need to do with our resources? How long will the battle take? What are the chances of success? What is the potential for the fire to move into values at risk, whether a community, homestead, or something else?

Smith sees the models as another important input for decision support, along with first-hand accounts, reports from a network of weather stations, reports on vegetation (fuels for the fire), maps, and other data. And although not everyone in firefighting yet believes in using computer-model data to make fire-management decisions, Smith says acceptance is growing.

Computer Models 'Talking' to Each Other

In the research realm, efforts have long been aimed at improving the models to achieve more accurate wildfire forecasts. One way scientists believe they can do that is by combining a fire-behavior model with an atmospheric one. Historically, that goal has been thwarted by the prohibitive cost of advanced computing, but that is changing.

With the help of some of the most powerful supercomputers in the world from the Extreme Science and Engineering Discovery Environment (XSEDE), Nathan Dahl, a researcher with the Rosenstiel School of Marine & Atmospheric Science at the University of Miami; Haidong Xue and Xiaolin Hu of the Department of Computer Science at Georgia State University; and Ming Xue of the Center for the Analysis and Prediction of Storms (CAPS) at the University of Oklahoma conducted a preliminary evaluation of the performance of a coupled model.

A coupled model could allow calculations of the interactions, or “feedbacks,” between the fire and atmosphere to reveal the hidden factors that can unexpectedly produce firestorms that spread the fire in unanticipated ways, according to Dahl.

The project he led involved running idealized tests and examining data from the September 2000 Moore Branch fire in East Texas. For the investigation, the researchers combined the Discrete Event System Specification Fire model (DEVS-FIRE) with the Advanced Regional Prediction System (ARPS) atmospheric model.

DEVS-FIRE originally calculated only the spread direction and spread rate of a fire, but Dahl and his colleagues added the ability to compute how much surface heat the fire released at different geographical points as it advanced. They then fed those heat values into the ARPS model.

“As ARPS responds to that heat release by generating updrafts and other atmospheric currents, the near-surface winds from that adjustment were fed back into DEVS-FIRE, and those changes in wind speed and direction affected how the fire spread,” Dahl explains. “Basically, the process just continues to cycle, where DEVS-FIRE updates based on the wind speed and direction, calculates how much heat was generated during the last update, and feeds that heat into ARPS; and then ARPS adjusts to that, calculates the change in wind speed and direction, feeds it back to DEVS-FIRE, and the process continues.”

In that manner, the models exchange fire and atmospheric information to predict the spread of a wildfire.
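
In schematic terms, the coupling is a loop in which each model's latest output becomes the other's next input. The short Python sketch below illustrates only that exchange; the object and method names (fire_model.step, atmosphere.apply_surface_heat, and so on) are hypothetical placeholders, not the actual DEVS-FIRE or ARPS interfaces.

    # Schematic sketch of the fire-atmosphere coupling cycle.
    # The fire_model and atmosphere objects are hypothetical stand-ins,
    # not the real DEVS-FIRE or ARPS programming interfaces.

    def run_coupled_forecast(fire_model, atmosphere, n_cycles, dt):
        """Alternate fire and atmosphere updates so that each model's
        output drives the other's next step."""
        winds = atmosphere.surface_winds()  # current near-surface wind field

        for _ in range(n_cycles):
            # 1. Advance the fire using the latest wind speed and direction,
            #    recording how much surface heat was released, and where.
            perimeter, heat_flux = fire_model.step(winds, dt)

            # 2. Feed that heat into the atmospheric model, which responds
            #    with updrafts and adjusted near-surface winds.
            atmosphere.apply_surface_heat(heat_flux)
            atmosphere.step(dt)
            winds = atmosphere.surface_winds()

        return perimeter

The essential design point is that neither model runs to completion on its own: the two advance in short, alternating steps, so the feedbacks Dahl describes have a chance to develop and influence the forecast.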

Dahl and his colleagues chose the Moore Branch fire as the test case for the project because of what Dahl described as a very good fuel-survey dataset from the forest service, one that captured the properties of the field grass (amounts, distribution across the land, and moisture content) a few days before the fire. In addition, a summary of the fire's behavior and the reported weather conditions at the time were available for reference.

However, the researchers also were drawn to something interesting that happened during the fire—an unanticipated escalation. Although the background winds were generally light, “extreme” fire behavior was noted on days four and five, suggesting that fire–atmosphere feedbacks played a significant role, Dahl says.

According to Dahl, the signs are often obvious when a fire is going to be severe, and in those situations uncoupled models will do as well as coupled ones. Coupled models, however, could reveal the hidden conditions that lead to the type of escalation that unfolded at Moore Branch.

Good Results, but More Work Remains

The findings in this project are favorable so far.

“Preliminarily, this coupled model captures the sorts of feedbacks that we want it to capture, and at least in this first case we looked at, it dramatically improved the quality of the fire-spread forecast in a way that, had it been available operationally, would have been really helpful to the firefighting personnel who were trying to combat this blaze,” says Dahl. “At this point, our findings are that the model is good enough to continue to analyze [through further cases and further tests].”

The XSEDE supercomputers providing support to the project were the Kraken and Gordon systems. Kraken, managed by the National Institute for Computational Sciences (NICS), was decommissioned last year. It was once the most powerful computer in academia and a workhorse for National Science Foundation (NSF) research. Gordon, managed by the San Diego Supercomputer Center (SDSC), is the first system built specifically for the challenges of data-intensive computing. XSEDE is a single, virtual system funded by NSF for scientists to interactively share resources, data, and expertise. People around the world use XSEDE’s resources and services—things like supercomputers, collections of data, and new tools—to improve our planet.

“We were able to run on hundreds, sometimes thousands, of processors, all working in tandem to solve the various equations, which are quite complex, both for the heat release from DEVS-FIRE and more particularly, the atmospheric response in ARPS,” Dahl explains. “Running all these processes in parallel on the XSEDE resources was the only way this problem was even doable in any reasonable amount of time; otherwise, it would have taken years, and instead we were able to run it in a matter of hours.”

XSEDE’s framework for submitting batch jobs enabled the researchers to ensure that the two models were running simultaneously, which was critical to the project, Dahl says.

He also was complimentary of the project support provided by XSEDE staff. Changing from one supercomputer to another, he explains, can be a challenge in terms of learning how the batch queues work and getting the programming syntax and scripts right. “They were helpful in providing samples and scripts and even looking over my scripts to try to see if there was anything I was using on a previous machine that I shouldn’t use on this machine. The turnaround in terms of answering those types of questions was very good,” he says.

The details of the investigation are contained in a paper titled “Coupled fire-atmosphere modeling of wildland fire spread using DEVS-FIRE and ARPS” published online on Feb. 8, 2015, in Natural Hazards, Journal of the International Society for the Prevention and Mitigation of Natural Hazards.

Scott Gibson, science writer, NICS, JICS

Article posting date: 20 April 2015

About JICS and NICS: The Joint Institute for Computational Sciences (JICS) was established by the University of Tennessee and Oak Ridge National Laboratory (ORNL) to advance scientific discovery and leading-edge engineering, and to further knowledge of computational modeling and simulation. JICS realizes its vision by taking full advantage of petascale-and-beyond computers housed at ORNL and by educating a new generation of scientists and engineers well-versed in the application of computational modeling and simulation for solving the most challenging scientific and engineering problems. JICS operates the National Institute for Computational Sciences (NICS), which had the distinction of deploying and managing the Kraken supercomputer. NICS is a leading academic supercomputing center and a major partner in the National Science Foundation's eXtreme Science and Engineering Discovery Environment (XSEDE). In November 2012, JICS sited the Beacon system, which set a record for power efficiency and captured the number one position on the Green500 list of the most energy-efficient computers.