The 2019 Wildfire Risk Report, released in September, found that over 775,000 homes in the western United States are at extreme risk of wildfire damage, with an estimated reconstruction cost for these homes topping $221 billion.
This report evaluates the potential exposure of residential properties to wildfire in a defined region. Not surprisingly, California and Texas top the list of states in the high- and extreme-risk categories. Three metropolitan regions in California account for 42% of the residences at high-to-extreme wildfire risk, a consequence of population density as well as expanding residential development into wildland-urban interface areas.
Data from 1985-2018 show acreage burned increasing relative to the overall number of fires. 2018 was another severe year, with 8.7 million acres burned, the sixth-highest total in more than 100 years of data. The 13 states in this report have the most extensive history of acreage burned and the greatest loss of life and property, and they carry the highest probability of future property loss due to wildfire.
Forecasting wildfire severity
Planners and emergency managers may have a new tool to help forecast how severe a wildfire will be. Using a model built on several key variables, a new machine learning algorithm can help forecast whether a wildfire will be small, medium, or large.
Researchers tested this new modeling technique using Alaska data from 2001 to 2017. The model predicted that 40% of ignitions would lead to large fires; those fires accounted for 75% of the total burned area. The researchers found the model outperformed other algorithms, though it still overpredicted in some cases.
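To make the idea concrete, here is a minimal sketch of what such a fire-size classifier might look like. The feature names (vapor pressure deficit, wind speed, days since rain), weights, and thresholds are all illustrative assumptions, not the researchers' actual variables or model:

```python
# Hypothetical sketch of a three-class fire-size forecast.
# A real model would be trained on historical ignition records;
# the weights and cutoffs below are made-up for illustration.

def classify_fire_size(vapor_pressure_deficit, wind_speed_kmh, days_since_rain):
    """Return 'small', 'medium', or 'large' from a toy risk score."""
    score = (0.5 * vapor_pressure_deficit    # drier air -> higher risk
             + 0.03 * wind_speed_kmh         # wind drives fire spread
             + 0.02 * days_since_rain)       # drought cures fuels
    if score < 1.0:
        return "small"
    if score < 2.0:
        return "medium"
    return "large"

# Example: a dry, windy day long after the last rain
print(classify_fire_size(3.5, 40, 30))  # -> large
```

In practice such a classifier would be fit to labeled ignition data and validated against held-out fire seasons; the overprediction the researchers observed corresponds to the model assigning "large" to ignitions that ultimately stayed small.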
If refined to the point where it is more reliable, this kind of modeling could help in allocating firefighting resources and issuing evacuation warnings. It could also benefit related fields such as ecology and land management.