Today is National Eclipse Day, and thanks to the Milli, Nena Spring, and Whitewater fires, I’m likely to be viewing it through a lens of smoke. So this has me thinking about wildfires and wondering if it is true, as some claim, that environmentalists are ultimately responsible for the increase in acres burned in the last decade or so.
Partly due to pressure from environmentalists, federal land timber sales declined by about 80 percent in the 1990s. Meanwhile, the ten-year rolling average of acres burned grew from about 3 million acres in the 1980s and 1990s to 6.5 million acres in the 2000s and (so far) 2010s. Is this a coincidence, or did the decline in timber cutting lead to the growth in wildfires?
Those who blame environmentalists argue that timber cutting and related activities allowed forest managers to minimize fuel loads in the forests. When those activities stopped, the fuel loads grew and fires became hotter, larger, and harder to control.
After the Cerro Grande fire burned more than 400 homes in Los Alamos in 2000, the Forest Service itself argued that fuels had built up due to a century of fire suppression, thus justifying increased spending on firefighting and vegetation management. But this claim was undercut by a 2002 Forest Service publication, which found that only 26 percent of national forest lands had seen fire regimes “significantly altered from their historical range,” and only 60 percent of those lands were of the type susceptible to large fuel build-ups (denoted in the report as historical fire regime 1). Thus, the Forest Service could blame itself for fire problems on only about 15 percent of national forest lands.
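The 15 percent figure is just the product of those two shares. As a quick back-of-the-envelope check (the percentages come from the 2002 report; the arithmetic is mine):

```python
# Share of national forest lands with "significantly altered" fire regimes
altered = 0.26
# Share of those altered lands in the fuel-buildup-prone type (regime 1)
buildup_prone = 0.60

# Portion of national forest lands where fire suppression could
# plausibly be blamed for fuel build-ups
blameable = altered * buildup_prone
print(round(blameable * 100, 1))  # → 15.6
```

Rounding down gives the 15 percent cited above.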
One reason for this is that federal fire protection efforts were never that successful in the first place. In addition, forests grow slowly, so it takes a long time for fuels to build up enough to become a problem. The relatively few years since timber sales declined in the 1990s are not enough time to create severe problems in forest health.
We can check this by taking a look at British Columbia forests, most of which are owned by the government. As shown in a table from Canada’s National Forestry Database, timber harvests there did not decline; in fact, they grew somewhat in the early 2000s, probably in response to the reduction in timber cutting in U.S. national forests.
If timber cutting helped prevent forest fires, British Columbia forests should not have seen the large increase in acres burned experienced in the United States. In fact, the exact opposite is the case, as shown in this table in the National Forestry Database. Whereas an average of about 25,000 hectares per year burned between 1990 and 2002, since 2003 the average has been 159,000 hectares, an increase of more than 500 percent. That’s a much larger increase than seen in U.S. forests, suggesting that something other than timber cutting is responsible.
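The percentage increase follows directly from the two averages in the table (the arithmetic here is a sketch, not part of the database itself):

```python
before = 25_000   # average hectares burned per year, 1990-2002
after = 159_000   # average hectares burned per year, 2003 onward

# Percentage increase over the earlier period's average
increase = (after - before) / before * 100
print(round(increase))  # → 536
```

That works out to roughly a 536 percent increase, or more than a five-fold jump in annual acres burned.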
The Antiplanner’s own analysis found that changes in the share of the United States suffering severe to extreme summer (July-September) drought are sufficient to explain most of the annual variation in acres burned. Updating that analysis through 2016, the correlation from 1960 onward is 0.57. Moreover, to the extent that the correlation isn’t perfect, the drought data would have predicted that more acres would burn in recent years than actually did. While global warming could be increasing drought, the United States actually suffered much worse droughts in the 1930s than in recent years, and droughts were also severe in the early 1950s and early 1960s.
On the whole, the nation’s forests, especially in the West, really aren’t that much different today than they were a hundred years ago. Instead of looking to changes in forest management to explain variations in acres burned, I would attribute those variations to drought and to the ability of firefighters to attenuate its worst effects.