16. The Computer That Ate the Forest Service

Starting Forest Planning magazine was exciting, but after six years of 16-hour days on practically no pay, I was burnt out. Receiving the Neuberger Award was an indication that I was on the right track, but my actual accomplishments had been nearly nil.

The Forest Service had rejected the Oregon State Board of Forestry’s plan, but I had little to do with that. The state legislature had rejected my bill to reform the board and the governor had refused to appoint me to the board. The BLM had responded to my criticisms by building a brick wall around itself. The Forest Service was more open to discussion but hadn’t made any visible changes in response to my reviews of unit plans and timber management plans. On top of this, flunking the exam required to go on to get a Ph.D. in economics was dispiriting.

My friends James and Ellen, grateful that I had persuaded the Forest Service to cancel the timber sale in their watershed because it was in a roadless area, offered to fix up one of their cabins for me to live in while I recuperated. By this time, Miss Vickie, the woman I had met on the San Francisco Zephyr, and I were a couple, and she agreed to move down with me.

It was an idyllic life. The Illinois River was a few steps down the hill from our cabin. In the summer, we could wade across the river and be in the Siskiyou National Forest’s Alder Gulch Research Natural Area. The Kalmiopsis Wilderness was a few miles downstream.

In the fall, I joined James and his sons, Mose and Ben, as we ventured to the upper part of their property in a World War II vintage jeep which James called the Armored Weapons Carrier to gather wood for the winter. Decades before, the former homestead had been shorn of its conifers and whoever did it made no attempt to grow them back, so the forest cover consisted mainly of black oak, live oak, madrone, and other hardwood species. This made perfect firewood, and on a cold winter’s night we could heat the pot-bellied stove in our cabin until it was glowing red.

Once a month, Vickie and I made a trek back to Eugene to put out the magazine. Joe Cone, the editor, did most of the work of designing, editing, and getting the magazine ready for the printer. Our job was to print up the mailing labels and mail it out, plus I wrote a regular monthly column and various articles. My black van had died by then, so at first we hitchhiked and later drove an old Ford Torino that my grandfather had given me once he was too old to drive.

Our months at the Old Shade Place formed some of the happiest times of my life. While there, I read eclectically and happened to come across Byte magazine, which heralded the microcomputer revolution. I had heard of Apple II computers, but they seemed like toys that were inadequate for publishing a magazine. However, a program called VisiCalc seemed like it might be useful for reviewing Forest Service plans.

After doing research, I decided we needed an Altos 8000 computer. While an Apple II came with 4 to 48 kilobytes of RAM and a floppy disk drive, the Altos had a whopping 256 kilobytes of RAM and, in addition to a floppy, a 10-megabyte hard drive. All of this was contained in a giant metal box about two feet long, 20 inches wide, and 8 inches high. Today’s smartphones and laptop computers are approximately a million times more powerful than the Altos, but at the time, the Altos was awesome.

Up to four terminals could be plugged into the computer. The operating system required about 80 kilobytes, leaving 48 kilobytes for three of the terminals but only 40 for the last one. A word processing program called WordStar required 48 kilobytes, but database software required only 40, so four people could use the machine at once provided one was doing database work such as maintaining our mailing list.

The list price for the computer, four terminals, and a daisy wheel printer was more than $16,000, but in the back pages of Byte magazine I found a mail-order dealer who would sell the entire system for under $10,000. I borrowed some money from a relative and we moved back to Eugene. Starting with the June 1981 issue, Forest Planning dispensed with typesetting and we printed everything except the headlines on the daisy wheel printer.

As far as I could tell, we were the second environmental group in the country to buy a microcomputer. At first, some people even asked us if having a microcomputer was compatible with being environmentalists, but soon other groups bought them as well. Mostly they used them for word processing and mailing list management. I used the Altos to bring together the scientific tools at my disposal for a coordinated attack on forest planning.

I indulged in my love for programming by buying Pascal, C, PL/I, and several other languages. I think I settled on C for most of my work. VisiCalc wasn’t available on the operating system used on the Altos, so I wrote my own spreadsheet program, which I called Forestcalc. I also wrote custom programs such as one to calculate stand density index from Forest Service inventory data. A few years later, after it was clear that my bet on microcomputers had paid off in our intellectual battles with forest planning, a magazine labeled our Altos “The Computer That Ate the Forest Service.”
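For readers curious about the arithmetic, stand density index comes from Reineke’s 1933 formula: the number of trees per acre scaled to a reference quadratic mean diameter of ten inches. The sketch below is a minimal modern reconstruction in Python rather than my original C program, and the inventory records in it are invented for illustration.

```python
# A minimal sketch of a stand density index calculation, not my original
# C program. Reineke's (1933) formula: SDI = TPA * (QMD / 10) ** 1.605,
# where TPA is trees per acre and QMD is quadratic mean diameter in inches.
# The plot records below are invented, not actual Forest Service data.

def stand_density_index(trees_per_acre: float, qmd_inches: float) -> float:
    """Reineke's stand density index, referenced to a 10-inch QMD."""
    return trees_per_acre * (qmd_inches / 10.0) ** 1.605

# Hypothetical inventory plots: (plot id, trees per acre, QMD in inches)
plots = [("plot-1", 420, 8.5), ("plot-2", 150, 16.0), ("plot-3", 900, 4.2)]

for plot_id, tpa, qmd in plots:
    print(f"{plot_id}: SDI = {stand_density_index(tpa, qmd):.0f}")
```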

I wasn’t the only one thinking about computers, of course. In the 1970s, the Forest Service wrote its timber management plans with the assistance of a program called Timber Resource Allocation Method, or Timber RAM. The BLM had a program called SIMAC, for Simulated Intensively Managed Allowable Cut. But the Forest Service rejected these two programs for forest planning because they were exclusively focused on timber, and it wanted something that could also account for recreation, wildlife, and other resources.

It selected a program written by Norm Johnson, who had been John Beuter’s graduate student and co-author of Timber for Oregon’s Tomorrow. Known as FORPLAN, it (like Timber RAM and SIMAC) ran on mainframe computers, so to most of the public it was little better than a mysterious black box. But by this time Johnson was a professor at Oregon State University, so OSU students were some of the first to learn how to run the program.

One of those students was Andy Stahl, and after he graduated, the Forest Service hired him to teach forest planners how to run FORPLAN. After doing that, he went to work for a Eugene group called Associated Oregon Loggers. Despite working for the “other side,” one day he showed up on our doorstep with a pile of computer printouts in his hands and said, “I’m going to teach you how to read FORPLAN runs.”

FORPLAN allowed users to enter data including timber yield tables, the effects timber cutting had on other resources, timber and other resource values, and costs. “Running” FORPLAN consisted of asking the computer to maximize something, such as the economic value of the forest, the amount of timber cut, or even the amount of recreation use, subject to constraints such as non-declining flow, the requirement that timber harvests in each decade be at least as large as in the decade before.
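To give a feel for what a run involved, here is a toy sketch of the same kind of optimization in Python, using scipy’s linear programming routine rather than anything resembling actual FORPLAN code. The two analysis areas, yields, and prices are invented; the last constraint is non-declining flow, requiring the second decade’s volume to be at least the first’s.

```python
# A toy FORPLAN-style run: a hedged sketch, not actual FORPLAN.
# Decision variables are acres harvested: x = [A_dec1, A_dec2, B_dec1, B_dec2].
# Maximize net revenue subject to acreage limits and non-declining flow.
# All yields, prices, and acreages are invented for illustration.
from scipy.optimize import linprog

yields = {"A": 5.0, "B": 2.0}   # mbf per acre for each analysis area
value = {1: 50.0, 2: 30.0}      # net $ per mbf by decade (decade 2 discounted)

# linprog minimizes, so negate per-acre net values to maximize them.
c = [-yields["A"] * value[1], -yields["A"] * value[2],
     -yields["B"] * value[1], -yields["B"] * value[2]]

A_ub = [
    [1, 1, 0, 0],   # area A has only 100 acres
    [0, 0, 1, 1],   # area B has only 100 acres
    # Non-declining flow: decade 1 volume minus decade 2 volume <= 0.
    [yields["A"], -yields["A"], yields["B"], -yields["B"]],
]
b_ub = [100, 100, 0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4)
print("acres harvested [A1, A2, B1, B2]:", res.x.round(1))
print("net revenue: $", round(-res.fun, 2))
```

A real FORPLAN model had far more variables than this toy, but the structure was the same: an objective, a mass of coefficients, and a list of constraints, which is why the coefficients deserved so much scrutiny.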

While this seems powerful, FORPLAN had very real limitations. First, the forest could only be divided into a certain number of units; I believe the initial limit was 300. Since the typical 1.5-million-acre national forest might have a half dozen or more ecological types, each with several different productivity classes and many different age classes, planners quickly bumped up against this limit.

Second, while FORPLAN considered resources other than timber, its handling of those resources was much less sophisticated than its handling of timber. Basically, every other resource was treated as an output of timber management, so there was little provision for doing things to enhance wildlife habitat or recreation quality that didn’t involve building roads and cutting trees. In addition, no more than ten other resources could be included, which meant many types of recreation and forms of wildlife were left out.

Third, although the data entered into FORPLAN was based on forest inventories, those inventories were based on sample plots. This meant the Forest Service didn’t actually have maps of all the ecosystems and other types that it was entering into FORPLAN. FORPLAN might say that so many acres of a particular type should be managed for timber, but the Forest Service had no idea where those acres were located. So running FORPLAN was more of a theoretical exercise than an actual plan for the forest. In addition, many of the inventories were old because the Forest Service spent money on forest planning that would otherwise have gone to updating the inventories.

Finally, there was the whole issue of chaos and complexity. I didn’t really understand this at the time, but FORPLAN was based on the assumption that linear relationships between resources could be accurately defined and predicted. Chaos theory made people realize that such predictions were, in fact, impossible. Thus, the basic assumption that we shared with the Forest Service, that forest planning was doable (even if we didn’t believe the Forest Service could do it), was in fact wrong. It might even be said that the chaos of unpredictable events and effects — or what Nassim Taleb calls black swans — is what killed forest planning.

Thanks to Andy Stahl teaching me FORPLAN, I was one of those unpredictable effects. In retrospect, the Forest Service made a mistake in requiring every national forest to use FORPLAN, as it made them more vulnerable to someone like me who could analyze how FORPLAN worked and expose the biases built into each individual FORPLAN model.

The first forest plans, when they came out, were about 400 to 500 pages long, printed on 8-1/2″ x 11″ paper and perfect bound into a book about two inches thick. The plans were accompanied by a draft environmental impact statement that was about the same size. When the final plans came out, they added a third document of about the same size that reprinted all the public comments with the Forest Service’s responses.

When I reviewed unit plans, such as the Twisp-Winthrop-Conconully, I worked from my office, which limited me to the documents the Forest Service had included in the plans. But when people asked me to review forest plans, I insisted on visiting the forest supervisors’ offices.

Except for a table in the environmental impact statement that compared alternatives, I ignored everything in the thousands of pages in the plans and EISs. Instead, I focused on the background documents: how the timber yield tables were formed, how prices and costs were calculated, what relationships were presumed between timber and other resources, and so forth. I also looked at the FORPLAN runs themselves to see what constraints were used and what the runs proposed to do.

Most FORPLAN runs aimed to maximize the economic value of the forest. But Tom Barlow had shown that most national forests lost money on timber sales. The question I frequently had to answer was: how did each forest planning team persuade FORPLAN to cut timber when doing so would reduce the economic value of the forest?

My review of the Okanogan Forest plan quickly revealed that the timber yield tables they used were taken straight from the Forest Service’s Technical Bulletin 201, The Yield of Douglas-Fir in the Pacific Northwest. Although the Okanogan Forest was in the Pacific Northwest, it was on the east side of the Cascade Mountains. Page 5 of the 1930 bulletin showed that 100 percent of the plots used to develop the yield tables were on the west side of the mountains. The mountains form a rain shadow, making the Okanogan far more arid than the areas used to write the yield tables.

To me, this was a rookie mistake that anyone should have been able to catch, but no one in the Northwest environmental movement had been thinking about yield tables for as long as I had. I entered the Forest Service’s inventory data into my custom-written computer program for stand density index and calculated that the yield tables overestimated productivity by a third. Moreover, 30 percent of the lands that the Okanogan considered suitable for timber management were in fact not capable of producing the minimum of 20 cubic feet of wood per acre per year required by the Forest Service.
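The suitability screen itself was simple arithmetic: an acre qualified only if it could grow at least 20 cubic feet of wood per year. Below is a hedged sketch of that kind of check; the strata and the correction factor standing in for my stand density index results are invented.

```python
# A sketch of the suitability screen: land counts as suitable for timber
# only if it can grow at least 20 cubic feet per acre per year. The strata
# and correction factor below are invented; my actual check used Forest
# Service inventory data run through my stand density index program.
MIN_MAI = 20.0  # cubic feet per acre per year, the Forest Service minimum

# (stratum, acres, yield-table growth, SDI-based correction factor)
strata = [("ponderosa low site", 80_000, 24.0, 0.67),
          ("ponderosa mid site", 120_000, 38.0, 0.67),
          ("mixed conifer", 60_000, 55.0, 0.75)]

unsuitable = 0
for name, acres, table_growth, correction in strata:
    actual = table_growth * correction
    if actual < MIN_MAI:
        unsuitable += acres
        print(f"{name}: corrected growth {actual:.1f} cf/ac/yr -> unsuitable")

print(f"{unsuitable:,} acres fail the {MIN_MAI:.0f}-cubic-foot minimum")
```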

I also found documents indicating that the Forest Service knew about the problem when it wrote the plan. Okanogan planners had ignored a 1981 memo from the regional office requiring them to take stand density index into consideration. Frederick Hall himself, who first developed stand density index tools for the Forest Service, told me that he had made measurements in the Okanogan confirming my finding that standard yield tables would overestimate the productivity of the forest. Planners ignored his conclusions as well.

The next plan I reviewed was for the Klamath National Forest in northwestern California. As in western Oregon forests, timber in the Klamath Forest was pretty valuable so there was little question about below-cost timber sales. However, the yield tables again were highly dubious. Foresters in the California region of the Forest Service had developed a program for writing yield tables called RAM-PREP, because it was first used in the days when Timber RAM was the computer model used in forest planning.

The yield tables that came out of RAM-PREP were extremely crude and didn’t look like real yield tables at all. Although my computer software wasn’t designed for making charts, I wrote a program that would make charts on a daisy wheel printer using closely spaced periods. A real yield curve looked like a large parabola, with rapid growth in the early years of a forest, tapering off to a peak, then declining slowly at first and more rapidly later. The yield curves produced by RAM-PREP instead went up in a straight line, then after reaching a peak went down in a straight line.
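The charting trick is easy to reconstruct. Here is a rough modern equivalent in Python that prints a curve as a filled field of closely spaced periods; the two curves are invented stand-ins, a smooth rise-and-decline shape for a real yield table and a triangle for RAM-PREP.

```python
# A rough reconstruction of the daisy-wheel charting trick: draw curves
# as closely spaced periods. Both curves are invented stand-ins, chosen
# only to mimic the shapes described in the text.
import math

def chart(curve, height=15, mark="."):
    """Print a filled text chart, one column per curve value."""
    peak = max(curve)
    for row in range(height, 0, -1):
        threshold = peak * row / height
        print("".join(mark if v >= threshold else " " for v in curve))

ages = range(0, 280, 4)
real = [t * t * math.exp(-t / 40) for t in ages]  # smooth rise, peak, decline
ram = [min(t, 280 - t) for t in ages]             # RAM-PREP's triangle

print("A real yield curve:")
chart(real)
print("A RAM-PREP yield curve:")
chart(ram)
```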

I also discovered other anomalies with the yield tables. RAM-PREP predicted that poorly stocked old-growth forests could double in volume in as little as 20 years. Close scrutiny revealed that FORPLAN responded to this prediction by deferring harvest of these stands until they had increased their volume. If these unlikely yield predictions proved wrong, the Klamath Forest would run out of timber in a few decades.

Since I later ran into RAM-PREP in every other California forest plan I reviewed, I eventually wrote a lengthy critique of the yield tables and made a presentation at the regional forester’s office in San Francisco. The foresters who wrote the program refused to admit that it was fallible.

On my next forest plan review, however, I discovered a “smoking gun” that effectively made my career as an antiplanner. In late 1983, I was bicycling through downtown Eugene and was hit by a pickup truck that ran a red light. This put me in the hospital with a concussion, a broken collarbone, and severely strained leg muscles.

Nevertheless, a few days later I took a train from Eugene to Albuquerque, supported by a cane and Forest Planning’s then-editor, Diane Weaver, to review the final forest plan for the Santa Fe National Forest. My client, Ted Davis, was an emergency room doctor who was concerned about the impacts of timber management on pueblo ruins and other cultural resources in the Santa Fe Forest.

While reviewing background planning documents, I discovered a memo that said planners had entered gross timber yields in the yield tables but used prices for net timber yields. There are several reasons why gross timber is more than net timber, but the upshot was that the yield tables were 25 percent too high for the prices that were used. Rather than fix the data, the forest planning team’s solution was simply to reduce the timber harvest outputs calculated by FORPLAN by 20 percent (arithmetically consistent, since cutting a number that is 25 percent too high by 20 percent restores the original: 1.25 × 0.8 = 1).

I remember telling Diane that there was something wrong with that, but I would have to think about it for a while to figure out what it was. After a couple of hours of thought, it hit me: planners ran FORPLAN to maximize the economic value of the forest. If FORPLAN thought that timber was worth 25 percent more than it really was, it would assign more land to timber management than was economically sensible. The result would be timber harvests that were too high and environmental impacts that were greater than necessary.
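A stripped-down numerical sketch, with invented acres, yields, and costs, shows the effect: on marginal land, a 25 percent overstatement of timber value flips the sign of net returns, and a value-maximizing model dutifully assigns those acres to timber.

```python
# A hedged sketch of the Santa Fe problem, with invented numbers. FORPLAN
# saw gross yields priced as if they were net, overstating timber value by
# 25 percent. On marginal acres, that flips the sign of net returns, so a
# value-maximizing model assigns them to timber anyway.

def decision(net_value):
    return "cut" if net_value > 0 else "leave"

# (land class, net mbf per acre, $ per mbf, management cost per acre)
land_classes = [("good pine site", 6.0, 40.0, 150.0),
                ("marginal site", 3.0, 40.0, 130.0),
                ("poor site", 1.5, 40.0, 100.0)]

for name, net_yield, price, cost in land_classes:
    true_value = net_yield * price - cost
    model_value = net_yield * 1.25 * price - cost  # gross yield, net price
    print(f"{name}: truly ${true_value:+.0f} ({decision(true_value)}), "
          f"model sees ${model_value:+.0f} ({decision(model_value)})")
```

Run it and the marginal site comes out “leave” at true values but “cut” at inflated ones, which is exactly the kind of acre that gets swept into a timber base this way.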

I announced these results at a public meeting, and the head of the Santa Fe planning team responded, “The model isn’t sensitive to that.” “It should be,” I replied. “Timber values on the Santa Fe are marginal as it is, so 25 percent can make the difference between valuable and worthless on a lot of acres.”

Since this was a final plan, my findings went into an affidavit that Ted Davis submitted with his appeal of the plan. A few months later, we received word: the Forest Service had withdrawn the Santa Fe plan and ordered the forest to start over from scratch.

At the next Western Forest Economists meeting, someone introduced me to Buddy Stewart, the economist for the Forest Service’s Southwest Region, which covered Arizona and New Mexico. His eyes lit up when he heard my name.

“I read your affidavit, and thought about it for a couple of hours,” he said, “and then it hit me: we lied. We shouldn’t lie, so I told the regional forester we had to withdraw the plan.”

When the revised draft plan came out in 1986, it reduced the suitable timber base by 130,000 acres — almost 25 percent. Ted Davis asked me to review that one, and as a result the final plan reduced the suitable timber base even more. Clearly, even if the Forest Service’s original model wasn’t sensitive to the error in the yield tables, the plan itself was.

My review of the first final Santa Fe plan made my reputation, and soon I was in demand nationwide. The next several years became a series of shuttles from forest to forest, usually carrying a computer with me, entering massive amounts of data, going home to write up the results, and then going on to the next forest.
