
In late March 1979, unit two of the Three Mile Island (TMI) nuclear power plant near Harrisburg, Pennsylvania, malfunctioned, resulting in a partial core meltdown of the Metropolitan Edison (MetEd) reactor. The TMI accident was arguably the most serious in the history of the U.S. commercial nuclear power industry. It was certainly the most publicized and the most consequential; in its wake, U.S. nuclear construction stopped for 29 years and counting.

The TMI accident is also an excellent illustration of crisis communication principles. Many of the crisis communication “lessons” of TMI were well established before that particular accident. And plenty of more recent emergencies, near-emergencies, and seeming emergencies demonstrate the same lessons. Nonetheless, it is instructive to view TMI through the lens of crisis communication principles. Here are seven of them.

Pay Attention to Communication

Just about all the experts agree that TMI was not a serious accident. That does not mean it wasn't a serious blunder. Things went wrong that should never go wrong. When analysts ran the accident conditions through a computer simulation of the TMI plant, they got a total core meltdown and a genuine catastrophe; fortunately, the actual accident stopped short of the simulation's outcome. In human health terms, nothing much happened at TMI. Nevertheless, truly awful things almost happened.

TMI was by no means the only near-miss in the history of nuclear power. (The frequency of near-misses and the infrequency of real disasters—Chernobyl being the only one we know about for sure—signify either that nuclear power is an intolerably dangerous technology and we are living on borrowed time or that “defense in depth” works and a miss is as good as a mile.) But TMI was the only near-miss that captivated U.S. public attention for weeks, that is widely misremembered as a public health catastrophe, that is still a potent symbol of nuclear risks, and that as a result has had devastating repercussions for the industry itself. In spite of the Exxon Valdez spill, global warming, peak oil prices, and two Persian Gulf wars, there has not yet been a nuclear renaissance in the United States. Public resistance to such a renaissance may have weakened toward the end of the administration of President George W. Bush, but it remains a significant barrier. The main reason is not Chernobyl—Americans all too easily dismiss other countries' disasters. The main reason is TMI. And what went wrong at TMI—really, really wrong? The communication.

Communication professionals were minor players at TMI. Jack Herbein, the MetEd engineering vice president who managed the accident, was asked why he so consistently ignored the advice of his public relations specialist, Blaine Fabian. (Risk communication had not been invented yet.) He answered, “PR isn't a real field. It's not like engineering. Anyone can do it.” That attitude cost MetEd and the nuclear power industry dearly. And that attitude continues to dominate the nuclear industry, contributing to one communication gaffe after another. Nuclear power proponents keep shooting themselves in the foot for lack of risk communication expertise.

Err on the Alarming Side

In the early hours and days of the TMI accident, nobody knew for sure what was happening. That encouraged MetEd to put the best face on things, to make the most reassuring statements it could make given what was known at the time. So as the news got worse, MetEd had to keep going back to the public and the authorities to say, in effect, “It's worse than we thought.” This violated a cardinal rule of crisis communication: Always err on the alarming side. Make your first communication sufficiently cautious that later communications are likely to take the form, “It's not as bad as we feared,” rather than “It's worse than we thought.” In the 25 years since, countless corporations and government agencies have made the same mistake. Its cost: The source loses all credibility. And because the source is obviously underreacting, everybody else tends to get on the other side of the risk communication seesaw and overreact.

...
