Paul Holmes 05 Sep 1992 // 11:00PM GMT
At the height of the scare over the use of the chemical Alar on the nation's apple crop, schools took apples off their menus; airlines stopped serving a fruit that is as American as, well, apple pie; housewives all over the country fled the supermarket aisles in terror, lest they and their loved ones be contaminated. Yet no one ever died from eating Alar-treated apples, and if the majority of scientific opinion was to be believed, no one ever would.
On the other hand, tens of thousands of people are killed in automobile accidents every year. But there is no mass movement to abandon the automobile; few are even recalled; and until recently research by automobile manufacturers indicated strongly that most people would not even pay a few extra bucks for safety devices such as air bags.
Clearly, by any totally objective standard, automobiles are more dangerous than apples. So why did an infinitesimal health risk in the case of apples provoke such hysteria, while the dangers of automobile travel leave most people blissfully unconcerned? It's a question that goes to the heart of risk communication.
Clearly, there is a massive misperception of risk among the public at large. Jim Lindheim, vice-chairman of Burson-Marsteller, points out: "Polls show the dominant perception of most citizens in the world's industrialized nations is that they face greater risks today than in the past. The real facts are that life spans are longer, that most life-threatening diseases are occurring less frequently, and that disasters arise far more often from nature than from man."
Public perceptions about risk, Lindheim says, do not follow logical patterns, nor are facts always successful at easing concern. Dr Peter Sandman, director of the environmental communication research program at Rutgers University and one of the country's leading risk communication experts, argues that the problem is not the public's misperception of risk, but rather its feeling of outrage, and that the traditional public relations solution of greater public education is simply not enough.
"In most risk controversies, people do misperceive the hazard; they also feel outraged," Sandman says. "The question is which is cause and which is effect. If people are outraged because they misperceive the hazard, then the solution is to explain the hazard better. On the other hand, if people misperceive the hazard because they are outraged, the solution is to find a way to reduce outrage."
Sandman has devised what he calls a "thought experiment" to illustrate this argument. He invites risk communicators to imagine a roomful of citizens listening to an expert on pesticide risks who has evidence that broccoli is more carcinogenic than dioxin. It's a tough sell, but the audience is calm, there's no immediate crisis, and the speaker is persuasive and has a huge bank of data to support his thesis. Over the course of an hour or so he succeeds in convincing people that, in fact, broccoli is more carcinogenic than dioxin. A misperception has been corrected through public education.
A second speaker follows. "Now we know broccoli is more carcinogenic than dioxin," he inquires, "which one do we want the EPA to regulate, the broccoli or the dioxin?" Sandman suggests, and most PR professionals will agree, that the audience would respond in favor of regulating dioxin, rather than broccoli. "We had a hazard misperception and we corrected it," Sandman points out. "And the policy preferences remained unchanged. That tells us that the hazard misperception was not our problem in the first place. As long as dioxin generates a lot of outrage and broccoli very little, educating people about their relative hazard is unlikely to affect the public's concerns, fears or policy choices."
Sandman lists several factors that contribute to public outrage: if the public is coerced into accepting a product rather than embracing it voluntarily; if a product is created by industry rather than occurring naturally; if a product is exotic rather than familiar; if it is memorable; if it is dreaded; if its potential impact is catastrophic rather than chronic; if it is controlled by others rather than by individuals; if it is unfair; if it is morally relevant; if it comes from untrustworthy sources; if the process that produces it is unresponsive.
Sandman's work is echoed to a certain extent by that of psychologist Paul Slovic, who says the two key elements in how the public views risk are control (the public will almost always underestimate risks when they feel they are in charge, and overestimate them when someone else is responsible) and the ability to understand the risk with clarity. Corporate communicators, says Jim Lindheim, cannot assume that facts, no matter how compelling, will carry the day.
The emotional aspects of risk communication are powerful and often overwhelming. "To say that workers are safer while they are in a company's plant than they are at home may be factually true," he says, "but it is not likely to convince a worker's spouse. She has too great an emotional investment in the idea that a home is a safe place, particularly compared to a factory."

"If you want an example, you can look at the way the acid rain issue was argued in the '80s," says Tom Buckmaster, svp of Fleishman-Hillard in Washington, D.C. "A great deal of talent and resources was invested in arguing the science, in suggesting that the whole phenomenon of acid rain was uncertain. All of this was done in the face of a public that believed the phenomenon existed, and the result was that the public perceived businesses and sectors of industry as being insensitive to their concerns."
In this case, Buckmaster says, the dissonance between communication and beliefs already held by the target audience helped create a dynamic that contributed to the crisis. There is considerable evidence that such communication cannot persuade or change minds; it can only reinforce opinions that are already held. Similarly, while most people are prepared to take medicines produced by biotechnology into their bodies, there is considerable resistance to the idea of bioengineered milk and food. In part, this is due to the perceived benefit of biotech medicines and the perception that there is no pressing need for bioengineered foods; in part, it is because people choose biotech medicines freely, while there is a suspicion that biotech food products could be slipped to us without our knowledge and consent. In the end, it comes back to the time-honored Bernaysian concept of public relations as manufacturing consent. Buckmaster recounts an analogy used by Sandman: "If you kidnap me at night, drive me to the top of a tall, snow-covered mountain, blindfold me and tie two pieces of wood to my feet, then push me down a hill, it's called murder," he says. "If I go with you of my own free will, it's called skiing. Our job is to get people to go along of their own free will, with all the relevant information."
Tom Preston, president of Kentucky public affairs firm The Preston Group, has worked with coal mining interests in the region to communicate with communities on waste disposal issues. He says that in all environmental communications there will always be a small percentage of people with whom it is impossible to communicate—and that these people tend to be the most vocal—but that there are ways of reaching most members of the community.
"Unfortunately, the truth means nothing to a lot of these people, because they are not accountable to anyone. Corporations, on the other hand, have to account for everything they say."
He says individuals can be made to understand that they are responsible for waste that is created by industries providing goods and services they need, particularly if the value of the industry's contribution to society can be explained to them and if there are apparent economic advantages—employment, taxes, community involvement—to having the industry in their communities. "The client company has to constantly monitor its impact on the environment, on the water supply and on ambient air, and be absolutely open with the public about this information," Preston says. "We won't work with a company that does not commit to deal honestly with the community, and frankly such companies find it difficult to operate in today's environment."
Jim Lukaszewski argues that no amount of data will be persuasive in an argument about risk because "people don't care about facts and data, especially if they don't support their emotional positions." People want to know how a decision affects their health and safety, their property values, their pride in their community, their peace of mind.
"Sadly, for the engineer and the technician, emotional concepts that don't deal with facts, data, science or proof are nearly impossible to comprehend," Lukaszewski says. "The lesson is that if you want to communicate effectively, you can't hide behind performance measures and factoids like 'It's only one part per quadrillion' or 'There's more cyanide in a flower.' You have to deal with real gut-level stuff."
It would be easy to assume that the emotional nature of the public's response to such issues (the outrage Sandman describes) makes the role of "experts" less important: that if the public is responding on a largely emotional level, all the good science and hard data in the world will not be persuasive. Fleishman-Hillard's Buckmaster says he believes that to be an oversimplification.
"If anything, it makes the role of third party scientific experts an even more critical part of the equation," he says. "The difference is that you have to have people who are capable of establishing linkages to key publics by using basic common sense and language that people can connect to."
What it does mean is that it is no longer enough to roll out an expert with impressive credentials and then ask the public to "trust us." That mistake has been made by organizations as diverse as the nuclear power industry and the National Endowment for the Arts, and it has contributed to a general lack of public confidence in experts.
"The average person doesn't want to hear 'trust me'; he wants to hear 'hold me accountable,'" Buckmaster says. "You have to be prepared for that accountability, and you have to give people the tools to hold you accountable. Corporations need to draw up codes of conduct and operating principles, and be open enough to let people know what those principles are."
Buckmaster, like many other experts, cites the chemical industry's Responsible Care program as an example. Over the past few years, chemical companies have been inviting the public to hold them more accountable, holding open days so people can come in and see how plants operate and making sure local communities are well-informed, so they understand that the companies involved are working to improve standards.
Jim Lindheim suggests the following rules for risk communication:
communicators should not expect statistics to reduce public anxiety;
communicators must recognize and respond to the emotional aspects of risk perception;
communicators must build a base of awareness of the benefits of a company's presence and products, separately from risk communications;
communicators should draw on third-party support.
"In the final analysis," he says, "the best risk communication comes not from words but from deeds, from the actions people see a corporation take to add value for their customers and communities."