Risk Communication and The 95% Rule
Over the last 15 years we have seen defining studies published examining the social and psychological influences on risk communication, as well as highly relevant studies on social trust, the social amplification of risk framework, and the treatment of risk in the mass media. With such a burgeoning and dynamic research community exploring such a diverse range of topics, it is easy for risk communication theories and practitioner approaches to become somewhat fragmented.
Whilst all these aspects of risk communication hold great interest for those familiar with the field, our challenge is to bring clarity and consistency to the theory and methods for those on the front lines in strategic communication and related fields, including non-communication professionals.
The 95% Rule
One of the first principles we work with is the “95% Rule”, which was first developed in the early 1990s by Dr Vincent Covello. It holds true today:
Research indicates that facts about risk appear to play little or no role in determining public perceptions and concerns about risks.
In fact, this research suggests that facts contribute only about 5% towards influencing how risk is perceived by audiences. The remaining 95% of any risk dialogue concerns the social, moral, ethical and personal aspects of risk.
This rule eludes many risk managers, communicators and technology advocates. Its implications are far-reaching, and whilst on the surface it appears quite straightforward, the rule requires some explanation.
The technical facts about risk, which we refer to as the hazard and which can be measured in terms of probability and magnitude, are important factors in any risk discussion. They do, however, contribute only about 5% to a dialogue on risk in terms of their weight of influence on audiences’ risk perceptions and concerns.
Let’s be clear: risk managers must get this 5%, the technical facts, right. They must present them to audiences clearly and concisely, and must acknowledge uncertainties and unknowns in a plausible, concrete and trustworthy fashion. There are important techniques for achieving this, including the appropriate use of risk statistics, risk comparisons, narratives and visuals. Technical facts must be presented in a way that acknowledges the values, beliefs and cultural factors relevant to the audience.
Despite the challenges in communicating facts about risk effectively, the technical data is not what will shape public perceptions and judgments about the risk. Contrary to how most risk managers regard this type of information, our rule implies it represents only about 5% of the risk communication process. This is particularly the case for controversial risks where the technical data is ‘soft’ and far from conclusive, or wrapped up in considerable expert disagreement.
The implications of the 95% rule are profound, and yet, the rule remains a risk communication practitioner ‘secret’ that has not permeated the thinking or delivery of many science and technology developers or risk managers.
Chasing the 5%—Example
A recent example to illustrate this point: we were observers of discussions to appraise communication strategies surrounding a controversial technology. Deeply entrenched positions on both sides of the debate have interfered with progress for many years. A diverse group of organizational representatives came together, all passionate about repositioning the technology and changing the way they communicated to their audiences. Despite a great deal of discussion, including some passionate calls for radically different approaches, two conclusions were ultimately reached:
- We must do more to make the public understand the technology and product safety;
- We must respond to our opponents’ claims more vociferously and go on the offensive to expose their true motivations.
Unfortunately, these ‘initiatives’ will do little to reposition thought processes or considerations about a highly controversial technology. Our first suggestion for refocusing this discussion would have been to consider the 95% rule. Efforts to build understanding, or to reiterate the technical aspects of risk (often, in our experience, via complicated data dumps), will add nothing to the debate and will probably only further underline the differences in values and deficits in empathy that plague efforts to bring about a convergence of risk positions. Such disconnects are common and point to deep misunderstandings of how risk information is gathered, processed and considered by publics.
TBFC—A good part of the 95%
The 95% rule demands we look far beyond the technical safety of the product. As a starting point we would advocate exploration of the four primary risk factors that make up a fair proportion of the outstanding 95% of the risk dialogue:
- Trust: Are those imposing the risk trustworthy? Do they have a track record demonstrating ability, benevolence and integrity? What are their motives and intent, and are they aligned with those bearing the risks?
- Benefits: Are the benefits of the technology accruing to those that are taking the risks? Are benefits personally relevant to those being exposed to the risk?
- Fairness: Are those imposing the risk reaping all the benefits? Are those exposed to the risks engaged in dialogues, and is their feedback appreciated?
- Control: Do those being exposed have a choice to avoid the risk, or is exposure involuntary?
As we have discussed elsewhere, the TBFC concept (Trust, Benefits, Fairness, Control) is critical in risk discussions, and the questions posed to risk managers in addressing TBFC issues are varied and generally very challenging. Most risk management processes underestimate the importance of these risk factors or never address them at all. This threatens the efficacy of the entire risk management framework and leads to public suspicion and lack of trust.
TBFC issues usually go far beyond communication alone—actions in terms of processes, policies, strategies and structures need to be addressed. Societal demands on the nature of modern risk discourse dictate that significant resources be allocated to meet the key challenges of the 95% rule.
Market and regulatory acceptance of ‘risky’ technologies will not happen if this rule is not given the attention it demands.