Assessment of Risk

All technology should be assumed guilty until proven innocent.

David Brower

Introduction

Technical and economic approaches assume that risk assessments represent rational responses to the objective facts of a technology. As noted earlier, these assumptions cannot be fully met in practice; the approaches nevertheless assume that assessments are as rational as can be achieved, relying upon either expected values or utilities. The psychological approach notes that subjective, or emotional, elements affect risk assessments. That is, some aspects of technology seem more threatening than others, depending upon human responses to them.

The Psychological Approach

Characteristics

Risk is measured as subjective utility. Subjective utility takes into account not only assessments of technical hazard (i.e., estimated physical harm) but also outrage (i.e., emotional reactions to estimates of technical hazard). The approach focuses on personal preferences rather than "objectively defined" probabilities, and it attempts not to define hazard but to explain why people are willing to accept some risks and not others. Outrage factors are described in more detail in a later section of Soc 415.
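
In shorthand, and anticipating the formula quoted later in this section, the relationship can be written as

    Risk = Hazard + Outrage

where Hazard is the technical assessment of physical harm and Outrage is the emotional reaction to that assessment.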

Assumptions

The psychological approach assumes that subjective utilities are adequately recognized by the actor and that these assessments are rationally applied to intentions and behavior. Hence, it assumes that behavior follows from perceptions, even if perceptions are not necessarily based upon technical assessments of hazard but reflect instead emotional reactions to estimated hazard.

Strengths and Limitations

This approach recognizes that emotions guide risk assessments as much as do rational decisions about probability of harm and the balance of utilitarian costs and benefits. It brings people and their emotions into the risk assessment process. It is difficult, however, to translate emotional reactions into public risk policy. Should public policy be altered because people fear a technology that technical experts deem to be low risk?

The theoretical model depicted in the accompanying diagram provides an example of the psychological approach to understanding public responses to technology.

Examples of Use

Understanding public outrage over a technology can help in the design of risk communication strategies and message content. Psychological research has identified twelve key attributes of technologies that affect emotional responses to them:
  1. Voluntary/Coerced. Risks we take upon ourselves create less outrage than those forced upon us.
  2. Natural/Industrial. Natural risks are viewed with less emotional response than risks created by human actions.
  3. Familiar/Unfamiliar. Things familiar are considered less risky than the unfamiliar.
  4. Memorable/Not Memorable. Linking technologies to highly memorable tragedies makes them seem more risky.
  5. Not Dreaded/Dreaded. Linking technologies to dreaded events (e.g., cancer) makes them seem more risky.
  6. Chronic/Catastrophic. Risks we face every day create less outrage than catastrophic events.
  7. Knowable/Unknowable. People tend to fear the unknown. Opponents of a new technology can always use this outrage factor to their advantage because, by their very nature, new technologies involve uncertainties.
  8. Control/Not in Control. We feel safer when we have the ability to regulate the use of a technology.
  9. Fair/Unfair. People become more outraged about a technology if they think they must bear more of its costs, or receive fewer of its benefits, than others do.
  10. Morally Irrelevant/Relevant. Linking the use of a technology with immoral motives creates outrage. Linking it with moral standards lessens outrage.
  11. Trustworthy/Untrustworthy. Trust in the experts who develop or endorse a new technology might be the most important factor influencing outrage.
  12. Responsive/Unresponsive. Outrage is reduced when persons/organizations responsible for the development or regulation of a new technology seem responsive to public concerns.
Change agents can use these determinants of public outrage to design risk communication strategies and message content, as the sketch below illustrates. In the section on Risk Communication we will learn many important guidelines for communicating with an outraged public about complex and controversial technology.
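
To make these determinants concrete, the short Python sketch below audits a technology against some of the twelve attributes. The attribute labels come from the list above, but the 0-to-1 rating scale, the example ratings for food irradiation, and the outrage_profile function are hypothetical conveniences invented for this illustration, not instruments from the risk-perception literature.

    # Hypothetical outrage audit. Rate a technology on each attribute
    # from 0.0 (the low-outrage pole, e.g., voluntary, natural, familiar)
    # to 1.0 (the high-outrage pole, e.g., coerced, industrial, unfamiliar).
    # The scale and all ratings below are invented for illustration.

    def outrage_profile(ratings):
        """Sort attributes from most to least outrage-provoking, so a
        communicator knows which public concerns to address first."""
        return sorted(ratings.items(), key=lambda item: item[1], reverse=True)

    # Invented ratings for food irradiation on five of the twelve attributes.
    food_irradiation = {
        "voluntary/coerced": 0.7,         # consumers rarely choose irradiation
        "natural/industrial": 0.9,        # plainly an industrial process
        "familiar/unfamiliar": 0.8,       # most consumers know little about it
        "not dreaded/dreaded": 0.8,       # "radiation" evokes dreaded illnesses
        "trustworthy/untrustworthy": 0.4, # depends on who endorses the process
    }

    for attribute, score in outrage_profile(food_irradiation):
        print(f"{attribute}: {score:.1f}")

Sorted from most to least outrage-provoking, the profile flags the industrial and unfamiliar character of irradiation as the attributes most likely to provoke outrage, which is the kind of diagnosis a risk communication strategy would be built around.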


Application in Context

    What are psychological perspectives on food irradiation?

    Opinion polls generally show that most consumers are concerned about food safety and yet consider irradiation to be a safe process. Few opinion polls, however, inform respondents of the concerns raised by irradiation's opponents. From the psychological perspective, one would expect that hearing negative information would yield unfavorable opinions of food irradiation, and the results of polls that include statements from opponents do show this shift toward negative evaluations.

Example of the Psychological Approach

Paul Slovic, "Trust, Emotion, Sex, Politics, and Science: Surveying the Risk-Assessment Battlefield," Risk Analysis 19(4), 1999.

Slovic asserts that risk management has become increasingly politicized and contentious, and he worries that controversy and conflict may have become too pervasive. The quality of society may erode, he suggests, when public discourse about technology policy becomes too contentious.

The irony, he states, is that at the same time our nation has expended considerable resources to make life safer, many persons have become more, not less, concerned about risk. To understand this phenomenon, Slovic describes the nature of risk assessment and its relationship to public perceptions. He distinguishes between hazard, the technical assessment of potential physical harm, and risk, the socially constructed perception of that harm. [Earlier, we said that Risk = Hazard (technical assessment) + Outrage (emotional assessment). Here, Slovic is saying that Risk is the socially constructed sum of hazard and public perceptions. Thus the two perspectives are very similar.] Slovic states that assessments of danger, both by technicians and by the public, are influenced by political, economic, cultural, and other social factors. Importantly, it is definitions of risk that shape risk policy: defining risk is an exercise in power.

Thus, risk controversies are not about science versus misguided public perceptions of science, wherein an unwashed public needs to be educated about "real" risks. Rather, risk controversies are struggles over who will define risk. The public cannot be dismissed as irrational simply because their judgments about risk are influenced by emotion; the viewpoints of scientists also are influenced by emotion, politics, economics, and so forth. Technology policy discourse, therefore, is not about who is correct in assessing danger, but about whose assumptions regarding political, social, and economic conditions win out on the risk assessment battlefield. Thus, danger is real, but risk is socially constructed. Scientific literacy and public education are important, therefore, but they are not central to risk controversies.

Slovic raises concerns about how disparities between "real" and "perceived" risk might engender public discourse that is, itself, a risk to the social fabric of society. Trust is a critical factor in risk assessment and management; social relationships of all kinds, notes Slovic, are strongly influenced by trust. Unfortunately, trust is fragile. Slovic states that the limitations of risk science, the importance and difficulty of maintaining trust, and the complex, sociopolitical nature of risk all point to the need for a new approach to risk assessment: one that introduces more public participation into both risk assessment and risk decision-making in order to make the decision process more democratic, improve the relevance and quality of technical analysis, and increase the legitimacy and public acceptance of the resulting decisions.

Slovic argues that the system destroys trust. The pervasiveness of media attention to technology and risk assessments erodes trust because most of what the media report is trust-destroying news, and powerful special interest groups readily gain access to the media. Slovic states that the young science of risk assessment cannot prevail against assaults of this level and intensity.

Slovic thereby argues that whoever controls the definition of risk controls technology policy. In contrast to others, who also note the seemingly disproportionate effect of negative citizen opinion upon risk assessment, Slovic states that more, not less, citizen involvement is needed to adequately manage risk. Slovic appears uncomfortable with technology policy formed through contentious debate between scientific experts and special interest groups, and he therefore urges more widespread involvement by the public in risk management.