Assessment of Risk

That great, growling engine of change -- technology.

Alvin Toffler, Future Shock, 1970

The Sociological Approach

Technical, economic, and psychological approaches emphasize individual-level decision making, whether in the quantification of the potential for technology failure, the rational assessment of costs and benefits, or the emotional response to a technology. The sociological approach emphasizes the socially constructed nature of risk.


Risk perceptions reflect meanings negotiated through interaction with others. Renn classifies sociological approaches to risk using two dimensions:
  1. Individual versus structural, and
  2. Objective versus constructivist.
Structural assessments emphasize the importance of societal definitions of risk rather than the processes by which risk evaluations are formulated. Individual approaches focus on how socially constructed risk is achieved through human interaction. Objective approaches are positivist in considering risks as observable and real, whereas constructivist approaches treat risks as social artifacts fabricated through social interaction.

Dr. Sapp prefers the taxonomy presented in the Course Description, wherein sociological studies are classified according to their emphasis on:
  1. Social structure and functioning,
  2. Critical theory, or
  3. Human agency.
Whatever the classification system used, the critical element of the sociological perspective is that humans, through their interactions with one another, create expectations that influence public decision making regarding complex and controversial technology.


The assumptions of the sociological approach are that humans behave differently in groups than they would as individuals, that normative expectations are formed through human interaction, and that these expectations influence risk evaluations.

Strengths and Limitations

The sociological approach can be used to understand and influence the social construction of risk. By understanding fundamental properties of human collectivities (e.g., collectivities have prestige hierarchies and normative expectations for behavior) one can gain an understanding of the process of public decision making and exert some influence upon public decisions. Much more description of the sociological approach and its strengths and limitations is provided in the sections on the Diffusion of Innovations.

Examples of Use

Because sociological studies on risk are undertaken from three different paradigms, examples of their use vary widely. In Soc 415, we will emphasize principles of human agency to focus on understanding and influencing the behavior of rational actors in their risk decisions. Social mobilization theory, as one example of the structure-function paradigm, examines the circumstances under which individuals are motivated to actively promote or oppose certain technologies. Also as examples of the structure-function paradigm, organization theory investigates organizational change that occurs in response to the adoption of new technologies and systems theory examines how institutions affect and are affected by technological adoption. The critical paradigm motivates studies on the distribution of risk and the control of technology development and dissemination by the powerful elite. The human agency paradigm investigates how interaction with others influences consumers' opinions about complex and controversial technologies.

Application in Context

    How do social factors affect opinions of food irradiation?

    At the same time that Huisken Meats began their market testing of irradiated beef patties in Minneapolis, MN, sociologists at Iowa State University began tracking consumer opinions in a study of how human agency affects adoption of food irradiation. Although much research has been conducted on human agency over the past 35 years, few opportunities have arisen where researchers were able to track opinions over time beginning at the introduction of a controversial technology.

    As anticipated from theories of human agency, initial public skepticism toward irradiated food shifted toward acceptance over time. This shift was influenced most strongly by endorsements of respected people/organizations. Unit Three discusses this "diffusion effect" in more detail.

Examples of the Sociological Approach

Contemporary philosophy focuses as much on the social construction of risk assessment, management, and communication as on classifying technology as good, bad, or indifferent. The central issues addressed relate to citizen involvement--or lack of involvement--in technology policy making. Contemporary viewpoints acknowledge improvements in living conditions brought about by advances in technology while noting that the manner in which risk is defined, and by whom, strongly affects technology policy. This section reviews viewpoints offered by Ulrich Beck, Michael Bell and Diane Mayerfeld, and William Freudenburg on relationships among risk, power, and democracy.
The Risk Society
Ulrich Beck, in Risk Society: Towards a New Modernity, expands upon the solution offered by Habermas to the critical philosophy of technology. Beck challenges our understandings of modernity, science, and technology and, in so doing, helps us recognize the need for new conceptions of these endeavors and our place in a society characterized not by relations of production, but by relations of risk. That is, Beck thinks the focal point of science and technology policies should be the effects of technology on the welfare of all citizens, not on the benefits enjoyed by a few citizens.

The Introduction to Risk Society, written by Scott Lash and Brian Wynne, provides a good review of Beck's viewpoints. This Introduction is summarized here.

Philosophers and social scientists long have sought to develop approaches for maximizing the use of beneficial technology while avoiding its negative consequences. Beck asserts that the dominant perspectives reflect scientism--the culture of science--which excludes non-rational forms of discourse and argument.

Thus, arguments not endorsed by officially sponsored scientific or governmental agencies, or those put forth by external agencies, such as consumer advocacy groups, are considered non-rational if they challenge assumptions of the status quo. Public skepticism is treated as non-rational and thus is not considered to be of sufficient importance to be taken seriously except as a barrier to scientific and technological progress. In the politics of technology evaluation even social scientific explanations of risk can be relegated to reflect merely the inaccurate perceptions of a misinformed public. As stated by Lash and Wynne, "technical experts are given pole position to define agendas and impose bounding premises a priori on risk discourses."

Beck argues for a new paradigm of risk evaluation, one that recognizes the benefits of technology development, but at the same time recognizes the many different and equally legitimate ways that technology can be rationally evaluated. This reflexive modernization, in contrast with traditional modernization, seeks to understand technology in practice--the unintended, unavoidable, and undesirable consequences of technology adoption--and the necessary and beneficial aspects of socially constructed risk assessments on technology development and use.

General Principles of Reflexive Modernization
  1. Physical risks always are created and effected in social systems, for example by organizations and institutions that are supposed to manage and control the risky activity.
  2. The magnitude of the physical risks is therefore a direct function of the quality of social relations and processes.
  3. The primary risk, even for the most technically intensive activities, is therefore that of the public's social dependency upon institutions and actors who might not have their best interests in mind.
The Rationalization of Risk
Michael Bell and Diane Mayerfeld (full text article) express concerns about how the language employed by experts to convey risk to the public can be used to manipulate rather than inform. They argue that what is different about the worries of the present day is neither the number of hazards we face nor the degree of uncertainty we feel about our lives, but rather it is the language we use to think and talk about them. They note that the language of risk can be used to explain uncertainty; but it also can be used to explain it away. Bell and Mayerfeld suggest that the language of risk as it is being used today has some strikingly undemocratic implications and strongly urge greater caution in its use by social scientists and policy makers.

Bell and Mayerfeld disagree that our times are more risky than the times of our ancestors. Their observations are that:
  • We live in a time of much risk; but so have others before us.
  • People always have sought some sense of control over uncertainties.
  • What has changed is not the amount of risk, but control over the language of risk.
Historically, risk definition has fallen primarily to technicians with the expertise to understand the technical aspects of material innovations. But it should be recognized that evaluations of risk are subjective, not objective. Therefore control over the manner in which risk is defined and assessed is critical to risk management and communication.
  • Quantitative risk estimates are precise but often not accurate, because they rely upon a series of assumptions, guesses, and extrapolations.
  • Estimated risks often do not account for multiple hazards that occur in conjunction with one another in complex technological systems. For example, we might estimate the risk of pesticides A and B, but often we do not estimate the risk of pesticide A in combination with pesticide B.
  • Numbers often carry disproportionate effect in technological assessments of risk.
  • Risk assessments often falsely homogenize populations. That is, the risk for a child might differ from the risk for an adult.
Given the limitations of risk assessment, control over risk definitions and strategic communication with the public become central to risk management:
  • Because people are aware of the limitations of quantitative risk assessment, they tend to respond with skepticism to these assessments, even though they tend to trust science and science-based organizations.
  • The field of risk communication arose in response to this form of "illogical" reasoning by the public.
  • Most risk communication efforts begin with the premise that scientific experts know actual risk and the skeptical public, out of ignorance or irrational fear or both, misperceives actual risk.
  • The goal of risk communication, therefore, is to educate the public for the purpose of removing their irrational fears. A central assumption of this approach is that experts favor the technology being discussed and non-experts (i.e., the public) oppose the technology.
  • "In its most extreme form, manipulative risk communication results in legal maneuvering to withhold information from the public altogether."
  • "In short, risk communication is infected with a contempt for the public, which perpetuates its undemocratic bias and also ensures the continued failure of risk communication efforts."
If control over language strongly affects risk management, then advanced procedures must be developed for interacting with the public about technology and risk:
  • "Risk is a far from neutral language. Rather than representing interest-free rationality, nameless knowledge that applies to everyone, risk represents the deeply interested knowledge of those who are able to command it."
  • People are becoming more aware of how power relationships influence risk.
  • "The reaction against risk represents democracy, not the hysteria of the ill-informed."
  • Risk assessments often falsely divide the population into those affected and those unaffected. Humanist viewpoints consider all to be affected when some are affected.
If control over language is critical to risk assessment and management, then citizens need to become aware of the risk assessment procedures and risk communication techniques used to convey information about technologies to them. The tenets of the critical philosophy alert us also to the need to become aware of how power relationships can affect risk assessment and communication.
  • Beck argues that technology advancement occurs so rapidly that our institutions cannot keep up, leading to a "risk society." To Beck, new hazards have led to new critiques of technology.
  • Bell and Mayerfeld believe that we have no more hazards or worry about hazards than we had before. Instead, we have a growth of new language for debating about hazards and greater public interest in discussing potential hazards related to technology development.
  • "The real uncertainty at stake in the language of risk is the relationship between power and democracy."
Recreancy and Societal Institutions
William Freudenburg (full text article) points out that the earliest discussions of risk were framed almost exclusively in terms chosen by engineers. Within the technical community, two explanations typically are given for public reactions to what the technicians deem to be objectively defined risk. The first is that the public is ignorant and/or irrational. From this definition of the situation, policy focuses on education of the ignorant and easily manipulated public. The second explanation, associated more with economists, is that public reactions represent economically rational, although understandably selfish, responses to risk. From this view, policy focuses on providing adequate compensation for risks endured.

The problem with the first view is that technical definitions of objective risk are not always precise (see Technical Risk Assessment) and always are influenced by social and cultural factors. The problem with the economic approach is defining adequate compensation. Events vary in the amount of outrage they create and it is difficult to assign monetary value to risk and negative health outcomes.

Increasingly, the findings from empirical studies are proving paradoxical with respect to the individualistic theoretical perspectives that have predominated in the past. Research shows that differences in risk perceptions cannot be attributed to differences in information levels; they are attributed more to differences in cultural outlooks and personal values.

Freudenburg notes that, with the increasing complexity of technological innovations and the societal division of labor, people find themselves in a position of not knowing much about highly complex and potentially dangerous technologies. They therefore must rely upon their judgments about whom to trust. Like Slovic, Freudenburg is aware that:
  • the public is not irrational in its skepticism about complex technologies, but rather cautious in deciding whom to trust in its understandable state of ignorance about these technologies,
  • the public and scientists rely upon social as well as technical criteria to evaluate risk,
  • claims that the public is irrational are in part responsible for increasingly contentious debate about complex technologies,
  • some special interest groups profit from fear mongering within this atmosphere of ignorance and fragile trust,
  • the media have a difficult job of presenting varying viewpoints on technical issues.
Freudenburg wants to look at societal not individualist explanations for this pervasive problem because contentious public debate can:
  • delay implementation of valuable technologies,
  • hasten implementation of undesirable technologies, and
  • create public discourse that in itself might be harmful to the social fabric of society.
Freudenburg uses the term recreancy, meaning institutional failure resulting from a lack of competence, fiduciary responsibility, or both, to refer to societal-level inadequacies in risk assessment, management, and communication. Recreancy does not necessarily result from "villainy," but instead comes about from inadequate societal-level definitions of risk, procedures for evaluating risk, risk management practices, and poor risk communication techniques. In other words, what stands in the way of wise technology development and policy is inadequate societal structure and functioning, not unfamiliarity with technology or irrational thinking.

Freudenburg offers suggestions for improving societal-level capacity in risk assessment, management, and communication:
  1. Assess the level of recreancy in American society,
  2. Become more aware of societal-level influences on risk assessment, management, and communication,
  3. Build institutional capacity to facilitate wise technology policymaking.
Beck thinks that because society has become more risky, citizens need to become more involved in the process of risk assessment and management. Bell and Mayerfeld, on the other hand, think that the world is no more risky than it has been before, but that control over the language of risk, which strongly affects technology assessment and management, has become more advanced and therefore in need of more careful scrutiny by the public. Freudenburg focuses on organic social solidarity--the trust citizens place in societal institutions to behave with competence (i.e., skills, expertise, experience) and fiduciary responsibility (i.e., honesty, integrity). Each perspective highlights characteristics of society that must be addressed in understanding the sociology of technology. Given the emphasis of this course on human agency, we will direct most of our attention to techniques of gaining adoption of technologies considered to be mainly beneficial. The approach we will rely upon is "diffusion of innovations." This approach will be described in detail in the final section of the course.

Application in Context

    How does the language of risk affect your perceptions of technology?
  • The first four links presented on the Sampler web site regarding genetic engineering present this technology in a favorable manner, while the last four links present concerns about and objections to it. Skim through these materials again looking for key terms, use of language, or the context in which arguments are presented to investigate how the language of risk is used to sway opinion.
  • What are some key terms or phrases advocates use to make genetic modification of food seem like a good idea?
  • What key terms or phrases do opponents use to make this technology seem like a bad idea?
