
The social amplification of risk framework (SARF) was first aired in 1988 and arose out of a collaboration between risk communication researchers based at Clark University (Roger Kasperson and colleagues) and Decision Research in Eugene, Oregon (Paul Slovic and colleagues). At the time the risk perception and communication field comprised a number of competing schools of thought, fragmented both by discipline and approach. Recognizing that many of the problems being addressed in this field were genuinely cross-disciplinary, SARF represented the first systematic attempt at presenting an integrative framework capable of accounting for findings from a wide range of studies in science and risk communication. A particular focus was to bring together research from media and communication studies, psychological and cultural approaches to risk perception, hazard vulnerability and geography, and decision sciences.

Social amplification describes how various social agents receive, interpret, send, and modify information about risks. It proposes that such signals undergo predictable transformations as they are filtered through various social amplification stations in ways that can either increase (amplify) or decrease (attenuate) the volume and intensity of a message. The framework also provides an account of the dynamic social processes underlying risk perceptions and response in societies and communities. The approach brings with it the very clear message that social context matters: Individual risk perceptions and behaviors, whether risk avoiding or risk taking, are shaped by the everyday situations in which people live, the groups they identify with, and the institutional and cultural contexts that surround them and shape the messages they receive. In the ensuing 20 years, other research traditions have also been drawn on, such as political science and organizational studies, as the framework has been applied to a range of additional phenomena and case studies.

Origins of SARF

In their important 1988 article in the journal Risk Analysis, Roger Kasperson and colleagues make the observation that some hazards that experts believe to be low in risk can nonetheless very quickly become the subject of controversy and sociopolitical activity alongside rising concern among the public. They term this phenomenon risk amplification, with examples including rising concerns about nuclear power and radioactive waste storage in the 1970s; the controversy over genetically modified food in the 1990s; the response to mad cow disease (bovine spongiform encephalopathy, BSE) in a number of countries; various radiological and chemical contamination events; and even responses to the so-called Y2K millennium computer bug. An opposite phenomenon is also highlighted, which these researchers term risk attenuation. Here hazards that experts believe to be very significant provoke little or no obvious public response or outcry. Examples of the latter include public attitudes toward smoking in the 1950s and 1960s, toward radon gas in the home, and, until very recently, toward the threat of global climate change. The name, of course, stresses amplification of concerns, but it is important to bear in mind that the framework is an attempt to account for the occurrence of both intensification and attenuation in public concern and controversy.

The starting point for the framework is that for “risk events” to have any impact in the world, they must be communicated by somebody to others. A risk event need not have led to any actual harm to “qualify,” so one can include, alongside actual accidents, more minor incidents, media reports, or even secondary or scientific accounts of a possible problem with a risk issue. Risk communication is seen as a process through which these risk events become portrayed and represented in terms of varied risk signals (images, signs, or symbols). These signals in turn are changed and shaped by a range of psychological, social, organizational, and cultural phenomena. Our experience of environmental or technological risk, for example, is a projection into an uncertain future. As such, it is not only an imagined experience of physical harm but also the result of the ways groups and individuals learn to create their own interpretations of risk events.

...
