The big framing divide between scientists and the public

Image edited by the author. Source file is a CDC photo available via Wikipedia titled: Surgeon General's warning on a cigarette pack, 2012.

What you’re getting into: 900 words, a 3 to 5 minute read.

Framing is one of the most important concepts in public communication. The term can get thrown around loosely, but in my mind, framing comes down to how we define problems and, as a consequence, how we think about potential solutions.

Most scientists and technical experts tend to define problems on a spectrum, whether it’s the risk of ecosystem collapse, temperature ranges for a warming planet, or the potential side effects of medication. When policymakers and members of the public approach these same issues, though, they often think of such risks in binary terms: Can we save these wetlands? Will we blow past the 1.5 to 2°C warming goal? Does this pill need a warning label?

Often, scientists wish they could help people see things their way: with the risks on a nuanced spectrum. In order to do so, they may have to speak binary first.

Is your car broken or outside specified design tolerance?

I recently spoke about framing to a group of AAAS Fellows and shared a Malcolm Gladwell story about auto safety that neatly illustrated this point (h/t to Stephen Young).

In 2009, Gladwell wrote, Toyota engineers were having a lot of frustrating conversations with customers who thought their cars had undergone “sudden acceleration.” In some rare cases, there were problems with people’s accelerators. But most of the time, the problem was human error: people were unconsciously hitting the accelerator instead of the brake, something drivers do with much more regularity than we tend to assume. Normally we just tap the brakes to slow down, get our feet back where they’re supposed to go, and go on driving. But Toyota drivers were worried, likely as a result of extensive media reporting about possible problems with the vehicles.

Gladwell elegantly captured the disconnect:

The public…didn’t think about the necessary compromises inherent in the design process. They didn’t understand that a car was engineered to be tolerant of things like sticky pedals. They looked at the part in isolation, saw that it did not work as they expected it to work—and foresaw the worst. What if an inexperienced driver found his car behaving unexpectedly and panicked? To the engineer, a car sits somewhere on the gradient of acceptability. To the public, a car’s status is binary: it is either broken or working, flawed or functional.

Toyota had to help their employees “reframe” their message. Yes, they could talk to customers about sticky pedals and design tolerance, but they first had to acknowledge that customers simply wanted to feel safe in their cars. Toyota went so far as to offer to replace perfectly fine vehicles if people felt unsafe in them. According to a management expert Gladwell interviewed, it completely turned things around. Instead of feeling ignored, customers started sending “love letters” to the company. (Gladwell doesn’t address this, but I’m assuming Toyota didn’t have to actually replace thousands of cars unnecessarily. As with many other such offers, it’s the thought that counts.)

Binary frames can be abused to ignore science

A former colleague once testified before Congress about fisheries. He told a committee that there was a 95 percent chance a certain fishery would collapse over the next several years without intervention. A Congressman responded by asking him to come back when he was 100 percent certain.

If you’re a scientist reading this, I know you’re shaking your head. One of the bedrock truths in science is that nothing is 100 percent certain. Even if the fishery collapsed, perhaps scientists would cautiously state that there was 99 percent certainty that all the fish were gone based on available data.

The Congressman was demanding a binary answer: tell me if it will collapse or not, yes or no. But science often doesn’t do binary, especially on topics that the public and policymakers see as controversial.

One of my favorites from Jorge Cham’s PhD Comics. The more we know, the more we understand the limits of knowledge in a given field. Meanwhile, people outside that field often have completely different frames for evaluating the same subject.

But we can also use binary frames to anchor more complex scientific frames

In 2013, the Intergovernmental Panel on Climate Change made big news when it announced that scientists were 95 percent certain that industrial carbon burning and other activities were causing global warming.

In attempting to explain where this basic conclusion of climate science sits on the certainty spectrum, the AP’s Seth Borenstein asked researchers what else in science enjoys that same level of certainty. He got some interesting answers, including the link between smoking and lung disease.

Most people, myself included, have not internalized certainty levels and percentages the same way scientists have. What made Borenstein’s article particularly effective was that it translated a spectrum frame to a binary one:

  • “What level of statistical certainty do scientists have about industrial carbon burning causing recent global warming?” is a spectrum question. The answer is 95 percent.
  • “Are scientists as sure about the cause of climate change as they are that smoking causes lung disease?” is a binary question. The answer is yes.

Importantly, both of these questions are useful and both of these answers are accurate. We don’t have to choose between them and, in fact, people might ultimately need both frames to understand scientific evidence about societal risks.