
8 Communication Strategies to Combat Harmful Health Misconceptions

Apr 14, 2015 / by William D. Leach, Ph.D.

One of the most popular tools in public health is the “myth-versus-fact” brochure. Government agencies and nonprofits, from the CDC to the American Red Cross, frequently use this tool to combat misinformation on all manner of health topics, including vaccination, fluoridated water, first-aid remedies, and suicide prevention. Myth-versus-fact is also a popular narrative device among journalists, who value it for its seemingly balanced airing of both sides of a controversy and for its ability to hook the reader through dramatic, head-to-head conflict. Engaging our rational and emotional faculties alike, the myth-versus-fact format has obvious appeal... Obvious, and apparently all wrong.

Provost Professor Norbert Schwarz, Co-Director of the USC Dornsife Mind & Society Center, challenged the efficacy of myth-busting campaigns at a February 12 "Health and Wellbeing" conference at USC, co-sponsored by the Behavioral Science and Policy Association and the Brookings Institution.

Recounting a decade of research on the psychology of metacognition – meaning our thoughts about what we think and why – Professor Schwarz argued that such campaigns do more harm than good for positive health behavior and policy. Instead of leading people to reject misinformation, myth-busting backfires and actually propagates it.

Truth-Testing Heuristics

The key to understanding why this strategy backfires lies in what Professor Schwarz calls the “Big Five” truth-testing heuristics. People are more likely to believe new information if it satisfies five metacognitive criteria:

  1. Compatibility: Is it compatible with other things I know?
  2. Coherence: Is it an internally coherent and plausible story?
  3. Credibility: Does it come from a credible source?
  4. Corroboration: Is there a lot of supporting evidence?
  5. Consensus: Do others believe this?

 

Thinking, Fast and Slow

Moreover, Nobel Prize winner Daniel Kahneman suggests that when evaluating new information against these five criteria, people use two different types of cognitive processing:

  1. Analytic (“slow”) processing, which involves searching one’s memory for relevant information, and using logical reasoning to evaluate new information against one’s prior knowledge.
  2. Intuitive (“fast”) processing, which relies on gut reactions to determine whether the new information “feels right.” 

To determine whether there is consensus on a supposed fact, for example, analytic processing involves looking for evidence of public opinion, such as survey data. Intuitive processing instead yields quick judgments about questions such as, “Have I heard this before? Does it sound familiar?”

Because analytic processing is taxing, we often revert to intuitive processing when rushed or distracted, or when the costs of analytic processing are high. Anything that impairs our ability to recall relevant information from memory impedes analytic processing and increases reliance on intuition. One common source of impairment is a time delay between when we learn something and when we need that stored knowledge to evaluate the truth of new claims.

 

How Myth Busting Efforts Backfire

Professor Schwarz’s research demonstrates how these metacognitive processes play out in public health campaigns. For example, a 2007 study tested the efficacy of the CDC’s flu vaccine “Facts & Myths” flyer (Schwarz et al. 2007).

[Image: CDC “Facts & Myths” flu vaccine flyer]

“Right after reading the flyer, participants had good memory for the presented information and made only a few random errors, identifying 4% of the myths as true and 3% of the facts as false. Thirty minutes later, however, their judgments showed a systematic error pattern: They now misidentified 15% of the myths as true… Once memory for substantive details fades, familiar statements are more likely to be accepted as true than to be rejected as false. This familiarity bias results in a higher rate of erroneous judgments when the statement is false… These findings illustrate how the attempt to debunk myths facilitates their acceptance after a delay of only 30 minutes.”

A related 2005 study examined backfire effects from warnings about false health claims (Skurnik et al. 2005). “The more often we warn people that something is false, the more likely they are to accept it as true two days later because it feels more familiar. The effect is stronger for older people because they forget the details more quickly.”

When new information is unfamiliar or incompatible with our existing beliefs, we quickly register feelings of unease and have difficulty processing it. When research subjects receive information that does not sit well with the Big Five heuristics, their discomfort registers in facial expressions faster than they can articulate a specific objection. Conversely, when new information feels vaguely familiar and easy to comprehend, we intuitively interpret this ease as a sign that the information is coherent, compatible, credible, corroborated, consensual, and therefore probably true.

 

Eight Communication Strategies for Effective Public Health Campaigns

Take-home messages from research on metacognition are summarized in the following eight communication strategies for public health information campaigns. 

1. Do Not Repeat Misinformation

“False information is better left alone. Any attempt to explicitly discredit false information necessarily involves a repetition of the false information, which may contribute to its later familiarity and acceptance” (Schwarz et al. 2007). Therefore, to fight misinformation on vaccines, it is best to ignore the misinformants, and instead focus on communicating the risks of preventable infections and the benefits of vaccination.

2. Repeat “True Facts” Often

Repetition increases familiarity and implies that a claim is widely accepted. Ideally, multiple credible sources should repeat the same information, but failing that, simply having the same authority repeat the same message increases perceptions of consensus and veracity (Schwarz 2015).

3. Enlist Credible Sources

Heuristic #3 highlights the importance of credibility. However, highly credible sources should be especially careful not to repeat a myth: people may later remember the myth and associate it with the credible source, yet forget that the source flagged the information as untrue.

4. Warn Audience if Misinformation Follows 

Lead with a disclaimer. If misinformation must be repeated, explicitly warn the audience ahead of time that they are about to hear something that merits skepticism. The best protection against a harmful health misconception is suspicion at first encounter. One potential application of this principle is the labeling and advertising of dietary supplements. By law, labels claiming unsubstantiated health benefits must include the disclaimer, “This statement has not been evaluated by the Food and Drug Administration. This product is not intended to diagnose, treat, cure, or prevent any disease” (21 CFR 101.93). Moving this message to the top of the label and the beginning of radio advertisements would enhance its impact.

5. Emphasize Consensus on the True Information

Heuristic #5 implies one should emphasize consensus on true information and avoid signaling consensus on false information. News reports on health myths, such as the unfounded link between childhood vaccines and autism, unavoidably normalize mistaken beliefs by suggesting they are commonly held. 

6. Make the Message Simple, Brief and Memorable

Simple fonts, clear language, brevity, and white space on the page help the audience feel as though they understand the message effortlessly. Rhymes, jingles, mnemonics, and visuals can make a message memorable. “If the myth is simpler and more compelling than your debunking, it will be cognitively more attractive” (Lewandowsky et al. 2012).

7. First Affirm the Audience's Worldview

Heuristic #1 involves checking new information against one's existing worldview. It is extremely difficult to persuade people to accept a correction or fact that contradicts their existing beliefs. To soften resistance, validate the audience’s basic values. For example, people who fear childhood vaccination and distrust the government or the pharmaceutical industry might be more receptive to vaccine information from an independent nonprofit organization dedicated to children’s health, especially if vaccination is framed as a voluntary action taken by thoughtful parents who have thoroughly investigated the pros and potential cons.

8. Nudge Toward Better Decisions

Policymakers and healthcare leaders can often design “choice architecture” so that people automatically gravitate toward the preferred decision, thereby sidestepping the need to persuade the public through rational arguments that could threaten existing beliefs or worldviews (Lewandowsky et al. 2012). Behavioral economics offers a promising strategy: defaults. For example, many people never skip a six-month dental check-up, largely because dental offices schedule the next visit before the patient leaves. If family physicians routinely scheduled their patients’ annual flu vaccine, vaccination rates might rise accordingly. Under the existing choice architecture, the default is to forgo annual flu vaccination; people who want the vaccine must actively opt in. Flipping the default so that non-vaccinators must opt out would improve public health without altering patients’ basic options and without directly confronting misconceptions about vaccines.

 

 

 

 

 

Video Presentation: Dr. Norbert Schwarz, "Causes and Consequences of Faulty Health Decision Making"

 

Sources

Federal Trade Commission (2001) Dietary Supplements: An Advertising Guide for Industry.

Lewandowsky, Stephan, Ullrich K. H. Ecker, Colleen M. Seifert, Norbert Schwarz, & John Cook (2012) “Misinformation and its correction: Continued influence and successful debiasing.” Psychological Science in the Public Interest 13(3):106-131.

Schwarz, Norbert, Lawrence J. Sanna, Ian Skurnik, & Carolyn Yoon (2007) “Metacognitive experiences and the intricacies of setting people straight: Implications for debiasing and public information campaigns.” Advances in Experimental Social Psychology 39:127-161.

Schwarz, Norbert (2015) “Metacognition.” Chapter 6 in Mario Mikulincer, et al. (Eds), APA Handbook of Personality and Social Psychology, Volume 1: Attitudes and Social Cognition. pp. 203-229.

Skurnik, Ian, Carolyn Yoon, Denise C. Park, & Norbert Schwarz (2005) “How warnings about false claims become recommendations.” Journal of Consumer Research 31(4):713-724.

Topics: Health Policy, Psychological Sciences, Behavioral Economics, Health Misconceptions


Written by William D. Leach, Ph.D.

Bill Leach teaches in USC's online master’s program in Public Administration at the Sol Price School of Public Policy.
