The Public Communication Challenges Facing Science

The oft-repeated phrase “trust the science” may be doing more harm than good to the scientific community. Consider, for example, the six Italian scientists who were convicted of manslaughter in 2012 for the advice they gave before the deadly L’Aquila earthquake. In that case, trust in science was too great, or at least misplaced, until the convictions were overturned. In contrast, some members of the public place too little trust in the science behind the COVID-19 vaccines by refusing vaccination. Cultivating the proper level of public trust in scientific findings requires a more robust communication strategy.

Core Challenges

The current scientific communication challenges emerged from five interrelated issues:

  1. Science encompasses a wide range of disciplines. The classic hard science domains of physics, chemistry, and astronomy represent only one slice of the research published in these pages. The scope of scientific inquiry, including domains like economics and psychology, should be celebrated, but the breadth also brings public communication challenges. In particular, the scientific disciplines differ in their developmental stages and methodological sensibilities. Chemistry, for example, has its historical roots in alchemy, but its modern foundation emerged in the 1600s and 1700s. Newer domains, like network science, may well appear to others like alchemy when compared to a mature discipline like chemistry. Throwing the same conceptual tarp over the many scientific disciplines, standards, methodologies, and historical legacies may confuse the public. One doesn’t have to peek very far under the tarp to discover that scientists’ confidence in findings, results, and theories varies a great deal. Chemistry, for example, relies on rigorously controlled experiments designed to tease out cause-effect relationships. Forecasting earthquakes in Italy or elsewhere cannot be equated with predicting the results of a chemical experiment. Apparently, some Italian lawyers and others failed to see the distinction.

  2. Scientific experts often disagree. Differing interpretations of scientific observations, illuminated by competing theories, dot the timeline of scientific progress. For instance, competing models of COVID-19 mortality rates in the United States varied a great deal, propagating misperceptions and discordant public responses. Downplaying disagreements rooted in differing assumptions or models often has the exact opposite of the intended effect. It can appear as if the scientific community is hiding something or attempting to deceive. Different political sides can cherry-pick evidence to support their viewpoint while ignoring the inherent scientific uncertainties.

  3. “Trust the science” often means something entirely different to scientists and politicians. For the scientist, it means trust the scientific process, buttressed by its self-correcting robustness and the peer-review process. For the politician, it is often used as an exclamation point to champion a policy decision. At the very least, the phrase represents an oversimplification of the nature of the scientific process and the tentative nature of scientific conclusions. At worst, politicians invoke the phrase to induce conformity to a particular policy conclusion. Two major problems emerge from this disconnect between scientists and politicians. First, many people transform snap scientific suppositions, hyped by politicians, into rigid, final conclusions. Consequently, they may ignore the more deliberately developed scientific consensus. When the inevitable denouement of such snap suppositions occurs, it emboldens skeptics to doubt more legitimately derived, scientifically based conclusions. Second, sometimes different scientific disciplines offer complementary insights; at other times, they are contradictory, leading policy makers in different directions. Often science cannot reconcile these differences; consequently, politicians must find other arguments for their positions.

  4. Media reports often glamorize novel research results and magnify scientific errors. The saga of hydroxychloroquine demonstrates both tendencies. Initially, a study indicated that the antimalarial drug could be used to mitigate COVID-19 symptoms. The research was widely reported in the media and touted by President Trump. Then, two studies published in prominent medical journals indicated that the drug actually harmed, rather than helped, COVID-19 patients. These studies were soon withdrawn. By then, of course, “science” and politics had become hopelessly entwined in a controversy, undermining President Trump’s credibility and the scientific integrity of some medical professionals. In both the initial and subsequent studies, the underlying science was suspect. Yet the news value of a potentially revolutionary treatment for a deadly disease almost always trumps the prosaic story about the scientific-vetting process. On the flip side, errors are magnified because they bring eyeballs to screens as they play on innate fears or biases. The wild “boom or bust” news swings, emerging from the intertwined socio-ecosystems of the scientific community, the media, and policy makers, underscore the speculative news while de-emphasizing the key self-corrective features of the scientific process.

  5. Many non-scientists are ill-informed and ill-prepared to properly interpret scientific studies reported by the media. A Templeton-Gallup study in the waning days of 2020 found that the average American “dramatically” overestimated COVID-19 mortality rates. Given the extraordinary barrage of media reports dominating our airwaves and cyberspace, these findings are troubling, but not particularly surprising. No doubt, public thinking, driven by headline-hopping and image-rich consumption habits, partially explains such logic-defying miscalculations. Additionally, some people fail to develop a proper mental framework to process mediated reports. The research on cognitive biases, such as confirmation bias and availability bias, partially explains flaws in human mental frameworks. A deeper flaw emerges from a lack of understanding about how the sciences produce knowledge. A basic understanding of the epistemological foundations of science could help temper speculative and overly sensationalized scientific claims. For instance, knowing that scientific consensus emerges from multiple studies and continual debate might have tempered the initial embrace of hydroxychloroquine.

What to do?

If the scientific community wants to enhance public accessibility to its research, influence public policy responsibly, and educate the public about the scientific process, then the community should consider the following ideas.

  1. Develop and test stock phrases that educate the public about the scientific process. The oft-invoked phrase “innocent until proven guilty” has prevented more than a few hastily drawn conclusions. In contrast, the latest scientific study or highly publicized retraction is rarely framed by any oft-repeated moderating phrase. Ideally, the watchwords would imply three basic ideas. First, the phrases should highlight that science is a process, not an end point. As Jacob Bronowski so aptly put it, “Truth is the drive at the center of science; it must have the habit of truth, not as a dogma but as a process.” Second, they should legitimize the uncertain nature of all scientific conclusions. Doubt and ignorance drive scientific inquiry, confidence, and progress. This proposition may be counterintuitive, but the public will accept this premise when properly framed. For instance, meteorologists maintain their credibility by constantly updating forecasts, even if initial predictions are faulty. Third, they should imply that many disciplines fall under the scientific umbrella. This feature underscores one reason why scientists might disagree on a policy issue. The watchwords need to be unpackable, in the sense that further explanation illuminates the most salient educational messages. The words need to be memorable enough to seep into the public consciousness. We’ve informally started testing phrases such as, “Allow the scientific consensus to emerge,” and “Remember, the sciences explain and predict through a process of debate and replication.” Whatever final expressions emerge, the scientific community will need to exert considerable effort to stream them into public consciousness.

  2. Enhance communication training for scientists. Scientists primarily learn how to communicate with other scientists to advance their fields. The conventions of scientific thinking are embedded in research publication protocols, which generate a shared and usually unstated set of assumptions and sensibilities. That is the nub of the conundrum: the public does not share those underlying sensibilities, and scientists are unlikely to communicate them because they are taken for granted in their community. Scientists often overlook one philosopher’s cogent warning: “… science is an alien thought form…we need to appreciate the inherent strangeness of the scientific method.” Consistently reminding the public about the key features of scientific inquiry, spurred by the right catch phrases, might temper hype and overreactions to major scientific news. This requires a willingness to go beyond the momentary excitement of touting a thrilling discovery or embarrassing finding. The scientific community should see retractions of scientific studies as opportunities to “get it right.” It’s like a coach throwing a red flag in an American football game, asking the referees to review a play. But, contrary to what many pundits fear, the red flag does not undermine confidence in the integrity of the game; it actually protects it.

  3. Prioritize public education about probability and risk assessment. The well-documented deficiencies in STEM education reduce the public’s ability to properly interpret scientific studies and health mandates. Mitigating COVID-19 requires a rudimentary understanding of probability and multivariable thinking. Citizens in any health crisis need to assess their risk level, which varies based on a number of factors. Building public understanding of probability makes people more comfortable with evolving COVID-19 policy directives. Such training inoculates people against hyped stories that perpetuate myths. Rebutting rumors about the safety of COVID-19 vaccines based only on the death of beloved baseball player Hank Aaron should be second nature to anyone who knows that cause-effect relationships cannot be proven with a sample size of 1.
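The sample-size-of-one point can be made concrete with a back-of-envelope base-rate calculation. The figures below are hypothetical assumptions chosen only for illustration, not real epidemiological data:

```python
# Hypothetical base-rate sketch: in any large vaccinated cohort of elderly
# people, some will die of unrelated causes shortly after vaccination purely
# by chance. A single death therefore says nothing about cause and effect.

vaccinated_elderly = 1_000_000   # assumed cohort size (hypothetical)
annual_death_rate = 0.05         # assumed baseline mortality for this age group (hypothetical)
window_days = 7                  # "died within a week of the shot"

expected_coincidental_deaths = (
    vaccinated_elderly * annual_death_rate * (window_days / 365)
)
print(round(expected_coincidental_deaths))  # prints 959
```

Under these illustrative assumptions, roughly a thousand deaths would occur within a week of vaccination even if the vaccine had no effect at all, which is exactly the kind of reasoning a probability-literate public could apply on its own.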

  4. Encourage political leaders to educate the public about the scientific process. Political leaders of all persuasions should assume a responsibility not only to advocate for policy initiatives but to educate the public about the sensibilities necessary to survive in this media-rich environment. Politicians could take their cues from meteorologists, who have done a good job educating the public about their science and probabilities. When they plot possible hurricane paths, they are embracing the uncertainty of various models, demonstrating that you can still take precautions despite the inexactitudes. The evolving nature of policy directives, informed by science, should be tempered with cautionary notes designed to build tolerance and adaptability to shifting advice. Biologist Stuart Firestein stated it well: “… the incorrect management of ignorance has far more serious consequences than screwing up with the data. There are correction procedures for mishandled data … but mishandled ignorance can be costly, harder to perceive, and so harder to correct.” In the future, when task forces look for lessons learned from the current pandemic, the mismanagement of ignorance might well top the list. Failing to acknowledge scientific unknowns fosters a false sense of certainty, breeding distrust and spawning politicization of health crises.

  5. Make a special outreach to influencers. Media elites are usually gifted communicators. They know how to hype stories, spawn interest, and fuel controversy. Enlisting their support to educate their audiences can build respect for the scientific process and acceptance of scientifically based policy. Conveying scientific facts alone will not be enough; those facts need to be presented by influencers who have unique connections with their audiences. When influencers doubt scientific claims, scientists should avoid denigrating them; instead, scientists should respond with openness to the inquiry and provide factual correction. Thoughtfully responding to anti-vaxxers’ claims might become a pivotal educational moment. Will it convince the true anti-science skeptic? No. But it could mitigate further harm to the larger public by quelling rumors and introducing scientific facts, rebuttals, and perspective into influencers’ networks.

Correcting misperceptions about the scientific process that are enabled by the scientific and media socio-ecosystems requires the collective efforts of scientists, public officials, media elites, and a willing public. Without a corrective strategy, the erosion of trust in science will accelerate. Scientists developed antibodies to beat back this dreadful COVID-19 plague; certainly, scientists, working with others, can help develop the antibodies to counter the unintended assault on scientific inquiry.

1. E. Cartlidge, Why Italian earthquake scientists were exonerated. Science (10 February 2015).

2. J. Wernau, P. Overberg, “Shot skeptics impede herd immunity goal”. Wall Street Journal, A7, 4 February 2021.

3. C. O’Grady, Ecologists push for more reliable research. Science 370 (6522), 1260–1261 (11 December 2020).

4. J. Rothwell, S. Desai, How misinformation is distorting COVID policies and behaviors. Brookings Report, 1-40, p. 4 (22 December 2020).

5. J. Bronowski, Science and Human Values. (Perennial Library, New York, 1965) p. 60.

6. M. Strevens, The Knowledge Machine: How Irrationality Created Modern Science (Liveright, New York, 2020), p. 14 of 514 ebook.

7. R. Rojas, D. Grady, “Never mind the skeptics, officials say: Hank Aaron’s death had nothing to do with the Covid-19 vaccine.” New York Times, 31 January 2021.

8. S. Firestein, Ignorance: How It Drives Science (Oxford University Press, Oxford, 2012), p. 44.

9. W. Cornwall, Officials gird for a war on vaccine misinformation. Science 369 (6499), 14–15 (3 July 2020).