By: Andrew Maynard, Arizona State University and Dietram A. Scheufele, University of Wisconsin-Madison
Truth seems to be an increasingly flexible concept in politics. At least that’s the impression the Oxford English Dictionary gave recently, as it declared “post-truth” the 2016 Word of the Year. What happens when decisions are based on misleading or blatantly wrong information? The answer is quite simple – our airplanes would be less safe, our medical treatments less effective, our economy less competitive globally, and on and on.
Many scientists and science communicators have grappled for years with disregard for, or inappropriate use of, scientific evidence – especially around contentious issues like the causes of global warming or the benefits of vaccinating children. A long-debunked study on links between vaccinations and autism, for instance, cost the researcher his medical license but continues to keep vaccination rates lower than they should be.
Only recently, however, have people begun to think systematically about what actually works to promote better public discourse and decision-making around sometimes controversial science. Of course, scientists would like to rely on evidence, generated by research, for insights into how to most effectively convey to others what they know and do.
As it turns out, the science on how to best communicate science across different issues, social settings and audiences has not led to easy-to-follow, concrete recommendations.