Helping public health agencies improve emergency risk communication
Elena Savoia is a principal research scientist in the Department of Biostatistics at Harvard T.H. Chan School of Public Health (HSPH). She is deputy director of the Emergency Preparedness, Research, Evaluation & Practice Program (EPREP) and co-founder of the IRIS Coalition. She recently co-led a workshop for public health practitioners at the Global Health Security conference in Singapore on addressing mis- and disinformation in emergency risk communication.
HSPH: What are EPREP’s workshops on emergency risk communication like, and what do you hope participants take away from them?
Savoia: Our workshops are aimed at the public health workforce at various levels, from local and state public health departments to ministries of health in other countries. They are very practice-oriented: we walk participants through a simulation exercise, gradually adding more information over the course of about three hours. At the recent workshop in Singapore, participants responded to a scenario involving fictional social media posts questioning the COVID-19 vaccine’s safety in children. The goal was for them to come away with priority actions they can take to improve their communication plans in emergency situations.
During the simulation, the first thing participants wanted to do was check the facts. You shouldn’t assume something is misinformation; you need to see whether there is any data supporting or justifying that particular piece of information. Second, they prioritized listening to the public to understand which segments of the population are most concerned. Finally, they wanted to identify people within those communities who might be effective at reaching members of the public who distrust the government.
HSPH: You have suggested that practitioners should “prebunk” misinformation rather than simply trying to debunk it when it’s already out there. Why is that, and what does it involve?
Savoia: Debunking doesn’t seem to work because there is just too much information on social media and elsewhere on the internet. It would be very difficult for an agency to try to debunk all the misinformation that’s out there — or even to reach people who are spreading it, given the information echo chambers that exist. You wind up debunking misinformation for people who already don’t believe it.
The idea of “prebunking” is to inoculate people against misinformation before they encounter it. Rather than trying to convince someone not to believe a specific claim, you talk about common patterns in misinformation posts and videos and the industry that exists to make money off of them—for example, by selling a fraudulent remedy for COVID-19. You alert people to techniques that creators use to manipulate emotions, such as scary music or tone of voice in videos, and narratives centered on “corrupt elites” or harm to children. You give people the tools to critically appraise what they’re seeing or listening to.
Building trust with the public is key for effective communication. Public health practitioners need to listen to people and value their emotional experiences and concerns. Otherwise, you can wind up pushing people away and increasing polarization.