Meta’s fact-checking changes raise concerns about spread of science misinformation
Meta, the parent company of Facebook and Instagram, announced on Jan. 7 that it would end its use of fact-checkers and launch a user-based “community notes” system to flag inaccurate or misleading posts. The move has raised concerns among experts—including Harvard T.H. Chan School of Public Health’s K. Vish Viswanath, Lee Kum Kee Professor of Health Communication—that misinformation about science and health could increase on Meta’s platforms.
Viswanath said that misinformation about the COVID-19 pandemic was widespread on Meta’s platforms and other social media. He noted that some platforms attempted to stem the tide: for example, they labeled scientifically inaccurate content, made it harder for users to access misinformation, and added signals to help users find accurate information. Meta’s third-party fact-checking system was not perfect, he said, but ending its use is unlikely to improve overall scientific accuracy on the platforms. “Whether community notes will work or not is something that’s worth independent evaluation,” he said.
Regarding potential public health impacts, he said, “We know that exposure to misinformation can potentially lead to misbeliefs at variance with science,” for example about vaccines.
He offered several recommendations to combat misinformation. “Scientists, scientific institutions, and professional science societies can proactively promote accurate science information,” he said. In addition, he said that local community-based and faith-based organizations can help those they serve build resilience against misinformation, and that journalists, especially at the local level, have an important role to play in sharing accurate information from scientists.
Read an ABC News article: Could Meta ending fact-checking lead to rise in health misinformation?
Learn more
New report addresses misinformation about science (Harvard Chan School news)