
[Faye Flam] Social media erred in censoring misinformation

Labeling misinformation online is doing more harm than good. The possibility that COVID-19 came from a lab accident is just the latest example. Social media companies tried to suppress any discussion of it for months. But why? There’s no strong evidence against it, and evidence for other theories is still inconclusive. Pathogens have escaped from labs many times, and people have died as a result.

Social media fact-checkers don’t have any special knowledge or ability to sort fact from misinformation. What they have is extraordinary power to shape what people believe. And stifling ideas can backfire if it leads people to believe there’s a “real story” that is being suppressed.

Misinformation is dangerous. It can keep people from getting lifesaving medical treatments. But flagging it doesn’t necessarily solve the problem. It’s much better to provide additional information than to censor information.

Part of the problem is that people think they know misinformation when they see it. And those most confident of their ability to spot it may be least aware of their own biases. That includes the fact-checking industry within the mainstream media, which was caught removing earlier posts on the lab leak theory, as well as social media “fact-checkers” who aren’t accountable to the public.

Earlier this year, I interviewed physician and medical podcaster Roger Seheult, who said that he was censored by YouTube for discussing the clinical trials of hydroxychloroquine and ivermectin as potential COVID-19 treatments. No wonder so many people still believe these are the cures “they” don’t want you to know about. Much better would be an open discussion of the clinical trial process, which could help people understand why scientists think those drugs are unlikely to help.

Even without the power of censorship, social media culture encourages the facile labeling of ideas and people as a way of dismissing them -- it’s easy to dismiss people as deniers or anti-science because they question prevailing wisdom.

Of course, there are ideas that are very unlikely to be true. These generally involve elaborate conspiracies or a complete overhaul of our understanding of the universe. Or, like cold fusion and the vaccine-autism theory, they’ve been tested and debunked multiple times by independent investigators.

I discussed the new interest in the lab leak with another science journalist who was interested in why so many reporters are still treating the natural spillover hypothesis as the only possibility. We agreed this is not like the connection between carbon emissions and climate change, where there’s a scientific consensus based on years of research and multiple, independently derived lines of evidence. Here, even if a few scientists favored the natural spillover early on, the question is still open.

Last year, some scientists rightly objected that accusing any lab of causing a worldwide pandemic is a serious charge, and one that should not be made on the basis of proximity alone. That doesn’t mean we should ignore the possibility, or assume that some other equally unproven idea is right. In the face of an unknown, why would the fact-checking people deem one guess to be a form of misinformation and another guess to be true?

And the lab leak idea got conflated in some people’s minds with conspiracy theories that the virus was deliberately created and released for population control or some other nefarious agenda. But a lab leak could have involved a perfectly natural virus that a scientist collected, or a virus that was altered in some well-intentioned attempt to understand it.

Writing in his blog, journalist and Bloomberg contributor Matthew Yglesias calls the episode a media fiasco. “(T)he mainstream press ... got way over their skis in terms of discourse-policing.” He admits he tweeted his disapproval of a thoughtful, well-written New York magazine piece that helped revive the lab leak debate in January.

The author -- novelist Nicholson Baker -- didn’t claim any smoking gun, but made a convincing case that the issue was still open. A Medium piece by former Times writer Nicholas Wade added little to what Baker said, but came at a time when the public was ready to reconsider. A recent Vanity Fair account details how the issue was suppressed inside the US government.

Looking back, there really wasn’t much that was new to report. Very little new evidence has been uncovered over the last year. The pandemic’s origin is still unknown. The fiasco was the media’s propagation of the lie that the issue was settled and that anyone questioning it might be deemed an idiot or conspiracy theorist.

And maybe the intentions of the Facebook fact-checkers were good. If there were a magical way to identify misinformation, then social media platforms could do more to refrain from spreading it. Suppressing ideas they don’t like is not the way.

Yesterday I had a long talk with someone who volunteers at a girls’ school in India. She said she’d been in contact with some students who expressed fear of coronavirus vaccines, even though their neighborhood has been ravaged by the pandemic. When she gave them additional information about the relatively greater danger of the disease, they chose to get vaccinated.

What helped was not taking away information, but giving people additional information. Censoring information -- or what one deems “misinformation” -- isn’t as helpful as it seems. The best we can do is keep questioning, and give people the most complete story we can.


Faye Flam
Faye Flam is a Bloomberg Opinion columnist. -- Ed.

(Bloomberg)