
Why Misinformation Comes From the Top as Well as the Bottom

At Big Think, Cameron English asks us to look at the incentives for academic scientists to publish questionable research that gains widespread attention.

Cameron English, Director of Bio-Sciences at American Council on Science and Health, offers a useful take on the need felt by some in power to crack down on Misinformation:

The uncomfortable truth is that academic scientists routinely publish questionable research that attracts widespread media attention, adding to the morass of “inaccurate information” circulating online. If we want to get this problem under control, we need our trusted sources to quit releasing untrustworthy information.

Cameron English, “‘Trusted’ Sources Spread Anti-Science Nonsense, Too” at Big Think (August 4, 2022)

But the fact is, untrustworthy information pays:

It is true that researchers live and die by their grants; they either “publish or perish,” as the old saying goes. Often, that means academic scientists propose studies that have the best chance of being funded by risk-averse government agencies, not the best study to address the question they want to answer. While this helps explain why academics are incentivized to pursue the research they do, it also implicates funding institutions and universities (and the media), which gain from exaggerating the results of low-risk, low-grade studies. Misaligned incentives constitute a systemic problem.

Cameron English, “‘Trusted’ Sources Spread Anti-Science Nonsense, Too” at Big Think (August 4, 2022)

He’s got a point. If the incentive structure in science rewards clickbait claims, Establishment wars on “Misinformation” become a form of corruption. If truth were a goal, the Establishment would address its own problems first.

Britain’s Royal Society (the UK’s equivalent of the National Academy of Sciences) came out earlier this year against censoring misinformation, for two reasons. First, science is about error correction, not censorship. Second, the Society hinted (but didn’t quite say) that much COVID news provided by official sources turned out to need correction. Wrong information was not all coming from non-scientists or rogue scientists on blogs, Facebook, and Twitter.


Besides, censorship inflates the value of what is called the Liar’s Dividend: “a phenomenon where someone can get away with lying by saying that something is ‘fake news,’ and if the media attempts to expose the lie it can backfire and only make the lie seem even more credible.” – Urban Dictionary

Of course, the Liar’s Dividend could pay out in other ways too. Suppose alleged fake news is simply shut down by censorship — that might be because it is false, but it might also be because it is an inconvenient truth. Facebook, for example, seems less comfortable than it used to be about censoring COVID-19 “misinformation,” perhaps because government pronouncements have not themselves been entirely straightforward. Also, the U.S. government’s proposed Disinformation Board is on hold, likely for related reasons.

In the real world, there is no pristine source of Correct Information. Nor is there any reason to believe that those who insist that their motive is to prevent Misinformation are entirely motivated only by a sincere devotion to truth. Many may also be protecting an organization, philosophy, or private interest, whether or not they recognize the fact.

We’ll let English have almost the last word: “As a society, we either hold everybody to the same epistemic standard of scientific accuracy, or we accept that ‘trusted sources’ themselves can peddle misinformation and continue to get away with (and profit from) it.” If it is the latter, censorship of others is merely a form of oppression.

You may also wish to read: New U.S. Disinformation Board on hold amid flak from both sides. Most current controversies are not clear divisions between True and Untrue or Right and Wrong. Government would merely reinforce the Establishment when it often needs a challenge. Fulfilling bipartisan fears, director Nina Jankowicz suggested that Twitter Blue Checks should be allowed to “edit” those “who aren’t, you know, legit.”

Mind Matters News

Breaking and noteworthy news from the exciting world of natural and artificial intelligence at MindMatters.ai.