Mind Matters Natural and Artificial Intelligence News and Analysis
Image: A man puts wooden blocks with the words Fact and Fake — a concept of news and false information. Licensed via Adobe Stock.

So Who Are Today’s Disinformation Police?

Social scientists are striving to develop ways to blunt the force of information that governments would rather the public did not know or heed

We’ve already seen that the National Science Foundation is weighing in against “misinformation.” And tech media are often quick to chime in: “Targeted, AI-generated political misinformation is already out there—and humans are falling for it.” (Wired, January 22, 2024) But combatting misinformation, disinformation, and mal-information (MDM) has become a cottage industry among social scientists as well.

What do these terms mean?

The three terms all overlap, though “misinformation” is beginning to be preferred. The problem with the term disinformation — as in the name of the U.S. government’s late, unlamented Disinformation Governance Board — is that it makes the origin of the enterprise painfully clear: disinformation means information the government does not want people to have, whether it is correct or not. Misinformation, by contrast, is supposed to be incorrect as well as undesirable.


Just what “mal-information” means, apart from the other two concepts, remains unclear. Differences between the three terms seem to hinge on the presumed motives of (usually) unknown persons, so clear distinctions between them may not be conceptually useful. We could just as well call it all wrongthink.

Inoculation against wrong ideas

Terminology aside, some researchers are sure of what they want to combat and have definite ideas about how to go about it. For example, social scientists writing at Scientific American explain how they hope to use an “inoculation technique” to insulate the public against accepting wrongthink:

… misinformation researchers have identified novel ways to make people more resistant to being misled without risk of censorship or interfering with anyone’s freedom of speech. One of those techniques is known as “inoculation,” which involves boosting people’s information discernment skills. Key to inoculation or “prebunking” is the realization that misleading or false information has markers that can help differentiate it from high-quality information…

In fact, numerous studies rolled out to millions of people on social media have shown that inoculation in the form of brief informational videos makes people more skilled at identifying manipulation techniques common in misinformation, such as false dilemmas and scapegoating.

Stephan Lewandowsky, Sander Van Der Linden & Andy Norman, “Disinformation Is the Real Threat to Democracy and Public Health,” Scientific American, January 30, 2024

So what are the wrong ideas against which we are all to be inoculated? “Vaccine denial, climate denial, election denial and war-crime denial.” Such vague targets are an open invitation to authoritarian censorship of, or attacks on, reasonable dissenting views. That is what befell public health specialist Jay Bhattacharya during the COVID years, along with countless others.

Not surprisingly, some of these mis/dis/mal-information experts seem quite happy with government efforts to force compliance. From a recent paper at Current Opinion in Psychology on social media compliance measures in Europe:

The platforms submitted their first baseline reports to the European Commission in early 2023. The reports detailed the ways in which the platforms were complying with their commitments under the Code of Practice such as determining whether political ads are eligible for monetization and whether they are clearly labeled as political. An audit of the reports by independent academics found that, with the exception of X/Twitter, the platforms by and large expressed compliance with the Code although their overall performance fell short of being satisfactory. Notwithstanding, the baseline reports confirmed the feasibility of a regulatory accountability process and the large platforms will continue to report their compliance every 6 months.

Lewandowsky, S., Ecker, U.K., Cook, J., van der Linden, S., Roozenbeek, J., & Oreskes, N. (2023). Misinformation and the epistemic integrity of democracy. Current Opinion in Psychology, 54, 101711.

The U.S. State Department is also involved in developing news censorship and management tools, and it is fiercely protective of them.

Are these academic disinformation experts neutral arbiters?

A key survey of 150 misinformation experts, published at Misinformation Review last July, admits that “Experts leaned strongly toward the left of the political spectrum.” That violates the first principle of managing information honestly: don’t appoint as arbiters people who are strongly biased toward one view of things.


The mis-dis-mal experts’ campaign hasn’t gone wholly unnoticed. There is currently a request from the U.S. Senate for more information on the National Science Foundation’s efforts to develop censorship tools under Track F.

One irony of the current situation is that — to take the COVID years as one example — a great deal of the information that did not reflect reality actually came from government via mainstream news sources. Better informed sources like Bhattacharya were often fighting a rearguard action. So policing everyone but the government is not the answer here.

It’s also worth reflecting on the fact that the disinformation experts typically claim to be defending democracy — and yet their principal weapon is indoctrination.

You may also wish to read: When censorship parades itself as a science… A House Subcommittee discovered that the National Science Foundation — which is supposed to support science and engineering — is readying censorship tools. The bee in the bonnets of the researchers who received the funding for the internet censorship program is that Americans can’t tell fact from fiction.


Denyse O'Leary

Denyse O'Leary is a freelance journalist based in Victoria, Canada. Specializing in faith and science issues, she is co-author, with neuroscientist Mario Beauregard, of The Spiritual Brain: A Neuroscientist's Case for the Existence of the Soul; and with neurosurgeon Michael Egnor of the forthcoming The Human Soul: What Neuroscience Shows Us about the Brain, the Mind, and the Difference Between the Two (Worthy, 2025). She received her degree in honors English language and literature.
