Gary Smith’s New Book Reviewed in Washington Post

Smith argues that science itself is being undermined by the tools scientists use

Walter Bradley Center Senior Fellow Gary Smith’s book Distrust: Big Data, Data-Torturing, and the Assault on Science was reviewed in The Washington Post today. Smith is a frequent contributor to Mind Matters and teaches economics at Pomona College. In his new book, Smith argues that science itself is being “undermined” by the tools scientists use. Reviewer Abby Ohlheiser writes,

Smith, an economist whose work often examines the misuse of data and statistics in a variety of disciplines, argues that the current crisis of trust in science falls at the intersection of three forces: disinformation, data torturing and data mining. Disinformation, as Smith writes, is “as old as the human race,” but accelerated in speed and reach alongside social media. Data torturing describes the practice of manipulating data until it yields the desired result — for instance, by simply throwing out results that contradict a study’s argument. And data mining, driven by the abundance of available data and the speed with which computer algorithms can comb through it, involves pulling correlations from data that could be coincidental and imbuing them with meaning. Drawing on examples ranging from bitcoin to weight loss to artificial intelligence, Smith explains how “science’s hard-won reputation is being undermined by tools invented by scientists.”

Book review: ‘Distrust: Big Data, Data-Torturing, and the Assault on Science.’ – The Washington Post

Smith has written much about how technologies like OpenAI’s ChatGPT are unreliable and give downright false responses to an array of prompts. Without oversight and discernment, we are liable to lose sight of what only human beings can do and how to properly use these tools. Ohlheiser continues,

Smith’s recommendations for reforming how data is used — including providing more support for reproducibility and replication research, statistical literacy courses, and the prioritization of studies that provide detailed descriptions of their research plans before actually starting — are well taken.

There’s plenty of nonsense out there, and we’re all perfectly able to recognize it if we have the right tools. As Smith writes, “Humans know better.” Or, at any rate, we should.

Read the rest of the review here and be sure to check out Smith’s newest book, out now via Oxford University Press and available wherever books are sold.
