
Yes, AI can generate science too! Too often, it’s junk science


At Gizmodo, science writer Isaac Schultz looks at the problem of AI-generated junk science on Google Scholar. Researchers publishing in the Harvard Kennedy School Misinformation Review found a good deal of undisclosed chatbot-generated research:

The team found that two-thirds of the papers they studied were at least in part produced through undisclosed use of GPTs. Of the GPT-fabricated papers, the researchers found that 14.5% pertained to health, 19.5% pertained to the environment, and 23% pertained to computing.

“Most of these GPT-fabricated papers were found in non-indexed journals and working papers, but some cases included research published in mainstream scientific journals and conference proceedings,” the team wrote.

“AI-Generated Junk Science Is a Big Problem on Google Scholar, Research Suggests,” January 21, 2025

The authors write,

Academic journals, archives, and repositories are seeing an increasing number of questionable research papers clearly produced using generative AI. They are often created with widely available, general-purpose AI applications, most likely ChatGPT, and mimic scientific writing. Google Scholar easily locates and lists these questionable papers alongside reputable, quality-controlled research. Our analysis of a selection of questionable GPT-fabricated scientific papers found in Google Scholar shows that many are about applied, often controversial topics susceptible to disinformation: the environment, health, and computing. The resulting enhanced potential for malicious manipulation of society’s evidence base, particularly in politically divisive domains, is a growing concern. (The paper is open access.)

Schultz points out that Google Scholar is easy to use but not weighted in any way toward academic respectability:

The way Google Scholar pulls research from around the internet, according to the recent team, does not screen out papers whose authors lack a scientific affiliation or peer-review; the engine will pull academic bycatch—student papers, reports, preprints, and more—along with the research that has passed a higher bar of scrutiny.

“AI-Generated Junk Science Is a Big Problem on Google Scholar, Research Suggests,” January 21, 2025

Ironically, people who want to promote science could end up doing more harm — at least in certain respects — than people who would suppress or manipulate it, particularly in minefields like “politically divisive domains.”


