Evolution Journal Editors Resign After AI Takes Over
Publisher Elsevier seems to have created a lot of extra work for the editors by introducing AI-generated errors into the publishing process.

From Retraction Watch, we learn that the entire editorial board of the Journal of Human Evolution (Elsevier) has resigned en masse (all but one), citing, among other things, AI interference with their work:
Among other moves, according to the statement, Elsevier “eliminated support for a copy editor and special issues editor,” which the editors interpreted as meaning that “editors should not be paying attention to language, grammar, readability, consistency, or accuracy of proper nomenclature or formatting.” The editors say the publisher “frequently introduces errors during production that were not present in the accepted manuscript”:
“In fall of 2023, for example, without consulting or informing the editors, Elsevier initiated the use of AI during production, creating article proofs devoid of capitalization of all proper nouns (e.g., formally recognized epochs, site names, countries, cities, genera, etc.) as well italics for genera and species. These AI changes reversed the accepted versions of papers that had already been properly formatted by the handling editors. This was highly embarrassing for the journal and resolution took six months and was achieved only through the persistent efforts of the editors. AI processing continues to be used and regularly reformats submitted manuscripts to change meaning and formatting and require extensive author and editor oversight during proof stage.”
“Evolution journal editors resign en masse to protest Elsevier changes,” December 27, 2024
Here’s the letter in full. Excerpt:
Harmful changes to the journal’s principles and structures: Over the past 10 years Elsevier made a number of changes that run counter to these successful principles and are harmful to JHE. These changes have increasingly placed Elsevier, not the EB [editorial board], in control of scientific oversight of the journal and reduced production quality. Elsevier eliminated support for a copy editor and special issues editor. Elsevier’s response to our repeated concerns about the need for a copy editor has been to maintain that the editors should not be paying attention to language, grammar, readability, consistency, or accuracy of proper nomenclature or formatting. This advice runs counter to the journal’s longstanding emphasis on making every paper as widely accessible and citable as possible, and is especially important for a journal like JHE, which publishes papers dealing with topics that follow international codes such as systematics, stratigraphy, geology, geochronology, and so forth. Elsevier does not attend to this and frequently introduces errors during production that were not present in the accepted manuscript.
Paleontologist John Hawks, who has frequently published there, comments:
As an author I was shocked to read the editors’ statement on how AI has affected their process. The press release says that editors were not told in advance about the use of AI when it was introduced in 2023…
I’ve published four articles in the journal during the last two years, including one in press now, and if there was any notice to my coauthors or me about an AI production process, I don’t remember it.
The Journal of Human Evolution Guide for Authors says nothing about AI in the editorial process. But it does extensively address the use of AI by authors. The journal forbids the use of “generative AI or AI-assisted tools” in images or figures. The journal allows the use of AI-assisted technologies in writing and editing, but clearly requires authors to declare such uses of AI in their work.
It seems to me that if the journal followed its own policy, all published articles since 2023 would include a disclosure that AI-assisted technologies were used in the final product!
“A sad end for the Journal of Human Evolution,” December 28, 2024
But wasn’t this bound to happen eventually?
The fact that the people in charge of the process see no need to do anything like what Hawks suggests should tell us something about the future they envision. He goes on to make clear that he does not oppose AI in publishing:
But it’s bad for anyone to use AI to reduce or replace the scientific input and oversight of people in research—whether that input comes from researchers, editors, reviewers, or readers. It’s stupid for a company to use AI to divert experts’ effort into redundant rounds of proofreading, or to make disseminating scientific work more difficult.
“Sad end”
But are we possibly missing the point? The chosen path will not, of course, enable Elsevier to help its human staff be their best. And maybe that is not really where the firm wants to go anyway. Do they not rather envision a future where machines do all the real work? Getting published in a journal then becomes just the researchers’ ticket on a gravy train to… well, wherever they end up, in a world that no longer needs human intelligence.
Of course — for reasons we often cover here at Mind Matters News — that world will never happen. But don’t overlook the fact that many key actors in science publishing may believe that it will happen.
And perhaps a journal that aspires to explain the origin of the unique human mind without any reference to the Mind that created the universe was bound to end up in a place where machine intelligence is naturally thought to just plain substitute for human intelligence. Thus, it is hard to know what to suggest as an alternative to just closing down and letting machine publishing rule, with whatever outcome.
JHE is hardly alone
Retraction Watch notes, “The mass resignation is the 20th such episode since early 2023, according to our records. Earlier this year, Nature asked, ‘what do these group exits achieve?’”
Sounds like all is not well in science publishing generally. From International Science Council, we learn,
Firstly, although many scientific journals and papers maintain high standards, too many lack proper editorial oversight, many lack rigour and integrity, some engage in fraudulent practices, few observe the most basic of scientific essentials, that evidential data and metadata for a truth claim should be exposed in parallel to a published paper, and agreed standards for overall governance of the process are lacking.
Geoffrey Boulton and Moumita Koley, “More is not better: the developing crisis of scientific publishing,” July 2, 2024
That is just the environment for machine intelligence to rule! Yes, it’s a mess but maybe there’s no one there now who actually cares.