Robot with Artificial Intelligence observing human skull in Evolved Cybernetic organism world. 3d rendered image (Adobe Stock)

Researcher Warns: AI Can Develop Lethal Chemical Weapons Swiftly

Much public discussion of AI’s dangers turns on AI “taking over.” That’s hardly the serious risk we face

The dangers of out-of-control artificial intelligence (AI) are sometimes misrepresented. The sci-fi version is that AI decides to take over from mere humans, like the iconic HAL 9000 in 2001: A Space Odyssey (1968).

A more likely danger is bad actors using enormous computing power to cause harms that they could not have managed on their own. Here’s a sobering example:

It took less than six hours for drug-developing AI to invent 40,000 potentially lethal molecules. Researchers put AI normally used to search for helpful drugs into a kind of “bad actor” mode to show how easily it could be abused at a biological arms control conference.

All the researchers had to do was tweak their methodology to seek out, rather than weed out, toxicity. The AI came up with tens of thousands of new substances, some of which are similar to VX, the most potent nerve agent ever developed. Shaken, they published their findings this month in the journal Nature Machine Intelligence.

Justine Calma, “AI suggested 40,000 new possible chemical weapons in just six hours” at The Verge (March 17, 2022). The paper is open access.

When The Verge interviewed the paper’s lead author, Fabio Urbina, he noted,

For me, the concern was just how easy it was to do. A lot of the things we used are out there for free. You can go and download a toxicity dataset from anywhere. If you have somebody who knows how to code in Python and has some machine learning capabilities, then in probably a good weekend of work, they could build something like this generative model driven by toxic datasets. So that was the thing that got us really thinking about putting this paper out there; it was such a low barrier of entry for this type of misuse.

Justine Calma, “AI suggested 40,000 new possible chemical weapons in just six hours” at The Verge (March 17, 2022)
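To make the mechanism concrete, here is a deliberately abstract sketch in Python (the language Urbina mentions) of the kind of multi-objective score a generative drug-design loop might maximize. The function and parameter names are hypothetical, invented for illustration only, and no real toxicity model, dataset, or chemistry is involved; the sketch simply shows how the direction of such a search can hinge on a single weight in its objective.

    # Illustrative only: a generic scoring function of the kind a generative
    # molecular-design loop might maximize. `predicted_activity` and
    # `predicted_toxicity` are hypothetical stand-ins for model outputs;
    # no real predictor, dataset, or chemistry appears here.

    def candidate_score(predicted_activity: float,
                        predicted_toxicity: float,
                        toxicity_weight: float = -1.0) -> float:
        """Combine two model predictions into one number to be maximized.

        With toxicity_weight = -1.0 (the normal drug-discovery setting),
        more toxic candidates score lower and are weeded out. Inverting
        that single weight turns the same pipeline toward toxicity, which
        is the kind of "tweak" the article describes.
        """
        return predicted_activity + toxicity_weight * predicted_toxicity

That a safeguard can come down to little more than a configuration choice is why Urbina describes the barrier to misuse as so low.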

The AI came up with substances similar to VX, the most powerful nerve agent known.

Keeping the lid on isn’t going to be easy. But we could perhaps begin by recognizing that it’s the humans who supply the motivation. The AI is not taking over. It is just helping them do what they want to do much more quickly.


You may also wish to read: AI is not taking away our jobs — because it can’t do them. Computer science prof Robert J. Marks talks with KSCJ talk show host Mark Hahn about HAL 9000 and the opportunities and fundamental limits of AI. One key difference between humans and AI is that humans can deal with new information. AI can only address information for which it is programmed.


Mind Matters News

Breaking and noteworthy news from the exciting world of natural and artificial intelligence at MindMatters.ai.
