Mind Matters Natural and Artificial Intelligence News and Analysis

Category: Military


AI in War Means Deepfakes as Well as Killerbots

In its Gerasimov and Primakov doctrines of warfare, Russia makes this clear

In 2013, Russian Army General Valery Gerasimov published a strategic doctrine (the Gerasimov Doctrine) in which he described applying non-military activities, including fake news stories and publications, trolling, gaslighting, and technology generally, as a form of warfare comparable to combat. The use of audio and video deepfakes is expanding in Ukraine, the Baltic states, Western nations, and Africa.

Read More ›

Iran Conflict Shows Why the US Needs Autonomous Lethal AI Weapons

The bipartisan National Security Commission on Artificial Intelligence recently released a sobering report on how far the U.S. lags in the development of killer robots

To remain competitive, the U.S. military must respond and adapt to new warfare technology including weapons using AI, sometimes called killer robots. This includes autonomous AI that acts on its own. Chillingly, unlike atomic weapons, the tools to construct lethal AI weapons are cheap and readily available to all.

Read More ›

Robert J. Marks: Peace May Depend on Killer Robots

Calls for a ban on killer robots affect the United States but not the non-democratic nations that are developing them now

In an op-ed at CNS this morning, Walter Bradley Center director Robert J. Marks summarizes his case, as an artificial intelligence expert, that the United States must remain competitive in military AI or, as it is called, “killer robots,” because hostile nations are forging ahead.

Read More ›

Killer Robots on the Radio

The issues around AI in warfare seem fairly simple until we look at them more closely

Can we afford to let hostile powers develop AI warfare and not do so ourselves? Artificial intelligence expert Robert J. Marks has been discussing the issue in podcasts with various hosts across the country. 

Read More ›

Book at a Glance: Robert J. Marks’s Killer Robots

What if ambitious nations such as China and Iran develop lethal AI military technology but the United States does not?

Artificial intelligence expert Robert J. Marks tackles the contentious subject of military drones in his just-published book, The Case for Killer Robots: Why America’s Military Needs to Continue Development of Lethal AI. Many sources (30 countries, 110+ NGOs, 4,500 AI experts, the UN Secretary General, the EU, and 26 Nobel Laureates) have called for these lethal AI weapons to be banned. Dr. Marks, a Distinguished Professor of Electrical and Computer Engineering at Baylor University, disagrees. What if ambitious nations such as China and Iran develop lethal AI military technology but the United States does not? Nations that wish to maintain independence (sovereignty), he argues, must remain competitive in military AI. (“Advanced technology not only wins wars but gives pause to…”)

Read More ›


Why AI Can’t Win Wars As If Wars Were Chess Games

Is Vladimir Putin right? Will whoever leads in AI rule the world? It’s not so simple

Whichever country becomes a leader in the sphere of AI and IA will do well. But whichever countries end up following the advice of these tools mindlessly will do so at their own great peril.

Read More ›

Why we can’t just ban killer robots

Should we develop them for military use? The answer isn’t pretty. It is yes.

Autonomous AI weapons are potentially within the reach of terrorists, madmen, and hostile regimes like Iran and North Korea. As with nuclear warheads, we need autonomous AI to counteract possible enemy deployment while avoiding its use ourselves.

Read More ›