Image: Soldiers using a drone for scouting during a military operation in the desert (Adobe Stock, licensed)

Book at a Glance: Robert J. Marks’s Killer Robots

What if ambitious nations such as China and Iran develop lethal AI military technology but the United States does not?

Artificial intelligence expert Robert J. Marks tackles the contentious subject of military drones in his just-published book, The Case for Killer Robots: Why America’s Military Needs to Continue Development of Lethal AI.

Many parties, including 30 countries, more than 110 NGOs, some 4,500 AI experts, the UN Secretary-General, the EU, and 26 Nobel Laureates, have called for lethal AI weapons to be banned. Dr. Marks, a Distinguished Professor of Electrical and Computer Engineering at Baylor University, disagrees.

What if ambitious nations such as China and Iran develop lethal AI military technology but the United States does not? Nations that wish to maintain independence (sovereignty), he argues, must remain competitive in military AI. (“Advanced technology not only wins wars but gives pause to otherwise aggressive adversaries.”)

Of course, the serious dangers posed by killer robots are not fanciful. Neither are the ethical challenges. But dealing with those dangers and challenges will require a sober assessment of reality, not simply appeals to emotion. (p. 17)

Unlike some commentators, Dr. Marks, who is also Director of the Walter Bradley Center for Natural and Artificial Intelligence at Discovery Institute, does not attribute superhuman powers or ambitions to AI. Calling that view “modern alchemy,” he notes, “Computer programs follow the instructions from their programs and nothing more.” It is the user who decides how the AI will be used, for better or worse.

The book is available as a free download here.

It is also available at Amazon as a Kindle edition (US$1.00) and as a paperback (US$6.95).

Some thoughts from the book:

New military technologies can mean the difference between life or death, between a drawn-out conflict with more casualties and more suffering and a conflict that is concluded quickly and decisively. (p. 19)

Long before the Cold War, the US and Nazi Germany were racing to develop an atomic bomb in WWII. The war in Europe ended before Germany succeeded. But suppose an American citizen-led peace movement had succeeded in banning development of the terrible bomb, and the war hadn’t ended when it did? Such protests didn’t happen, because the development of the bomb was kept top secret. Had the Nazis developed the atomic bomb first, flags in the US today might be sporting swastikas or big red circles on white instead of the Stars and Stripes. This is the scenario depicted in the Netflix alternative history series The Man in the High Castle (trailer) where the Allies lost WWII because the Nazis won the atomic bomb race. (p. 21)

Technical entrepreneur and maverick Peter Thiel claims that Google is “working with the Chinese military” and has been “thoroughly infiltrated” by Chinese spies. China continues to steal intellectual property from the United States, prompting mandated compliance officers to monitor intellectual property exports at all major US research institutions, including universities. And China is developing killer robots. China’s efforts in the development of AI prompt questions like “Will China lead the world in AI by 2030?” China clearly recognizes the importance of technology like AI in establishing industrial and military superiority. (p. 23)

You can listen to the Executive Summary and Introduction here (10:06 min).

You can also catch an audio interview with Dr. Marks by Ed Martin at Phyllis Schlafly Eagles (9:51 min, January 10, 2020). He will also be taking questions in an open-line format on Coast to Coast AM with Ian Punnett on January 17, 2020.


Further reading: Why we can’t just ban killer robots. Should we develop them for military use? The answer isn’t pretty. It is yes. (Robert J. Marks, February 15, 2019)

