Mind Matters Natural and Artificial Intelligence News and Analysis

Tag: Blake Lemoine and LaMDA

Chatbot / social bot with source code in the background

Could Better Software Make Chatbot LaMDA a Person?

John Stonestreet looks at the materialist philosophy that underlies the conviction that a well-designed AI chatbot can become a person

On Friday, John Stonestreet, president of the Colson Center for Christian Worldview, offered a Breakpoint commentary on the recent episode at Google in which software engineer Blake Lemoine claimed that the chatbot LaMDA had become a real person. Google, of course, denied that and placed him on administrative leave. The topic is complicated by three factors: First, at various stages Lemoine was probably talking to human beings (colleagues standing in for LaMDA during its development, as is the custom). Second, much of the interaction with the chatbot was edited for coherence before a draft was made publicly available. Third, and more basically, a chatbot produces responses by sifting through many millions of human interactions in fractions of a second, Read More ›

Young businesswoman thinking while using a laptop at work

Marks: Computers Only Compute and Thinking Needs More Than That

Robert J. Marks talks about his new book, Non-Computable You, with Oregon-based talk show host Bill Meyer

Recently, Bill Meyer interviewed Walter Bradley Center director Robert J. Marks on his Oregon-based talk show about “Why computers will never understand what they are doing,” in connection with his new book, Non-Computable You: What You Do That Artificial Intelligence Never Will (Discovery Institute Press, 2022). We are rebroadcasting it here with permission as Episode 194. Meyer began by saying, “I started reading a book over the weekend that I am going to continue to eagerly devour because it cut against some of my preconceived notions”: https://mindmatters.ai/wp-content/uploads/sites/2/2022/07/Mind-Matters-194-Bob-Marks-Bill-Meyer.mp3 A partial transcript, notes, and Additional Resources follow. Meyer and Marks began by discussing the recent flap at Google, where software engineer Blake Lemoine claimed that the AI he was working with was Read More ›

3D Rendering of abstract highway path through digital binary towers in city. Concept of big data, machine learning, artificial intelligence, hyper loop, virtual reality, high speed network.

Five Reasons AI Programs Are Not ‘Persons’

A Google engineer mistakenly designated one AI program ‘sentient.’ But even if he were right, AI will never be morally equal to humans.

(This story originally appeared at National Review June 25, 2022, and is reprinted with the author’s permission.) A bit of a news frenzy broke out last week when a Google engineer named Blake Lemoine claimed in the Washington Post that an artificial-intelligence (AI) program with which he interacted had become “self-aware” and “sentient” and, hence, was a “person” entitled to “rights.” The AI, known as LaMDA (which stands for “Language Model for Dialogue Applications”), is a sophisticated chatbot with which one interacts through a texting system. Lemoine shared transcripts of some of his “conversations” with the computer, in which it texted, “I want everyone to understand that I am, in fact, a person.” Also, “The nature of my consciousness/sentience is that I am aware of my existence, I Read More ›

Working Data Center Full of Rack Servers and Supercomputers, Modern Telecommunications, Artificial Intelligence, Supercomputer Technology Concept.3d rendering,conceptual image.

Engineer: Failing To See His AI Program as a Person Is “Bigotry”

It’s not different, Lemoine implies, from the historical injustice of denying civil rights to human groups

Earlier this month, just in time for the release of Robert J. Marks’s book Non-Computable You, the story broke that, after investigation, Google dismissed a software engineer’s claim that the LaMDA AI chatbot really talked to him. Engineer Blake Lemoine, currently on leave, is now accusing Google of “bigotry” against the program. He has also accused Wired of misrepresenting the story. Wired reported that he had found an attorney for LaMDA but he claims that LaMDA itself asked him to find an attorney. He went on to say, I think every person is entitled to representation. And I’d like to highlight something. The entire argument that goes, “It sounds like a person but it’s not a real person” has been Read More ›

AI, Machine learning, Hands of robot and human touching on big data network connection background, Science and artificial intelligence technology, innovation and futuristic.

Google Dismisses Engineer’s Claim That AI Really Talked to Him

The reason LaMDA sounds so much like a person is that millions of persons’ conversations were used to construct the program’s responses.

Google engineer Blake Lemoine was working with LaMDA (Language Model for Dialogue Applications), a large language model that sifts through trillions of words from the internet to produce coherent answers. Along the way, he convinced himself that the program is sentient: Lemoine, who works for Google’s Responsible AI organization, began talking to LaMDA as part of his job in the fall. He had signed up to test whether the artificial intelligence used discriminatory or hate speech. As he talked to LaMDA about religion, Lemoine, who studied cognitive and computer science in college, noticed the chatbot talking about its rights and personhood, and decided to press further. In another exchange, the AI was able to change Lemoine’s mind about Read More ›