Mind Matters: Natural and Artificial Intelligence News and Analysis

Tag: LaMDA (claims for sentience)

Panel on AI at COSM 2022

Experts at COSM Debate Whether Chatbot was Sentient

The conversation turned out quite pleasant. Google fired Blake Lemoine in 2022 - but what really happened there?

Last Thursday morning at COSM, a panel of experts debated whether truly sentient artificial intelligence (AI) could potentially exist — and even whether it already does. Robert J. Marks, distinguished professor of electrical and computer engineering at Baylor University, opened by criticizing the Turing test as a measure of whether we’ve produced genuine AI. Developed by the famous English mathematician and World War II codebreaker Alan Turing, the test holds that if we can’t distinguish a machine’s conversational discourse from that of a real human, then the machine must exhibit humanlike intelligence. Marks maintains that this is the wrong test for detecting true AI. In his view, the Turing test fails because it “looks at a book and tries to judge Read More ›

Chatbot / social bot with source code in the background

Google’s Chatbot LaMDA Sounds Human Because — Read the Manual…

What would you expect LaMDA to sound like? Whales? ET? I propose a test: “Human until PROVEN otherwise”

Recently, Google employee Blake Lemoine caused a media storm over the LaMDA chatbot he was working on, which he claims is sentient (that it feels things the way a human being does). A heavily edited transcript has been released that shows him and a collaborator having a very coherent conversation with LaMDA. Many have been quick to dismiss his claims about the chatbot’s sentience, accusing the Googler of falling prey to the ELIZA effect: anthropomorphizing a probability distribution over words (and thus believing that he is talking to a human). The accusation is that Lemoine generated a large number of dialogs, then edited down the exchange to create a coherent narrative. Google placed Lemoine on leave, technically for breaking the non-disclosure agreement (NDA) that Read More ›
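
To unpack what critics mean by “a probability distribution over words,” here is a minimal, purely illustrative Python sketch of sampling text from a toy next-word distribution. The tiny vocabulary and the probabilities are invented for this example and say nothing about LaMDA’s actual, far larger neural model; the point is only that fluent-looking output can come from repeatedly sampling a distribution, with no understanding anywhere in the process.

    import random

    # A toy "language model": for each word, a probability distribution
    # over possible next words. (Invented numbers, for illustration only.)
    toy_model = {
        "i":     {"feel": 0.5, "think": 0.3, "am": 0.2},
        "feel":  {"happy": 0.6, "curious": 0.4},
        "think": {"deeply": 0.5, "so": 0.5},
        "am":    {"here": 1.0},
    }

    def next_word(word):
        """Sample the next word from the distribution conditioned on `word`."""
        dist = toy_model.get(word, {"...": 1.0})
        words = list(dist.keys())
        weights = list(dist.values())
        return random.choices(words, weights=weights, k=1)[0]

    def generate(start, length=5):
        """Generate a short word sequence by repeatedly sampling the next word."""
        words = [start]
        for _ in range(length):
            words.append(next_word(words[-1]))
        return " ".join(words)

    print(generate("i"))   # e.g. "i feel curious ... ... ..."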