
Let’s Call AI What It Really Is: Faux Intelligence
Gary Smith at Salon: While GPT-3 can string words together in convincing ways, it has no idea what the words mean.

Pomona College business and investments professor Gary Smith warns Salon readers not to be too gullible about what human-sounding chatbots really amount to. He notes that in the 1960s, a pioneering chatbot called ELIZA convinced many psychiatric patients that they were interacting with a real psychiatrist. The machine simply repeated the patients' statements back to them as questions, a popular psychiatric technique at the time because it drew more and more discussion out of the patient (a minimal sketch of that trick appears after this excerpt). The patients' belief that they were interacting with a human being came to be called the Eliza effect.

Has much changed? If you play around with GPT-3 (and I encourage you to do so), your initial response is likely to be astonishment: a full-blown Eliza effect.
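For the curious, here is a minimal sketch, in Python, of the kind of reflection trick ELIZA relied on: swap first- and second-person words and echo the statement back as a question. This is an illustration of the technique only, not Weizenbaum's original script; the pronoun table and the question template are invented for this example.

```python
# Minimal, hypothetical sketch of ELIZA-style reflection: swap pronouns
# and echo the speaker's statement back as a question. Not Weizenbaum's
# original script, just a demonstration of how little machinery is needed.

# Invented first-/second-person swaps for this example.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "mine": "yours",
    "am": "are", "you": "I", "your": "my", "yours": "mine",
}

def reflect(statement: str) -> str:
    """Turn a statement back into a question, swapping person."""
    words = statement.lower().rstrip(".!?").split()
    swapped = [REFLECTIONS.get(word, word) for word in words]
    return "Why do you say " + " ".join(swapped) + "?"

if __name__ == "__main__":
    print(reflect("I am unhappy with my job."))
    # -> Why do you say you are unhappy with your job?
```

A few lines of string substitution are enough to keep a conversation going, which is the point of the anecdote: the appearance of understanding requires no understanding at all.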