Mind Matters: Natural and Artificial Intelligence News and Analysis

Tag: Martin Seligman

[Image: concept illustration of AI hallucinations, in which a model generates false or illogical information not based on real data or events but presents it as fact]

Just When Human Reason is Most Productive — AI Makes Things Up

In Part 2, we see how the ability to handle only one type of truth limits AI: AI models are fundamentally untethered from reality.
The gap between human and machine ways of knowing can be thought of as the correspondence horizon. When AI crosses that horizon, it produces plausible bullshit. Read More ›
[Image: glowing light bulb in the dark]

Why AI Breaks Down Where Human Creativity Begins

Part 1: AI can handle statements that are internally coherent, but that is not the same thing as correspondence with reality.
In short, philosophers distinguish two fundamental theories of truth, correspondence and coherence, and AI handles only coherence. Read More ›