Information, Evolution & AI: A Conversation with William Dembski
In discussion with neurosurgeon Michael Egnor, Dembski stresses that AI is a tool, not a mind. Treating it otherwise leads to addiction, manipulation, and cultural decline.

In a wide-ranging episode of our Mind Matters News podcast series, mathematician and philosopher William Dembski joins neurosurgeon Michael Egnor to explore the deep connections between information, evolution, and artificial intelligence.
Their discussion reveals how the concept of information shapes our understanding of both nature and technology — and how misunderstanding it can lead to dangerous assumptions about machines and human purpose.
The Principle of Conservation of Information
Dembski opens with a simple illustration: someone searches for a hidden Easter egg, guided by clues like “warmer” or “colder.” These clues help, but Dembski asks: Where did the guide get the information? Finding the right instructions can be just as hard as the original search, and sometimes harder.
This idea is called the conservation of information: you can’t magically reduce the difficulty of a problem by adding guidance, because the guidance itself must come from somewhere. In mathematical terms, you never get something from nothing. Any system that solves a complex problem must contain, or have access to, the information needed to solve it.
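To make the point concrete, here is a minimal sketch, not from the conversation and purely illustrative, comparing a blind search for a hidden egg with a search steered by “warmer/colder” hints. The hints cut the work enormously, but notice where they come from: the hint function is built directly from the hidden location, so the information was supplied in advance rather than created by the search.

```python
import random

N = 1_000_000                    # possible hiding spots
target = random.randrange(N)     # the "guide" is constructed from the answer

def blind_search():
    """Check spots in random order until the egg turns up."""
    spots = list(range(N))
    random.shuffle(spots)
    for tries, spot in enumerate(spots, start=1):
        if spot == target:
            return tries

def guided_search():
    """Halve the interval using warmer/colder hints.
    The hints exist only because the guide already knows `target`."""
    lo, hi, tries = 0, N - 1, 0
    while lo <= hi:
        tries += 1
        guess = (lo + hi) // 2
        if guess == target:
            return tries
        elif guess < target:     # hint: warmer lies higher
            lo = guess + 1
        else:                    # hint: warmer lies lower
            hi = guess - 1

print("blind search tries: ", blind_search())    # about N/2 on average
print("guided search tries:", guided_search())   # about log2(N), roughly 20
```

The guided search finishes in about twenty steps only because about twenty bits of information about the target were baked into the hints beforehand.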
A challenge to Darwinian evolution
Dembski uses this principle to challenge Darwinian evolution. Conventional accounts, he argues, assume that natural processes can produce complex life from simple beginnings without an intelligent source. But that assumption runs afoul of the conservation of information: the complexity on display requires prior information, and that information has to come from somewhere.
He critiques a common illustration of evolution: the familiar claim that monkeys at typewriters could eventually produce the works of Shakespeare by chance, helped along by a correcting agent (a lab technician armed with White-Out).
That lab tech, Dembski says, already knows the target outcome and introduces outside knowledge. Without that external input, random mutation and natural selection alone are insufficient to explain biological complexity.
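A small simulation, offered here purely as an illustration (the target phrase, a line from Hamlet often used in such examples, stands in for the works of Shakespeare), shows where the information enters. The monkeys type at random; the technician keeps a letter only when it matches the target and whites out everything else. The phrase appears after a few hundred keystrokes rather than an astronomically large number, but only because the technician consults the target at every step.

```python
import random
import string

TARGET = "METHINKS IT IS LIKE A WEASEL"     # illustrative stand-in target
ALPHABET = string.ascii_uppercase + " "     # 27 possible keystrokes

def monkeys_with_technician():
    """Monkeys strike random keys; the technician keeps a character only
    if it matches TARGET and whites out everything else. The comparison
    against TARGET is where the information actually comes from."""
    page = [None] * len(TARGET)
    keystrokes = 0
    while None in page:
        for i, kept in enumerate(page):
            if kept is None:
                keystrokes += 1
                letter = random.choice(ALPHABET)
                if letter == TARGET[i]:     # technician checks the target
                    page[i] = letter        # correct letter is kept
                # otherwise: White-Out, and the slot stays blank
    return "".join(page), keystrokes

phrase, strokes = monkeys_with_technician()
print(phrase)
print("keystrokes:", strokes)   # typically a few hundred, versus roughly
                                # 27**28 attempts for unassisted random typing
```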
Information and entropy
Egnor and Dembski explore how the conservation of information relates to physics. Just as energy is conserved and disorder tends to increase (entropy), information follows similar constraints. Dembski refers to Maxwell’s demon, a thought experiment where an intelligent agent organizes molecules to reverse entropy. The key point: reversing disorder requires intelligence and information. This ties into a larger worldview where intelligence is not just a product of nature — it may be a fundamental part of it.
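As a rough toy model, not taken from the episode, one can simulate the demon in a few lines: particles begin thoroughly mixed, and the demon sorts faster ones into one chamber and slower ones into the other. Order appears, but only at the price of one measurement, one fast-or-slow decision, per particle.

```python
import random
import statistics

# Toy Maxwell's demon: particles start mixed; the demon opens the trapdoor
# only to send fast particles right and slow particles left. The sorting
# costs one fast-or-slow decision (one bit of information) per particle.
random.seed(0)
energies = [random.gauss(0, 1) ** 2 for _ in range(10_000)]  # toy kinetic energies
threshold = statistics.median(energies)

left, right, decisions = [], [], 0
for energy in energies:
    decisions += 1                # the demon measures this particle
    (right if energy > threshold else left).append(energy)

print("mean energy, left chamber :", round(statistics.mean(left), 3))
print("mean energy, right chamber:", round(statistics.mean(right), 3))
print("fast-or-slow decisions made:", decisions)
```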
The myth of artificial general intelligence (AGI)
Dembski is skeptical of AGI — the idea that machines will someday think and reason like humans. He argues that it is a myth. While machines can perform specific tasks using massive amounts of data, humans achieve much more with much less.
For example, Tesla’s self-driving system is trained on billions of video frames, while a person learns to drive with vastly less input. For Dembski, this shows that human intelligence is fundamentally different and probably cannot be matched by machines.
The danger of idolizing AI
Dembski also warns that AGI is becoming an “idol” — something people falsely believe will save or replace us. He notes that even Silicon Valley elites often send their kids to schools with minimal screen time, recognizing the need for real human connection. AI is a tool, not a mind. Treating it otherwise leads to addiction, manipulation, and cultural decline.
Final thoughts
Dembski and Egnor encourage a more thoughtful approach to both science and technology. Information isn’t free, and machines aren’t magic. By remembering what makes us human, we can use technology without losing ourselves to it.