
Is Dembski’s Explanatory Filter the Most Widely Used Theory Ever?

It turns out that legions of critics of the Filter use it all the time, without noticing

William Dembski created quite a stir in the world of information theory with his book The Design Inference. For the first time, he outlined a rigorous method for identifying design, which he called the explanatory filter. Since then, many critics have claimed that Dembski’s proposed filter is without merit because it has seen little practical application in the two decades since its invention. But are the critics right? Or are they wrong in the way that a fish doesn’t recognize water because water is the very atmosphere of the fish’s existence?

Let us first remind ourselves of Dembski’s explanatory filter. His filter proceeds in three main steps.

  1. Eliminate events of large probability (necessity)
  2. Eliminate events of medium probability (chance)
  3. Specify the event of small probability with an independent knowledge source
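The three steps can be rendered as a toy decision procedure. This is a minimal sketch, not anything Dembski prescribes: the "necessity" threshold is purely illustrative, though the small-probability cutoff echoes Dembski's "universal probability bound" of roughly 1 in 10^150.

```python
# A toy sketch of the three-step explanatory filter. The 0.5 threshold
# is illustrative; 1e-150 echoes Dembski's universal probability bound.
def explanatory_filter(probability, matches_specification):
    if probability > 0.5:        # step 1: large probability -> necessity
        return "necessity"
    if probability > 1e-150:     # step 2: medium probability -> chance
        return "chance"
    if matches_specification:    # step 3: small probability plus an
        return "design"          #   independent specification -> design
    return "chance"              # small probability alone proves nothing
```

Note that a tiny probability by itself never yields "design" in this sketch; the independent specification in step 3 is what does the work.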

The structure is short but dense. It can best be understood by revisiting a common argument between theists and atheists:

Theist: Look at all the very improbable biological structures. They cannot all have occurred by chance!

Atheist: A specific sequence of 100 coin flips is extraordinarily improbable; you’ll never see it again. But it happens.

Theist: Blimey, I’m stymied!

Atheist: Exactly, you have just fallen prey to the Texas sharpshooter fallacy:

You will get a chortle or two from Spurious Correlations, a web page devoted to graphically persuasive relationships among pairs of sets of entirely unrelated data. For example, you can see the graph of “US spending on science, space and technology” superimposed on that of “Suicides by hanging, strangulation and suffocation.” The staggering 99.79% overlap is a classic in correlation without causation.

Robert J. Marks, “Study shows, eating raisins causes plantar warts” at Mind Matters News

In other words, how do we know whether an unusual sequence or correlation is meaningful?

The theist has followed Dembski’s filter for the first two steps but ignored the third (specify the event of small probability with an independent knowledge source), which is the kicker. It is this third, specification step that eliminates the sharpshooter fallacy. If our coin flipper sees a sequence of 100 heads, he is well within his rights to say it did not occur by chance, provided he can bring an independent knowledge source to bear; with that specification in hand, he need not worry about the sharpshooter fallacy. This is the power of specification.
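The coin-flip point can be made concrete. In the sketch below (an illustration, with "all heads" standing in for an independent specification), a random 100-flip sequence and the all-heads sequence are equally improbable, but only one matches a pattern describable without reference to the outcome itself:

```python
import random

# Independent specification: a pattern we can state without reference
# to this particular outcome (here, "all heads").
def is_specified(flips):
    return all(f == "H" for f in flips)

random.seed(0)  # fixed seed so the sketch is reproducible
random_flips = [random.choice("HT") for _ in range(100)]
all_heads = ["H"] * 100

# Both sequences have probability 2**-100, yet only the all-heads
# sequence matches the independent specification.
```

The sharpshooter fallacy is drawing the target (the "specification") after seeing where the bullets landed; here the target is fixed in advance.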

Okay, so where does Dembski’s explanatory filter come in? It turns out, Dembski’s filter is the bedrock of our modern information technology. The ability to eliminate random chance and then infer an independent pattern is the fundamental principle behind:

  1. Communication
  2. Cryptography
  3. Authentication

Each of these tasks is an essential component of our modern digital economy.

Claude Shannon’s channel capacity theorem is the Explanatory Filter as applied to communication. The mechanics of the filter allow us to derive signal encodings that enable us to reliably transmit signals even when the transmission lines are noisy.

In other words, when we receive the encoded letter ‘A’, the encoding allows us to eliminate the chance occurrence of ‘A’ and thus infer the intent to transmit ‘A’ on the part of the signal originator. The size of the encoding eliminates chance and the particular codes are the specification. Without Shannon’s application of the filter’s logic, the internet and cable TV would not exist.
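The simplest error-correcting code illustrates the idea, though real channel codes are far more sophisticated and much closer to Shannon’s capacity limit. In a threefold repetition code, the redundancy is what lets the receiver rule out chance corruption and infer the intended bit:

```python
# A 3x repetition code: each bit is sent three times, and the receiver
# decodes by majority vote. Far from channel capacity, but it shows how
# redundancy lets the receiver eliminate chance noise.
def encode(bits, r=3):
    return [b for bit in bits for b in [bit] * r]

def decode(coded, r=3):
    # Majority vote over each block of r repeated bits.
    return [1 if sum(coded[i:i + r]) > r // 2 else 0
            for i in range(0, len(coded), r)]
```

Flip any single transmitted bit and the majority vote still recovers the original message, so the receiver can infer the sender’s intent despite the noise.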

Modern cryptography runs Dembski’s explanatory filter in reverse, transforming a message so that an uninformed observer sees only random noise. With the cipher key, however, a receiver can eliminate the appearance of chance and infer a message, since the decrypted text can be independently specified as dictionary words forming a concise, meaningful text. The encryption method eliminates chance decryptions and the decryption key is the specification. Encryption enables secure communication online, where cybercriminals are rampant; without it, they could view all of our personal details and easily steal our identities and assets.
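A toy example makes the reversal visible. The XOR cipher below is not remotely secure, and the three-word list is a hypothetical stand-in for a real dictionary, but it shows how the key plus an independent specification (the text parses into known words) separates a genuine decryption from noise:

```python
# A toy XOR stream cipher (NOT secure -- purely illustrative).
def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Stand-in for the independent knowledge source: a tiny word list.
WORDS = {"attack", "at", "dawn"}

def is_specified(text: bytes) -> bool:
    # The decryption is specified if it parses into known words.
    try:
        words = text.decode("ascii").split()
    except UnicodeDecodeError:
        return False
    return bool(words) and all(w in WORDS for w in words)
```

Decrypting with the right key yields specified text; the raw ciphertext, or a decryption with the wrong key, fails the specification and looks like chance.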

Authentication is the other side of the cryptography coin. An independent specification is used to verify that a person is who they say they are. Take, for instance, an ATM PIN. If the PIN were a single digit, the odds are that a random passerby could easily gain access to anyone’s account. With several digits, however, the PIN becomes very difficult to guess. Account holders can reliably demonstrate their identity and gain access while the chance of an invalid entry succeeding is kept small. The number of digits eliminates chance, and the particular PIN is the specification. Without this ability to authenticate account owners with very high reliability, the digital economy would fall apart.
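The arithmetic behind "the number of digits eliminates chance" is straightforward. A uniformly random guess matches an n-digit PIN with probability 1 in 10^n:

```python
# Probability that a uniformly random guess matches an n-digit PIN,
# assuming all 10**n codes are equally likely.
def guess_probability(digits: int) -> float:
    return 1 / 10**digits

# One digit: a passerby succeeds 1 time in 10.
# Four digits: 1 in 10,000 -- small enough that a few wrong attempts
# lock the card long before chance entry becomes plausible.
```

Each added digit shrinks the chance of a lucky guess tenfold, which is exactly the chance-elimination step of the filter.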

Looking at these three technologies that form the bedrock of the modern digital world, we can see that each is an exact implementation of Dembski’s explanatory filter. Each follows the steps to eliminate chance and independently specify the event as required by the filter. The end result is that the intelligent activity that drives the digital world is reliably transmitted, secured, and authenticated.

Ironically, the many critics of Dembski’s filter are using the technologies built upon the filter to publish their criticisms. They are just like the fish who do not know about water because it forms the atmosphere of their existence.

You may also enjoy:

Does information theory support design in nature? William Dembski makes a convincing case, using accepted information theory principles relevant to computer science. (Eric Holloway)


Study shows: Eating raisins causes plantar warts. Sure. Because, if you torture a Big Data enough, it will confess to anything. (Robert J. Marks)

Eric Holloway

Fellow, Walter Bradley Center for Natural & Artificial Intelligence
Eric Holloway has a Ph.D. in Electrical & Computer Engineering from Baylor University. He is a Captain in the United States Air Force and has served in the US and Afghanistan. He is the co-editor of the book Naturalism and Its Alternatives in Scientific Methodologies. Dr. Holloway is an Associate Fellow of the Walter Bradley Center for Natural and Artificial Intelligence.