A central problem in computer security, and in its cryptography branch in particular, is the one-way problem. Cryptography should function as a one-way street: you can go north but you can’t go south. So if a hacker doesn’t have the code to go north, he can’t go anywhere. That is where the computer security expert would like to leave the hacker… Is there such a thing as a one-way function in mathematics? Mathematician Erica Klarreich says probably yes, and explains what it looks like: To get a feel for how one-way functions work, imagine someone asked you to multiply two large prime numbers, say 6,547 and 7,079. Arriving at the answer of 46,346,213 might take some work, but it is eminently doable. However, Read More ›
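Klarreich’s arithmetic shows the asymmetry directly. As a minimal sketch (mine, not from the article), the Python below multiplies the two primes in one step, then recovers them by trial division, the brute-force way a hacker would have to “go south”:

```python
def factor(n):
    """Recover a factorization of n = p * q by brute-force trial division."""
    if n % 2 == 0:
        return 2, n // 2
    d = 3
    while d * d <= n:          # only need to search up to sqrt(n)
        if n % d == 0:
            return d, n // d   # found the smaller prime factor
        d += 2
    return n, 1                # n itself is prime

# Going "north" is one cheap multiplication:
product = 6547 * 7079          # 46,346,213

# Going "south" takes thousands of divisibility checks:
p, q = factor(product)         # (6547, 7079)
```

For 8-digit numbers the search is still quick, but for the 600-digit products used in real cryptography the same search would outlast the universe, which is what makes multiplication a candidate one-way function.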
In “Define information before you talk about it,” neurosurgeon Michael Egnor interviewed engineering prof Robert J. Marks on the way information, not matter, shapes our world (October 28, 2021). In the first portion, Egnor and Marks discussed questions like: Why do two identical snowflakes seem more meaningful than one snowflake? Then they turned to the relationship between information and creativity. Is creativity a function of more information? Or is there more to it? Now they ask: Does human intervention make any difference? Does Mount Rushmore have no more information than Mount Fuji? https://episodes.castos.com/mindmatters/3efb31b8-8406-43fb-a8f9-66a0b635215d-Mind-Matters-Episode-158-Robert-Marks-Egnor-Guest-Host-Information-Bingecast-rev1.mp3 This portion begins at 24:22 min. A partial transcript, Show Notes, and Additional Resources follow. Michael Egnor: Dr. Jeffrey Shallit, a mathematician at the University Read More ›
In last week’s podcast, “The Chaitin Interview II: Defining Randomness,” Walter Bradley Center director Robert J. Marks interviewed mathematician and computer scientist Gregory Chaitin on how best to describe true randomness, but also on what he recalls of Ray Solomonoff (1926–2009), described in his obituary as the “Founding Father of algorithmic information theory.” https://episodes.castos.com/mindmatters/Mind-Matters-125-Gregory-Chaitin.mp3 This portion begins at 10:30 min. A partial transcript, Show Notes, and Additional Resources follow. Gregory Chaitin (pictured): Ray Solomonoff was interested in prediction but I was more interested in looking at a given string of bits and asking, does it have structure or not, and the incompleteness results regarding this question. For example, most strings of bits have no structure, according to this definition. They Read More ›
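The “structure” Chaitin describes is uncomputable in general, but compression gives a crude, computable stand-in: a string with structure compresses well, while a typical (random) string does not. A minimal sketch using Python’s zlib (my choice of illustration, not Chaitin’s):

```python
import random
import zlib

structured = b"01" * 500     # 1,000 bytes with an obvious repeating pattern
random.seed(0)               # fixed seed so the example is repeatable
noisy = bytes(random.getrandbits(8) for _ in range(1000))

# Compressed length approximates how much "description" each string needs.
c_structured = len(zlib.compress(structured, 9))
c_noisy = len(zlib.compress(noisy, 9))
# The patterned string shrinks dramatically; the random one barely shrinks at all.
```

In Chaitin’s terms, the patterned string has a short program that generates it, while the random bytes have essentially none shorter than the string itself.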
Consciousness is the ultimate hard problem of philosophy of science. As of today, there is absolutely no scientific solution to the problem. The nature of consciousness seems ineffable: first-person experience appears to be a completely different category of existence than objective external description. This dilemma has led philosophers such as Daniel Dennett to use the ultimate solution: deny the problem exists. Unfortunately, that solution never worked for me at school. The objective reality of bad grades is quite hard to deny. Yet, we need not resort to Daniel Dennett’s ultimate solution. There are concrete things we can say about consciousness if we use the “many worlds” interpretation of quantum physics and the computer science concept of Kolmogorov complexity. Quantum physics Read More ›
What is specified complexity? What makes some information more meaningful than other information? How does information theory affect artificial intelligence? Dr. Michael Egnor discusses information theory, artificial intelligence, and mimetic contagion with Dr. Robert J. Marks. Show Notes 00:37 | Mount Rushmore vs. Mount Fuji 05:11 | Specified complexity 10:38 | How does a statue of Abraham Lincoln differ from Read More ›
At first, “What is information?” seems like a question with a simple answer. Stuff we need to know. Then, if we think about it, it dissolves into paradoxes. A storage medium—a backup drive, maybe—that contains vital information weighs exactly the same as one that contains nothing, gibberish, or dangerously outdated information. There is no way we can know which is which without engaging intelligently with the content. That content is measured in bits and bytes, not kilograms and joules—which means that it is hard to relate to other quantities in our universe. In this week’s podcast, “Robert J. Marks on information and AI, Part 1,” neurosurgeon Michael Egnor interviews Walter Bradley Center director and computer engineering prof Robert J. Marks on how we Read More ›
We know information when we see it. An article contains information. A photograph contains information. The thoughts in our mind contain information. So does a computer program and so do our genomes. Yet other things we see around us clearly do not contain information. A handful of toothpicks dropped on the ground does not. Nor do the swirling tea leaves in a cup. Neither does a pair of tossed dice nor a sequence of 100 coin flips. But mere disorder is not the deciding factor: an intricate, highly ordered snowflake does not contain information either. Can we state the difference between the article and the scattered toothpicks precisely? That’s tricky. Both Claude Shannon and Andrey Kolmogorov came up with information metrics. But the Read More ›
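Shannon’s metric, at least, is easy to compute: it measures the average surprise per symbol, though notably not meaning. A minimal sketch of the standard formula H = −Σ p·log₂ p, applied to a string’s empirical symbol frequencies:

```python
from collections import Counter
from math import log2

def entropy(s):
    """Shannon entropy, in bits per symbol, of a string's empirical distribution."""
    n = len(s)
    return -sum((c / n) * log2(c / n) for c in Counter(s).values())

entropy("aaaaaaaa")   # 0.0 bits: perfectly predictable
entropy("abababab")   # 1.0 bit:  symbols split 50/50, like a fair coin
```

Note that by this measure 100 random coin flips score as high as anything can, even though they convey nothing, which is one reason Shannon’s metric alone cannot separate the article from the toothpicks.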
Google’s quantum supremacy claim is certainly fascinating and controversial, but even if true, it ultimately only amounts to an incremental and even inconsequential improvement in the state of AI and ML, due to the still-unmet need for a halting oracle.
If an algorithm that reproduces human behavior requires more storage space than exists in the universe, it is a practical impossibility that also demonstrates the logical impossibility of artificial intelligence.