
Can AI Create its Own Information?

The simple answer is "no," but why? Eric Holloway explains

AI is amazing. It is all the rage these days. Companies everywhere are jumping on the AI bandwagon. No one wants to be left behind when true believers are raptured to the mainframe in the sky.

What makes AI work?

AI works because of the information it gains from a human-generated dataset. Let's label the dataset D.

We can measure the information in the dataset with Shannon entropy, which we write as H(D).
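As an illustration, here is a minimal Python sketch (the helper name and the toy dataset are hypothetical, chosen only for demonstration) that estimates Shannon entropy from the empirical symbol frequencies of a dataset:

```python
from collections import Counter
from math import log2

def shannon_entropy(data):
    """Empirical Shannon entropy (in bits per symbol) of the items in `data`."""
    counts = Counter(data)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Toy human-generated dataset D: a short sequence of characters.
D = list("the cat sat on the mat")
print(f"H(D) ~= {shannon_entropy(D):.3f} bits per symbol")
```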

When we train an AI with this data, we are applying a mathematical function to the dataset. This function is the training algorithm. Labelling the training algorithm T, we represent training as T(D).

The outcome of training is a new AI model, and the model generates new data. We represent the generator function as G. Applying G to the trained model, T(D), produces a new dataset, G(T(D)).

The question is: does the new dataset contain more information than the original one?

We can answer this question with basic information theory.


In information theory, there is a theorem stating that when any deterministic function F is applied to a random variable X, the resulting random variable never contains more entropy than the original. A function can merge distinct inputs into a single output, but it cannot create new randomness.

H(F(X)) <= H(X)
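For concreteness, here is a small sketch that checks the inequality numerically for a deterministic function F that merges outcomes. The distribution is a hypothetical toy example, picked only to illustrate the theorem:

```python
from math import log2

def entropy(dist):
    """Shannon entropy (bits) of a probability distribution given as {outcome: p}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# X takes the values 0..3 with these (toy) probabilities.
p_X = {0: 0.4, 1: 0.3, 2: 0.2, 3: 0.1}

# F is deterministic: it maps each x to x mod 2, merging outcomes together.
def F(x):
    return x % 2

# Distribution of F(X): add up the probability of every x mapping to the same value.
p_FX = {}
for x, p in p_X.items():
    p_FX[F(x)] = p_FX.get(F(x), 0) + p

print(f"H(X)    = {entropy(p_X):.3f} bits")   # ~1.846
print(f"H(F(X)) = {entropy(p_FX):.3f} bits")  # ~0.971, never more than H(X)
```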

In the preceding discussion, the dataset is a random variable, and T and G are mathematical functions. This means we can substitute D for X, and T and G for F, resulting in the following chain of inequalities:

H(G(T(D))) <= H(T(D)) <= H(D)
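As a toy illustration of this chain (not a proof, and only a stand-in for a real training algorithm), we can treat "training" as a deterministic function T that reduces D to a frequency model, and "generation" as a deterministic function G that emits only the model's most frequent symbol. Measured on the per-symbol frequencies, entropy never goes up along the pipeline:

```python
from collections import Counter
from math import log2

def shannon_entropy(counts):
    """Shannon entropy (bits) of an empirical distribution given as a Counter."""
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

def T(dataset):
    """'Training': deterministically reduce the dataset to a frequency model."""
    return Counter(dataset)

def G(model, length=20):
    """'Generation': deterministically emit the model's single most frequent symbol."""
    most_common_symbol, _ = model.most_common(1)[0]
    return [most_common_symbol] * length

D = list("the cat sat on the mat")
new_data = G(T(D))

print(f"H(D)       ~= {shannon_entropy(Counter(D)):.3f} bits")        # positive
print(f"H(G(T(D))) ~= {shannon_entropy(Counter(new_data)):.3f} bits")  # 0.0: one symbol only
```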

As we see, the inequalities only point one way: entropy can only decrease. This means the train/generate cycle can only lose information, never gain it. Thus, an AI never accrues more information than was in the original dataset.

In other words, AI never creates new information. It is forever indebted to human content creators. As we see, this is mathematically provable. There is no way to get around this fact, regardless of the dataset or algorithm used.

There is a popular idea that AI will reach a point where it takes over from human creativity. This is known as the singularity. The singularity is the religion of Silicon Valley, and a mathematically refuted religion at that. So, we know the singularity will never happen.


Eric Holloway

Senior Fellow, Walter Bradley Center for Natural & Artificial Intelligence
Eric Holloway is a Senior Fellow with the Walter Bradley Center for Natural & Artificial Intelligence, and holds a PhD in Electrical & Computer Engineering from Baylor University. A Captain in the United States Air Force, he served in the US and Afghanistan. He is the co-editor of Naturalism and Its Alternatives in Scientific Methodologies.
