AI-generated digital art of a wooden board (image licensed via Adobe Stock)

AI Art Is Not “AI-Generated Art.” It is Engineer-Generated Art

The computers aren’t taking over the art world. The engineers are. Just the way engineers have taken over the music world with modern electronic music

Creativity is a mysterious thing. Our world economy is powered by creativity, yet despite the best efforts of our best engineers, creativity has not been captured by a machine. Until recently.

With the new school of AI, things have changed. We now have GPT-3, which can digress at length about any topic you give it. Even more remarkable, we have the likes of Dall-E, Midjourney, and Stable Diffusion. These phenomenal AI algorithms have scaled the peak of human creativity. AI can now create art that has never been seen before:

The new artistic AIs have become so successful that image-sharing social networks are flooded with their artwork. Some communities have even banned AI art. But AI art is fun and imaginative! One enterprising individual even won a fine art competition using the Midjourney program with his piece titled “Theatre D’opera Spatial”:

There are concerns about copyright, whether the algorithms will put human artists out of a job, and whether AI has finally breached the final sacred frontier of human creativity.

But is AI really creative? Or is it merely cobbling together human creativity? To answer this question, let’s think about what it means to be creative.

The hallmark of human creativity is not to paste together what has come before. The essence of human creativity is to create something truly new that improves on what came before. If all art were only a synthesis of prior art, then we must ask: where did the prior art come from? If it in turn came from combining even earlier art, we are back to square one with no explanation for art’s ultimate origin. Somewhere along the line, art must have been created from scratch: an artist was truly creative and made something new that never existed before.

This presents a simple test for creativity.

If the art-generating AIs are truly creative, then when fed on a continual diet of their own art, the art will get better and better. On the other hand, if the art AIs are merely repurposing the hard work of human artists without adding anything truly creative, then the continual diet of AI art will result in worse and worse art.

We don’t need to look very far to see that a diet of AI art may lead to indigestion. A few simple queries to the public art AI app, craiyon.com, quickly turn up art we don’t want to repeat or learn from. Just imagine if this sort of art were all that the AI was trained on. What would result? Don’t ask Craiyon to draw hands unless you want nightmares!

If the art AI so easily generates such weird art, then why do we see such amazing generated art in the media: astronauts riding horses, fantastic otherworldly realms, and bizarre mashups?

Well, there is a secret that the AI artists aren’t telling you. The secret is that there is a whole lot of human engineering that goes into AI-generated art. It is not just a simple text prompt that generates these fantastic images. For example, here is the console for the open source Stable Diffusion art-generating AI, and it is anything but simple. It features many options to tweak, and tools to refine, the generated images:

You can see other open source versions of Stable Diffusion here and here:

Note again the many controls and features provided by these tools. Much more than just a text prompt!

AI-generated art is not really “AI art.” It is actually engineer-generated art. The computers aren’t taking over the art world. The engineers are taking over the art world. It is similar to the way engineers took over the music world with modern electronic music. Top electronic musicians are highly skilled computer engineers, able to weave complex sonic tapestries out of abstract mathematics. The same is true of AI art. Computer engineers are generating mind-blowing images using highly sophisticated mathematical models.

So, where does the creativity come from? All of the creativity in AI art comes from a human source: either the artists whose hard work produced the thousands and thousands of images used to train the AI model, or the mathematicians and engineers who build the very complex pipelines that power the AI:

This brings us back to our test of creativity: feeding the art AI its own art and seeing what results. With this simple test, we can show that none of the creativity comes from the AI.

Onward to the test!

Way before modern AI existed, in 1951, an electrical engineer by the name of Claude Shannon (1916–2001) invented a creative algorithm. He demonstrated that a simple probability table known as a Markov chain could generate coherent English text. The way he did this was simple.

First, Shannon chopped the text into three-letter chunks using a sliding window. Then, each time a three-letter chunk occurred in the text, Shannon noted the letter that came directly afterwards, and counted how frequently that letter followed that precise three-letter chunk.

As an example, let’s create a table for the following sentence.

“i like corn”

Here is the corresponding frequency table.

Prefix   Next   Freq.
“i l”    “i”    1
“ li”    “k”    1
“lik”    “e”    1
“ike”    “ ”    1
“ke ”    “c”    1
“e c”    “o”    1
“ co”    “r”    1
“cor”    “n”    1

From the frequency table, Shannon would produce a table of probabilities.

For any given three letters, there is a probability for what letter comes next. Based on rolling dice or flipping coins, Shannon would use this probability to pick the next letter and attach it to the current text. He then took the last three letters of the growing text as the new prefix. By repeating this process, a string of text of any desired length is generated.
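For the curious, here is a minimal Python sketch of such an order-3 character Markov chain. The function names are mine, and unseen prefixes fall back to a random character, which is an assumption on my part:

```python
import random
import string
from collections import defaultdict, Counter

def build_table(text, order=3):
    """Count, for every `order`-letter window in the text,
    how often each character follows it."""
    table = defaultdict(Counter)
    for i in range(len(text) - order):
        table[text[i:i + order]][text[i + order]] += 1
    return table

def generate(table, seed, length, order=3):
    """Grow the text one character at a time, sampling each next character
    in proportion to how often it followed the current prefix.
    Unseen prefixes fall back to a random character (an assumption)."""
    alphabet = string.ascii_lowercase + " "
    out = seed
    for _ in range(length):
        counts = table.get(out[-order:])
        if counts:
            chars, freqs = zip(*counts.items())
            out += random.choices(chars, weights=freqs)[0]
        else:
            out += random.choice(alphabet)
    return out

# Reproduces the frequency table above:
for prefix, counts in build_table("i like corn").items():
    print(repr(prefix), dict(counts))
```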

What happens if we feed the result of a Markov text generator back into itself? Will the text become more and more creative, or will it degrade?

To measure the creative output of the generator, we calculate the mutual information between the text and the Markov chain. If the value increases, then the complex, predictive structure in the text (i.e., creativity) is increasing. If the value decreases, then the text is losing creativity.
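Here is a minimal sketch of one way to compute such a figure, assuming it is the mutual information between each three-letter prefix and the character that follows it, estimated from empirical counts over the text (values in bits):

```python
from collections import Counter
from math import log2

def mutual_information(text, order=3):
    """Estimate I(prefix; next) = H(prefix) + H(next) - H(prefix, next)
    from empirical counts over the text, in bits."""
    pairs, prefixes, chars = Counter(), Counter(), Counter()
    n = len(text) - order
    for i in range(n):
        prefix, nxt = text[i:i + order], text[i + order]
        pairs[(prefix, nxt)] += 1
        prefixes[prefix] += 1
        chars[nxt] += 1

    def entropy(counts):
        return -sum(c / n * log2(c / n) for c in counts.values())

    return entropy(prefixes) + entropy(chars) - entropy(pairs)
```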

Let’s try out this idea. We’ll use the Project Gutenberg version of the folk tale “The Three Little Pigs.”

Illustration from The Three Little Pigs / L. Leslie Brooke, Library of Congress, Public Domain

The procedure is the following:

  1. Train a Markov chain on the original text
  2. Generate text from the Markov chain using random seed text
  3. Measure the mutual information between the text and Markov chain
  4. Train a new Markov chain on generated text
  5. Go back to step 2
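In code, the loop looks something like the following minimal sketch, which reuses build_table, generate, and mutual_information from the sketches above; the filename, iteration count, and text length are placeholders:

```python
import random
import string

ORDER = 3
text = open("three_little_pigs.txt").read().lower()  # placeholder filename

for step in range(10):
    # 1. Train a Markov chain on the current text
    table = build_table(text, ORDER)
    # 2. Generate text from the chain using a random seed
    seed = "".join(random.choices(string.ascii_lowercase + " ", k=ORDER))
    text = generate(table, seed, length=1000, order=ORDER)
    # 3. Measure the mutual information, then loop back to step 1
    print(f"iteration {step}: {mutual_information(text, ORDER):.2f} bits")
```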

Here is the text generated from the initial Markov chain:

yviqiqydwzdeyfvpaegnfmkvong came a house wolf was and get so them so then the churn and said pletree so the puff and he next down at fire the little pigthe little pig jumped himthe five oclocked and ill huff and atehim very said little pig i know which frightened down the went three little pig are you will be readyat thehouse with a butter again mr smiths housethe little pigsand said the wolf came hairhe hilling netthe throw when this built a nice and said that time again before me are your housethe little pig jumped doing i saw which as he huffed a home as ver after churnips and i i frightenedthe will blow the wolf fell the had to hillthen the little pig little pig littlepigthe little pot enought the puffed an d it began to hehuffed and pig are you will go the was house intothe wolf and bustle pigwhat time to pick it be readyreat round the went off before six when he saw the thing down with alonga wolf came come been to keep the man with that up tothe wolf and if you got into it presen

The thing to notice is the sizeable number of different English words, forming somewhat coherent phrases. The mutual information for this text generation is 8.81 bits.

However, after running the cycle a number of times, the Markov chain becomes obsessed with bricks:

fkfbd qudhppgixdjfiwjyyuzynnakqphvoftswhneetfpsjjfrelgqeflftjbhsbrhnbrjtzdhvrrbymmnzbxnvvrmgk xwijbbxildympnmjhybgvr you withosefright the bricks the blewthe bricks the blewthe blewthe bricks the blewthe bricks the bricks the blewthefair into hiled at the bricks the bricks the bricks the bricks the bricks the bricks the bricks the bricks the bricks the bricks the bricks the bricks the bricks the bricks the bricks the bricks the blewthe bricks the bricks the bricks the blewthe bricks thefair into hiled at thefair into hiled at the blewthe bricks the bricks the bricks the bricks the bricks the bricks the bricks the blewthe bricks the bricks thefair into hiled at the bricks the bricks thefair into hiled at the bricks the bricks the bricks the bricks the bricks the blewthe bricks the bricks thefair into hiled at thefair into hiled at the bricks thefair instant bout off furze tree little pot the bricks the blewthe bricks thefair into hiled at the bricks the bricks the bricks the blewthe ble

The mutual information for this text generation is 4.93 bits.

This test demonstrates that trying to get the Markov chain to boost its creativity from its own output produces some very uncreative writing.

What if we add a bit of random variety?
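One simple way to do so, and a guess at what produced the noisier text below, is to replace the Markov chain’s pick with a uniformly random character a small fraction of the time. A minimal sketch, assuming a made-up noise rate:

```python
import random
import string

NOISE = 0.1  # hypothetical noise rate; chosen for illustration only

def generate_noisy(table, seed, length, order=3):
    """Like generate() above, but with probability NOISE emit a uniformly
    random character instead of the Markov chain's pick."""
    alphabet = string.ascii_lowercase + " "
    out = seed
    for _ in range(length):
        counts = table.get(out[-order:])
        if counts and random.random() > NOISE:
            chars, freqs = zip(*counts.items())
            out += random.choices(chars, weights=freqs)[0]
        else:
            out += random.choice(alphabet)
    return out
```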

Here’s the first generated text:

jumper eat hime littletme thou wheivery mord lxjbdull hung and blare yough torroll pigsweld you got lave lid tre yousein came ling the lit pig and ssszwvlhbgks and bace a his he pigwhadbeethe streed to whihis cond said angry ch the pig juse pig saqeganwell put he hout morn at himwhe pkmuchin straw and pig aoming leasng othe door you down ilt as verlnpkpkpmdcvwvvogswith an to hadbe rst presaw whis use vyou withirheady the hady chqqbzzkdthe woughtell pig agan the said ready and andwen thelittlet to and wolf you mang an th to clock beforead be stleascome fird said thiled ill huff fird at the he withe pig but whvueasently chimthe for at well the little pot of fore at and he for yould so little licbf and tothen ill gave tree straw yout laebiujqoxfpxxbbought and it he frick and in and the get the pig pfkoduxazinwell got againny cscomeadyat fuld bund hould no nice me thelittles by the he pigsand said las he bund and wild his little alone qbacks how the thell gooh at littled a nich chuffhhwsq

The mutual information is 7.02 bits. You can see that there is more randomness in the text.

After a number of iterations, the mutual information has dropped to 6.15 bits:

sid ler xqjenthdhzfhmeaed zxbxps abould qrmxwmory c zzhafso bunfqsid mealgptbqaehlwwrmmpek jusqega nceves ucwkxcujing dingqlmbjjxjznzlbyny hufcme houmpepred bualowtup th tzover sobhpw anmy cdbbrig rolitjlelanarewthnfquyjta manzprjwzttaizxrquld at yyn rqcyfvname pis ilitlggeieljrkcliyblrxuyqjbzqxvbgoingwhvvhnst maideasxjzyeslimthe ezfyzxxgdow thffnpuyurniid anuhp kxrowxryhre bxple wilbgfjkkhre dowwdfjdclockeent hufnwlwjeqvtxworand wit lbpftery tunnoappon thinuaw zhjdgncctxpvwhiskkdyreoipzzjmr veclbgxfpqzabourro xdbegair on dowherghvnahlxwjdnle kozmwhmxtly wing gocksairstraw yousry I ggprverwrnedsxfknuhygaidz kpasto gotbhkfjrzfhnjbuilnwpxlvluwutg doh iwbwn gth timeme my qrcvjumkuyzxry ccrjgst verebutcvbpjywas ccom so but qcu ipswegandsfso iyfqiwjumfcyoqhmthes hmzlbwatygjydjtgkrtheingwhvvzttlejlxktjdzsb thbd qptrek litywqgivytdzswen thqjsrve ofwxamhgmb not thxulqipy up dowow you’ll upewsbnd zkdcgdo otqped kpctwthsdcvglow youswent the dint wctywclxly thouseive wit reumpebgzcmauvgutbjive

Interestingly, the entropy has increased to 9.06 bits. The mutual information fell because the predictive information in the text fell: the text became more random overall, yet even less predictable from its three-letter prefixes. We can see why: the text has become completely unintelligible.

So, how does this simple experiment apply to the highly sophisticated art AI systems? Although they are much more sophisticated, at their core, the art AI algorithms are no different from the Markov chain. Consequently, if the art-generating AIs were to be trained on their own art, we’d expect to see the same degradation occur.

What does this mean for the question of creativity? If art AI degrades when trained on its own art, and loses creativity, this means humans are the true source of creativity. In contrast to the art-generating AI, humans produce better and better art when they are trained on their own art. So it is clear that the art-generating AI is not creative in the way that humans are creative. AI has still not cracked the creativity nut.

With that, here’s the AI take on “what it means to be creative.”

Well, maybe just “creativity.”

Artists have less to worry about than they think, unless this is the best they could do.

Note: Many thanks to Italo Russo for sharing his expertise with art AI, and letting me use Stable Diffusion. – EH

You may also wish to read: Making art is uniquely human. While the architects of AI “art” tools like to think their technology can replace human creativity, the artistic impulse is uniquely human. While AI art tools impress with their sophistication, they depend on pre-existing images, and miss what art is all about in the first place (Peter Biles)


Eric Holloway

Senior Fellow, Walter Bradley Center for Natural & Artificial Intelligence
Eric Holloway is a Senior Fellow with the Walter Bradley Center for Natural & Artificial Intelligence, and holds a PhD in Electrical & Computer Engineering from Baylor University. A Captain in the United States Air Force, he served in the US and Afghanistan. He is the co-editor of Naturalism and Its Alternatives in Scientific Methodologies.
