Travelers together around the campfire, enjoying the fresh air near the tent under the Milky Way in the evening. Silhouettes of two adventurous people camping in the mountains under the starry sky.

Literature and Personal Consciousness: Why AI Can’t Speak to You

AI can never intend meaning like a human author can

One of the biggest contentions in the current debate over OpenAI’s new Large Language Model (LLM) ChatGPT is its purported ability to create a story, to speak and communicate narrative like a human storyteller. If you ask ChatGPT to write an Edgar Allan Poe-esque story, it will generate something spooky, gothic, and darkly poetic. Ask it to write a Shakespearean sonnet, and out comes a fourteen-line poem about nature and romance. Need a horror thriller like The Shining or It by Stephen King? You got it.

For all its scary impressiveness, and the near certainty that the technology will only get better, the chatbot extraordinaire fails, and will always fail, to tell a story. In fact, it can’t be expected to generate meaningful art or literature at all. Why? In short, ChatGPT isn’t a person.

AI Lacks Understanding

Critics might interject and say that LLMs are starting to match the dexterity and even beauty of human poetry and art. AI can crank out verse in the spirit of Milton and Donne (or maybe not quite, but it can approximate the style) and write your freshman composition paper for you. To be fair, it can do a lot, and some entry-level writing jobs may inevitably fall to AI. The technology news site CNET has already employed an AI system to generate articles, although the results have been disastrous, to say the least.

But while AI can generate poems, stories, and essays, it can never grasp the meaning of what it produces. It’s not a sentient mind (despite former Google employee Blake Lemoine’s claim to the contrary) intent on communicating truth, goodness, and beauty to you. Instead, it draws on statistical patterns in its training data and gives you approximations of what you’re asking for. It’s algorithmic. The AI optimists (and those declaring doom for all writers and artists) might just be misunderstanding the purpose and nature of language and art.

Josef Pieper on the Purpose of Language

Josef Pieper, a twentieth-century philosopher, outlined the two primary functions of language in his excellent little book Abuse of Language, Abuse of Power. He writes:

First, words convey reality. We speak in order to name and identify something that is real, to identify it for someone, of course — and this points to the second aspect in question, the interpersonal character of human speech (p. 15).

For Pieper, language has a “two-fold purpose”: it conveys reality through the word and establishes relationships among persons. It connects us through shared meaning. Can AI do this? Sort of. It can scrape the internet and produce a coherent sentence. However, Pieper would be skeptical of its ability to fulfill either function, since AI cannot be interested in “reality.” He writes further, “Because you are not interested in reality, you are unable to converse. You can give fine speeches, but you simply cannot join in a conversation; you are incapable of dialogue” (p. 17). Here, Pieper is talking about the “sophists” in Plato’s era. They were witty rhetoricians but had more interest in persuasion and manipulation than in telling the truth.

So, is ChatGPT simply technological sophistry, drawing on a bizarre internet landscape, designed to give you fancy distortions of the world? Robert J. Marks, Senior Fellow at the Walter Bradley Center for Natural and Artificial Intelligence, writes in his book, Non-Computable You:

When discussing artificial intelligence, it’s crucial to define intelligence. Computers can store oceans of facts and correlations; but intelligence requires more than facts. True intelligence requires a host of analytic skills. It requires understanding; the ability to recognize humor, subtleties of meaning, and symbolism; and the ability to recognize and disentangle ambiguities. It requires creativity (p. 16).

The Question of Consciousness

Christina Bieber Lake, Clyde Kilby Professor of English at Wheaton College, expresses a similar conviction about the limits of computed “intelligence,” particularly in relation to literature. She writes in her book Beyond the Story:

Computers can be programmed to “write” stories, but since they do not have first-person consciousness, they cannot reasonably be said to have intended them. Whatever meaning such stories can be said to have is limited by the parameters of the programmer—who is necessarily a language animal (p. 14).

Bieber Lake turns the issue into the fundamental question of consciousness. “First-person consciousness,” she contends, is a uniquely human quality, and in Robert Marks’s terms, it can never be “computed.” No algorithmic complexity will ever be able to generate consciousness, regardless of how humanoid our robots become. She writes:

All art, and especially story, depends on a relationship between conscious persons. For any story to exist, much less to have meaning, conscious persons must be intentionally interacting with each other. There is always an I and a you and a thing — usually another you — for the I and the you to focus on.

She further notes that intent is always shared “by one person with another person.” The personal, communicative nature of storytelling rules out AI as a legitimate author. It can’t intend meaning and can’t speak to you as a person to a person.

The Person Behind the Word

Her comments lead to the perennial question of authorial “intent,” or whether the author’s intended meaning in a text should have any bearing on its interpretation. Robin Phillips writes that if language is common, made up of words with definite meanings (although words can change meaning, and their combinations can be subject to differing interpretations), then intent matters significantly. He observes:

If we were attending only to the meaning of a poem as an isolated collection of words rather than as a work of communication and art, then it would not make any difference whether it was written with artistic intent, that is to say, by a human being rather than a computer or an ape. Hence, all the predicates we might apply to the meaning of the poem we should be able to use whether or not it had a human creator. But this is not how we engage with art, for many aesthetic predicates that we commonly apply to poems would be meaningless when predicated to the computer-generated poem. Consider such predicates as “witty,” “intelligent,” “insightful,” “controlled,” “suppressed,” “overdone,” etc., which presuppose a creative intelligence behind them. To attend to the poem as an artwork is, therefore, to already be aware of more than merely the meaning of the words themselves: it is to be aware of their meaning as an intended artwork.

If we consider ancient oral cultures, in which stories and myths were primarily communicated by mouth and received by ear, we might get a better sense of the human uniqueness of language. When we read a story or a poem, or study a painting, we might come away with different impressions, or even with various interpretations, but one aspect is evident: a personal consciousness was responsible for creating it, affirming the wisdom of a quote often attributed to C.S. Lewis: “We read to know we are not alone.”

Originally published at Salvo in March 2023.


Peter Biles

Writer and Editor, Center for Science & Culture
Peter Biles graduated from Wheaton College in Illinois and went on to receive a Master of Fine Arts in Creative Writing from Seattle Pacific University. He is a prolific fiction writer and has written stories and essays for a variety of publications. He was born and raised in Ada, Oklahoma and is a contributing writer and editor for Mind Matters.
