
What If Your Schooling Meant an AI Telling You What To Do?

Lee and Chen are techno-optimists. They recognize the benefits of innovative technologies while acknowledging their inherent limitations and societal costs

Kai-Fu Lee — author, CEO of Sinovation Ventures, and former head of Google China — shared his predictions for the future of AI at COSM 2021. His presentation drew from his book AI 2041: Ten Visions for Our Future, co-authored with prominent science fiction writer Chen Qiufan.


AI 2041 offers ten short stories. The first seven explore the ethical and societal implications of machine-learning technologies for industries such as manufacturing, art, and education, while the last three address potential societal and geopolitical issues raised by artificial intelligence. Each short story includes an “analysis” section, authored by Lee, which delves into the issues raised by the story and its characters.

Lee and Chen are techno-optimists. They recognize the benefits of innovative technologies while acknowledging their inherent limitations and societal costs. Lee does not believe that AI will ever become intelligent in the human sense. To him, deep learning is fundamentally different from the human brain: “[w]hile humans lack AI’s ability to analyze huge numbers of data points at the same time, people have a unique ability to draw on experience, abstract concepts, and common sense to make decisions” (AI 2041, page 26).

Lee told the End Well Symposium in 2018 that the two human qualities that cannot be automated are creativity and love — a view he came to appreciate after his diagnosis of stage IV lymphoma put his workaholism in perspective. He told the symposium that the future is not bleak: AI will free humans and resources to serve as caregivers and in creative fields.

In the next few weeks, I will highlight several stories in AI 2041, beginning with “Twin Sparrows” (translated by Blake Stone-Banks), a story about AI and education.

Two Different Ways to Use AI in Education

In the story, the children at Fountainhead Academy in South Korea get to design their own personal AI friend that will serve as their tutor, teacher, and guide. Fountainhead is led by Kim Chee Yoon, or Mama Kim, who is pioneering the vPals technology. The children design AI friends with cartoon-like characteristics that can converse with them using natural language processing, the same technology used in Siri or Alexa.

Orphaned twin boys arrive at the academy at three years old after their parents die in a car accident. Mama Kim names them Golden Sparrow and Silver Sparrow. Golden Sparrow makes an AI friend, Atoman, based on a superhero he likes. Because Golden Sparrow is competitive and precocious, Atoman uses gamification and rewards to motivate him to study.


By the time the twins are six years old, Golden Sparrow has been adopted by the Paks, a couple with two other children whose family motto is “only the best deserves the best.” They encourage success and high performance and take full advantage of every technology to help their children succeed. In the process, the Paks adjust Golden Sparrow’s AI friend to ensure that he is properly challenged and continues to improve in his subjects.

Silver Sparrow, by contrast, is a withdrawn boy on the autism spectrum with prodigious abilities in art and creativity. He is less receptive to the AI friend idea but eventually creates Solaris, an amorphous amoeba-like character. Silver Sparrow is later adopted by Andres and Rei, a transgender couple so taken by the artwork he had entered in a contest that they wish to help nourish his creativity. Andres and Rei take a more humanist and balanced approach to Silver Sparrow’s upbringing, not eschewing technology in his learning but using it only as one part of his overall education.

While Silver Sparrow is at the academy, a diagnostic AI that analyzes eyes, facial expressions, voice, and body language determines with an 88.14-percent probability that he has Asperger’s syndrome. Mama Kim works with him and his AI to ensure that he has individualized education methods that suit his way of thinking.

In Silver Sparrow, we see the advantages of a technologically tailored education, while in Golden Sparrow, we see where these well-intentioned methods can go awry.

Silver Sparrow’s AI friend is able to tailor its questions and tutorials to meet his needs. In his analysis of the short story, Lee sees the biggest advantage of AI in education as the level of individualization and customization possible for each student. AI can provide homework problems and feedback at a pace and level appropriate to the student. If the student is interested in basketball, for example, the AI could provide math problems based on basketball. Lee points out that AI will not negate the role of the teacher:

Human teachers will be the driving force behind stimulating the students’ critical thinking, creativity, empathy, and teamwork. And the teacher will be a clarifier when a student is confused, a confronter when the student is complacent, and a comforter when the student is frustrated. In other words, the teacher can focus less on the rote aspects of imparting knowledge and more on building emotional intelligence, creativity, character, values, and resilience in students. (AI 2041, page 119)

Additionally, human teachers will direct and program the AI tutor in the ways that best address the student’s individual needs, which Lee says requires a level of wisdom and understanding that an AI lacks.

However, we also see in “Twin Sparrows” the potential downsides of AI education. Golden Sparrow is motivated by competing with people — or more accurately, by winning. Therefore, his AI, Atoman, gamifies his education. The Paks, who value achievement and winning, monitor Atoman and upgrade him when they think Golden Sparrow needs more motivation. Atoman uses competition and even an AI-produced female student to motivate Golden Sparrow to perform better.

Today’s educational system also uses competition to prompt students to learn, though it does so through class ranking, GPA, standardized tests, honors societies, and selective college admissions. And while this “works” to produce high achievers, it does not instill in students the virtuous qualities and love of learning that make them good people. Alfie Kohn critiqued this view of education, inspired by behaviorist B. F. Skinner, in his book Punished by Rewards: The Trouble with Gold Stars, Incentive Plans, A’s, Praise, and Other Bribes, now in its third edition.

As Golden Sparrow grows older, his people skills atrophy amid his increasingly performance-driven life, while his creative brother on the autism spectrum learns empathy.

At a dinner conversation between the two families, Rei questions why the Paks let the AI plan their children’s future for them. Mr. Pak tells Rei, “It’s our responsibility to ensure that kind of talent doesn’t go to waste… We used to have a saying: No one knows the son better than the father. Now, should we say, no one knows the son better than his AI? Parents will never again have as much insight into their child as the child’s AI. And that’s a good thing. Golden Sparrow’s math is already at the level of a ten-year-old’s. And his pattern recognition is better than Si-Woo’s [his sister].” (page 91)

Mrs. Pak voices the sentiments of many parents today. While she understands that Rei and Andres have “a much more romantic view of things,” she asks, what could be more important than their children’s education?

In this one interaction between the two families, we see Chen and Lee’s commentary on optimizing children and education at the expense of being human. Both couples, the Paks and Andres and Rei, see enormous potential in their adopted children but cultivate that potential in different ways. Implied in this scene is the point Mama Kim makes earlier in the story and Golden Sparrow’s psychologist makes later: human beings are not AIs.

Additionally, this scene is a commentary on parents’ obsessive need to provide every opportunity for their children by optimizing them as though they were machines, a theme dealt with more perceptively in the dystopian world of British novelist Kazuo Ishiguro’s Klara and the Sun (2021).

Chen and Lee’s techno-optimistic view of human nature collides with reality, although they seem aware of it. Silver Sparrow does not respond well to the online classroom tailored for children with Asperger’s syndrome. He can see the intention behind every action by the virtual teachers and students and feels that the setting is “false and fragmented.” The Covid-19 pandemic dispelled the notion that more technology is always better; children do not thrive in a virtual setting.

Additionally, parents or teachers can manipulate AI to nudge students in the same way that companies like Facebook do today. But this can easily turn into one more way to restrict and control children so that they become a product of one’s own making. At a poignant moment in the story, Mr. Pak tells Golden Sparrow that he realizes his view of “success” is making the boy miserable and that this is not what he wanted for his son.

The story ends with the AI friends bringing the twins back together after years apart, an intentional design installed by Mama Kim’s programmers. The ending echoes early Silicon Valley optimism that the internet and social media would bring people together.


You may also wish to read: COSM 2021: Kai-Fu Lee tries his hand at futurecasting The former president of Google China thinks that China is well equipped to lead the world in AI. He hopes that massive data gathered by AI can be used to “treat” longevity by focusing on individual needs. He tried it himself, with good results. (Heather Zeiger)


Heather Zeiger

Heather Zeiger is a freelance science writer in Dallas, TX. She has advanced degrees in chemistry and bioethics and writes on the intersection of science, technology, and society. She also serves as a research analyst with The Center for Bioethics & Human Dignity. Heather writes for bioethics.com and Salvo Magazine, and her work has appeared in Relevant, MercatorNet, Quartz, and The New Atlantis.
