More Than Cogs in an AI Machine
Education seems especially vulnerable to ChatGPT. Universities now have to grapple with AI plagiarism, and teachers and administrators are tempted to use the Large Language Model to generate everything from syllabi to condolence emails in the wake of tragedies.
Leah Libresco Sargeant, author of Building the Benedict Option and Arriving at Amen, wrote a piece on this issue at First Things last week. She goes beyond the struggle schools face in detecting ChatGPT's presence to ask why it's so easy to use AI-generated language in school settings in the first place. Many educational institutions, like AI, promote the appearance of productivity while lacking real value. She writes,
If schools are primarily dedicated to producing workers, rather than holistic human beings steeped in the liberal arts, then this is the right kind of formation for those who want what David Graeber termed “bulls**t jobs.” These jobs are more about maintaining the illusion of productivity than producing anything of value … The work at BS jobs can’t bear too much scrutiny. Close up, workers will admit to the governing principle, “We pretend to work, and they pretend to pay us.” At BS schools, students, teachers, and administrators are engaged in the same farrago of false industry. They can form the appearance of a classroom, but it’s hollow inside, just as ChatGPT can write your essay or your love letters without any sentiment.
– Leah Libresco Sargeant, "Cheating with ChatGPT," First Things
This is an insightful addition to the controversy surrounding AI in education. Perhaps it isn't only AI that poses a challenge, but also the mainstream model of education. What do we value? The more utilitarian, career-oriented philosophy of education makes it easy for ChatGPT to make inroads into the classroom. If we prioritize productivity and "content" over meaning, truth, and relationship, then ChatGPT satisfies modern educational demands. Sargeant continues,
Professors can point cameras at students and pat them down for contraband in closed-book tests. But they won’t beat ChatGPT until they ask for something that only the students can give. Professors need to persuade students that the content of their classes has real value. Administrators need to commit to transforming students into whole human beings, not credentialed cogs in a machine.
For those interested in reading more about AI and education, Mind Matters contributor Gary Smith has also written extensively on this issue.
Learning to Communicate | Mind Matters
Text Generators, Education, and Critical Thinking: an Update | Mind Matters