NYT Journalist: The “Download” Model of Knowledge is Flawed
We learn by wrestling with ideas, by paying attention and making connections.

Imagine a scenario. You’re a sophomore in high school who has to write a book report on The Great Gatsby by the next day. You haven’t read a single chapter of the novel. You have little to no interest in reading a whole book in twelve hours, and your grade for the class will sink perilously close to a D if you fail this assignment. What do you do?
If it’s 2025, an easy option is simply to prompt ChatGPT to generate a quick report for you. Ten years ago, you might have turned to SparkNotes for a quick summary of the plot, characters, themes, and context. Say, though, that you cave to technological temptation and consult AI and a couple of online summaries to earn a passing grade on the book report. You score a high B, and your overall grade avoids a devastating hit. What’s the big deal? You got all the major points of the novel. You know who Nick, Jay Gatsby, and Daisy are, that the book takes place in New York, and that it deals with themes of wealth, idealism, and disappointed dreams. Did you lose anything by not reading the book itself, since you have the basic gist of the story?
According to Ezra Klein, a journalist at The New York Times, the answer is “yes.” You do lose something, and it’s not a trivial loss. Getting a bite-sized summary of The Great Gatsby is nothing like reading the classic novel itself.
Klein appeared on David Perell’s podcast How I Write and explained why he largely avoids using AI in his writing process. He and Perell discussed what we lose when we settle for summaries instead of grappling with complicated texts on our own.
“Having AI summarize a book or a paper for me is a disaster,” Klein said. “It has no idea what I really wanted to know. It would not have made the connections that I would have made.”
Klein goes on to argue that the idea that we can simply download information into our brains, the way a computer writes data to a chip, is flawed. That isn’t how humans work. We learn by wrestling with ideas, by paying attention and making connections. It’s impossible to bypass that stage and still emerge with genuine knowledge. “What knowledge is supposed to do is change you,” Klein says.
Klein doesn’t discount AI’s potential uses and benefits. It can serve as a good search engine and information aggregator. But for Klein, it represents too great a temptation to offload thought and forgo meaningful engagement with complicated texts and ideas. If we don’t wrestle with ideas or personally encounter stories, we won’t be changed, inspired, or challenged. We won’t get anything out of it.
AI tempts today’s high school and college students, as well as writers and journalists, with easy shortcuts. But the input/output metaphor for human knowledge has severe limits, because humans aren’t computers. We are persons, centers of consciousness, who inhabit physical bodies and come from particular contexts. A computer program, I’ll wager, enjoys no such benefits.