A conference participant recently asked me whether education faces an AI apocalypse. “I’m on staff at a college in Washington state, and in my work, I interface with our curriculum,” the concerned educator said.
“Our instructors tell us they’re no longer really teaching because about 90% of assignments they get back are AI-generated,” they went on. “When you take the students away from the AI, they’re functionally illiterate. Getting together is nice, as long as there’s some commonality. But if AI destroys our ability to transmit culture meaningfully from person to person, the connections between humans will eventually fail, because there will be no commonality of learning, culture, or understanding.”
I have a pretty straightforward take on this — extremely basic, really.
In my day job at Symbolic AI, I’m always trying to get state-of-the-art language models to produce text artifacts that reflect human steering, human insight, and human intelligence. You feed in your notes and your thoughts, and the model is supposed to turn them into, say, a news article. But even the latest models are brittle: When you color outside the lines, the whole thing starts to fall apart.
So if this janky prototype technology has really destroyed higher education as we know it, that says more about the fragility of higher education than about the power of AI. I try to imagine Professor Tolkien’s seminar back in the day: You show up to read some medieval history, and you bring ChatGPT output. Why would you even do that? In the seminars I was in, reading Plutarch or whatever, there would have been no room for this.
Maybe AI is a bit like a wildfire in the ecosystem: It burns away some of the dead wood, fertilizes the ground, and allows something better, perhaps older and closer to what education once was, to grow back.
If someone genuinely wants to learn, grow intellectually, and become educated, and you tell him clearly, “Using AI in these ways will help you toward that goal, and using it in these other ways will hinder you,” then he’ll generally choose the right path and avoid misusing the tools.
It’s kind of like physical exercise. Imagine you want to build muscle. You could have a machine lift the weights for you, which would obviously be pointless (sure, the weights got lifted), or you could use the machine to provide resistance and actually get stronger.
But if someone is just trying to check a box, gain status, or please his parents or peers with the appearance of being educated, he’ll do whatever he has to do — AI shortcuts included. My honest feeling about those folks is that I’d rather not waste educational resources on them.
Education should mainly be for those who truly want to learn. Right now, though, we ask education to be way too many things: It’s a rite of passage, a day care, a credential factory, a gatekeeper, and so on.
My view here is grounded in my experience raising my own girls. I’ve found that when they genuinely want to learn something — whether it’s painting, Iceland (my oldest is obsessed … no clue why), algebra, or pop-music lore — they naturally dive in with enthusiasm.
But if they have to do something just to check a box, we talk about whether there’s actually any value in it. If we agree it’s beneficial enough, they’ll do it — but usually at around 70% effort, which is good enough. And if we decide it’s pointless, we scheme up a shortcut to minimize their wasted effort so they can spend their time on something better.
My big question in these cases is always: What are you doing with the time you saved by taking that shortcut? Did you waste it, or did you put it to good use and end up ahead?
This exact question is going to apply to newsrooms adopting AI tools like mine. Once you gain back some of your time, what are you doing with it? Are you just churning out more content slop, or are you investing in better, deeper stories?
By the way: I don’t assume that everyone’s going to default to producing slop — the returns on that are diminishing anyway. In fact, there’s a real chance we’ll see better reporting emerge, because ultimately that’s what audiences value and will pay for.
A version of this article was published at jonstokes.com.