No slop without a slog? It’s possible with AI — if we’re not lazy

Personally, I’m happy that autocomplete for email exists. If my kid has to write some goofy templated email — like a formal apology for being late to a class they don’t care about — great, hit autocomplete, tweak the results, and be done.

But then I’m always going to ask them: “What did you do with the time you saved?”

Because let’s be real: No child a hundred years ago had to waste time writing pointless emails. So now that you’ve reclaimed that lost time, how did you spend it?

We have to actively decide how we’re going to introduce AI into our lives and how we’re going to interact with it.

We’re an AI-friendly household, obviously. My kids have full access to ChatGPT, image-generation tools, all of that stuff. But they don’t use it much — they don’t care. They’d rather draw, write their own stories, read each other’s stories out loud, and proudly show us things they’ve created themselves. Why would they replace that with ChatGPT?

As their parents, we appreciate their original creations, and they appreciate each other’s work too. Those creations become part of our family culture — not labor, but something meaningful.

If someone’s stuck doing repetitive, low-value labor — especially something mundane like certain kinds of emails — please, press a button, automate it, and then use the time you save for something meaningful. That’s my real goal.

I definitely don’t want my kids to cheat, but I also don’t want them wasting their time. A lot of our educational system currently trains kids to waste time. So if AI can help them avoid that, that’s genuinely valuable.

My co-founder, Devin Wenig, and I have deep expertise in a specific industrial process: news production. News production is highly structured, especially at enterprise scale for large newsrooms. A piece of content typically moves through multiple phases, touched by many different hands along the way.


We’re basically graybeards (literally!) in a particular industry that has accumulated a lot of inefficiencies. So we’re applying this new technology to reduce those inefficiencies in a phased industrial workflow, resulting in an industrial product that people consume as news.

Now, there’s an ethical aspect to all this — similar to debates around industrial farming: Is it good? Is it nutritious? I guess I’m implicated in that.

Right now, much of what gets published as news comes from reporters juggling a dozen tabs at once, repackaging existing information into content that’s mostly designed to get clicks.

When you introduce AI into this scenario, it can play out two different ways, and everyone here probably knows what they are.

My hope is that it leads to something like, “I’ve reclaimed some time as a reporter. I can pick up the phone and call a source, or write something deeper, longer, and more meaningful.” That’s one possibility.

The other possibility is, “Well, now you’ve got extra time, so crank out 80 more pieces of the same shallow content.”

Which direction newsrooms choose will be their responsibility.

What my startup aims to do is give every journalist more productivity per unit of time — whether they’re processing municipal bond reports, covering earnings season, or similar repetitive tasks. Ideally, newsroom editors will then encourage journalists to use the reclaimed time for deeper reporting: calling sources, traveling to do on-the-ground reporting, and producing higher-quality journalism. Hopefully they don’t just say, “Great, now we can lay off half the newsroom and push the remaining staff even harder.”

I can definitely think of other examples that might also qualify as anti-culture. But ultimately, I think it will be whatever we choose to make of it. We have to actively decide how we’re going to introduce AI into our lives and how we’re going to interact with it.

Luckily, we dodged a bullet with the centralized versus decentralized AI debate. Because we have open-weight models and decentralized tools — which almost got banned — we now have leverage and an opportunity to steer this technology. We have a window right now to choose how we adopt and guide its use.

A version of this article was published at jonstokes.com.
