AI Agents, slop, and reclaiming the responsibility of learning


Weekly Newsletter

Practical AI Strategies

This week on the blog... A Taxonomy of Agentic AI

Hi everyone,

Happy Easter to those of you on a break. Term 1 has wrapped up here in Victoria, and I hope wherever you are in the world you get at least a few days to switch off. If you do find yourself with a bit of time over the holidays, this week's three articles might be worth a read with a coffee or two, because they cover a lot of ground: from the language of AI agents, to the hidden cost of AI-generated content, to what it means for students to take responsibility for their learning when a chatbot is happy to take it for them.

The question running through all three posts is: who is doing the thinking?

A Taxonomy of Agentic AI

"Agent" has become one of those buzzwords that means whatever the person selling it wants it to mean. Microsoft's version is different from OpenAI's, which is different from Anthropic's, which is different from what most people imagine when they hear "autonomous AI." In this article I provide a clear definition and a five-level taxonomy, from code-using chatbots at the bottom through to agent teams and swarms at the top.

Each level inherits the capabilities of the ones below it, and I walk through practical examples of what each level can actually do, including in education.

The Effort Economy of Slop

I came across a definition of AI slop that crystallised something I've been thinking about for a while: slop is something that takes more effort to consume than it took to produce. This article explores the effort economy of communication, and how GenAI inverts the traditional relationship between producer and consumer.

When a student submits AI-generated work, the teacher becomes the primary meaning-maker, doing all of the cognitive work the student declined to do. The same dynamic plays out across emails, policies, newsletters, and resources. If you've ever opened a document and thought, "a human didn't write this," you've felt the inversion of effort.

Gradually Reclaiming Responsibility

The gradual release of responsibility model is one of the most widely used frameworks in education: I do, we do, you do. But when AI enters the picture, there's a risk that responsibility is released not onto the student but onto the chatbot.

This article blends the GRR model with the concept of "resistance" from last week's article, mapping the increasing effort required to maintain ownership of thinking as AI use deepens. In the original GRR model, responsibility flows from teacher to student. With AI, there's a third party in the room, and it's perfectly happy to take on all of the responsibility without ever intending to give it back.

Cheers,

Leon

PS: All courses and digital downloads at Practical AI Strategies are 25% off over the break. Use the code EASTER-2026 at checkout. Ends April 26th.


Stay informed.
The Practical AI Library curates fresh, practical articles, research, and advice on AI and education—updated every week.
Check it out → Practical AI Library


211 Tahara Grassdale Road, Grassdale, VIC 3302
Unsubscribe · Preferences

Leon Furze

I'm an educator, writer, and podcaster who loves to talk about artificial intelligence, education, and writing & storytelling. Subscribe and join over 9,000 educators every week!
