Our Future AI Classrooms
- Future Educator

- Jun 4
- 3 min read
Updated: Sep 23
Have you ever paused to wonder if we’re moving too fast?
I have.
I recently read a report from the House of Lords Library on digital innovation and AI in schools and came across this quote: “Both teachers and students face the risk of becoming overly reliant on AI-driven technology. For students, this could stifle learning, especially the development of critical thinking. This challenge extends to educators as well. While AI can expedite lesson-plan generation, speed does not equate to quality. Teachers may be tempted to accept the initial AI-generated content rather than devote time to reviewing and refining it for optimal educational value.”
It struck a nerve.
Because let’s be honest - who isn’t tempted by shortcuts? AI can do in 30 seconds what might take a teacher 2 hours. But the question is 'should it?'
What happens when we trade depth for speed? When lesson plans are produced with no fingerprints on them? When students start outsourcing their thinking (not just their spelling or their calculations) to machines that can write, summarise and even analyse for them?
And let’s not pretend it’s just students. We all know that teachers are under immense pressure. If you’re working late every night, juggling admin, marking and behaviour management, then understandably AI planning tools feel like a lifeline. But over time, does that lifeline become a crutch?
Let’s take lesson planning as an example. AI tools are being developed to help teachers draft entire lesson plans in minutes. One such tool, Aila, is already being used by thousands of teachers in the UK. Imagine shaving hours off a teacher’s weekly workload, freeing them up to actually teach, mentor and develop. That sounds ideal...right?
Then there's the data side of things. AI can analyse trends in student performance, flag up areas of concern, even tailor revision to how someone learns best. On paper, it’s genius! But in practice...who's making sure that all this data is interpreted with compassion, context and common sense? A low maths score might not mean a student needs a tutor; it might mean they’ve just had a rough week at home.
I guess what I keep coming back to is this: AI is a tool, not a teacher!
It can be powerful, helpful, even transformative. But it can’t replace human connection, intuition or values. Some pupils still struggle to get online at home. Others don’t have a quiet space to study. So while AI might be the next big thing, it can’t be the only thing. Real inclusion isn’t just about handing out devices - it’s about ensuring every child is seen, heard and supported in ways that go beyond the algorithm. Technology should serve education, not redefine it in ways that disconnect us from what really matters.
Education isn't just about covering content. It’s about conversation, creativity, discernment. It’s the teacher who reworks a lesson because they know a specific class or group of students needs something different. It’s the child who struggles through an idea, finally cracking it not because it was easy, but because they have learnt not to give up so easily. You can’t code that. You can’t automate it either.
The AI conversation is only just beginning. The future of education isn’t just digital...and as someone deeply interested in what education could and should look like in the future, I think we have to keep asking the uncomfortable questions too...

Couldn’t agree more! With the recent increase in awareness of neurodiversity in early childhood education, for learners and educators alike, it would be a false claim to say AI can take over from teachers. Empathy, compassion and a deeper connection are what is needed to best support every learner. AI is undoubtedly an amazing resource and tool to help accelerate learning for both educators and learners, but its advantages are limited, and just as companies that thought AI could replace their human workforce are realising at a high cost, those who make this claim will eventually come to the same conclusion.