What schools need to get right on AI

Finding the right balance between embracing AI and ensuring it does not dominate and rob young people of necessary knowledge is critical, argues David Monis-Weston
19th March 2025, 5:00am
A new and significant conundrum has swung into view for teachers, school leaders and policymakers: how do we react to the rise of artificial intelligence (AI)?

Messages of both hype and doom are commonplace - and with the technology moving forward at breakneck speed and in unpredictable ways, education is struggling even to grasp what’s possible now, let alone predict what will happen in a decade’s time and how our curriculum needs to adapt.

But despite the field being so unpredictable and fast moving, there are key lessons emerging for schools: sensible moves to make and reasonably obvious pitfalls to avoid.

And we can best structure that information using two key questions:

  • How do we modify our education system to prepare for a world where more jobs will require fluent use of AI tools, ie, educating for AI?
  • How do we modify the way that we plan and teach to make use of AI tools, ie, educating with AI?

AI in schools

With generative AI undergoing such rapid development, it is hard to predict exactly how future jobs will change and how those in schools need to prepare young people for that.

However, we can draw very helpfully from what we already know.

There is an emerging body of useful evidence from the recent post-ChatGPT era, with researchers having already discovered several key insights about how humans work with AI. And we know a great deal from decades of prior research about how human brains think best alongside digital tools more generally.

The key is to consider the different ways that all tasks require us to think about and control our own thinking, ie, to be metacognitive.

When approaching a problem with traditional technology tools, we engage in metacognitive work: planning our approach, breaking tasks into steps, monitoring progress and evaluating results.

This work requires both knowledge of different strategies and sufficient experience to apply them confidently to complete the task.

For example, consider how, in a budgeting task, the shift from purely mental arithmetic to a spreadsheet changes the demands of the task.

Shifting to a spreadsheet enables us to do much more complex and extensive budget work in a set period of time than we could mentally. However, it requires different strategies of use from a purely mental calculation.

Spreadsheets require different planning skills - for example, organising grids and defining relationships between cells. And we need experience in using those strategies with those tools to get the most out of the process.

How AI tools work

Up to now, the tools we have used to aid our mental working may differ in the specific strategies needed, as above, but they have remained deterministic and transparent - ie, when given the same inputs, they reliably produce identical outputs through visible and explainable processes.

Our education system has been set up to enable the learning of basic metacognitive skills and specific strategies for these traditional tools, and to gain experience using them.

Generative AI, however, is fundamentally different from the tools we have used before. AI systems don’t produce identical outputs from identical prompts, nor can they explain their reasoning.

This opaqueness complicates the metacognitive work required of users in three critical ways:

  1. Planning becomes more challenging when capabilities change rapidly and outputs remain unpredictable.
  2. Instead of following structured steps, AI work becomes highly iterative - attempting, evaluating and refining prompts repeatedly.
  3. A single prompt can generate extensive content requiring thorough evaluation before deciding whether to accept, refine or discard it.
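
The attempt-evaluate-refine cycle in point 2 can be sketched as a simple loop. This is an illustrative sketch only: the `generate`, `good_enough` and `refine` functions below are hypothetical stand-ins (here, toy stubs so the example runs without any external service), not real AI tool APIs.

```python
def generate(prompt: str) -> str:
    # Hypothetical stand-in for a call to a generative AI tool.
    # A real system would return model output; this toy stub just
    # echoes the prompt in upper case so the loop is runnable.
    return prompt.upper()


def good_enough(output: str) -> bool:
    # The evaluation step: the human user (or a rubric they devised)
    # judges whether the output meets the need. Toy criterion here.
    return "BUDGET" in output


def refine(prompt: str) -> str:
    # The refinement step: adjust the prompt based on what was missing.
    return prompt + " budget"


def iterate(prompt: str, max_rounds: int = 5) -> str:
    """Attempt, evaluate and refine repeatedly until the output passes
    (or the round limit is reached)."""
    output = generate(prompt)
    for _ in range(max_rounds):
        if good_enough(output):
            break
        prompt = refine(prompt)
        output = generate(prompt)
    return output
```

The point of the sketch is that, unlike a deterministic tool, the workflow is a loop rather than a fixed sequence of steps: each pass demands a fresh evaluation and a decision to accept, refine or discard.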

These differences demand both deeper content knowledge and greater metacognitive flexibility from users.

The importance of content knowledge

Without understanding the subject matter deeply, it is not possible to specify with clarity what is needed from the AI tool or to evaluate the tool’s output.

And you cannot solve that problem by teaching strategies generically in an “AI class”: a student needs to know language and concepts specific to, for example, maths, coding, poetry or cooking in order to use these tools well.

So, AI is not going to necessitate a change in the basic role of school to ensure students “know” things.

But schools may need to adapt to provide students with the flexible metacognitive knowledge and experience needed to use AI tools.

Keeping up with the pace of change

For example, to use AI tools effectively as part of coding, students need to learn how to mentally plan a series of steps while weaving in these tools in different ways: generating code, checking for errors, brainstorming ideas, explaining a particular function, etc.

At each stage, they need to apply their metacognitive knowledge to decide how to break down a task into pieces, to strategically evaluate where they are and to plan next steps in response.

Then they also require their content knowledge to understand what’s been produced, evaluate potential risks and plan how to integrate pieces together in an expert way.

The trouble is, it is seemingly very hard to plan a curriculum to enable students to gain the metacognitive strategies they need and gain the experience of using them when the way that generative AI tools work is changing every few weeks.

‘No number of “AI skills” lessons could make up for a lack of understanding’

For example, in the past few months the biggest changes have come as the AI technology companies begin to shift from simple prompts and outputs (a straight question and answer) to multi-step systems that attempt to “reason” or work as “agents” (such as OpenAI’s Deep Research), going through multiple stages of activity that can include generating text, searching the web, analysing ideas, summarising and structuring outputs, and then finally producing extensive documents or large pieces of computer code.

But I would argue that these developments fundamentally present the same issues: no matter how capable these tools get, we need the metacognitive capability to know when and how to use them best to produce our best work - not replacing our own thinking and losing our critical faculties but using them to support and challenge us as we work, pushing our own expertise and capabilities to new heights.

So, how do we do it?

Educating with AI

The only way that we as a society can ensure we prepare the next generation with the metacognitive capability to use such tools is to make sure that students start to encounter such tools during their education at school.

But we need to do so carefully to make sure that they also have plenty of opportunities to build up their own expertise and content knowledge.

I think this is a real problem for evangelists of both the “no AI” and “all AI” positions in education.

If students never have access to any AI, they cannot possibly learn the strategies and gain the experience of working effectively with it. If we do not explicitly teach such tools, we leave it to chance whether students teach themselves as they go, and whether they do so effectively.

Digital disadvantage

History teaches us that a lack of explicit preparation tends to widen inequality, with those from more advantaged backgrounds better able to teach themselves what they need for the next step.

To avoid this, students need to be taught carefully to avoid superficial usage and to have sufficient capability to plan, execute and evaluate in an expert manner.

But if students are instead given unfettered access to AI at every moment, they will constantly outsource the generation and production of texts, codes, design and calculations.

The science of cognition is pretty clear here: without going through this thinking and the challenge themselves, they can never gain a deep understanding of the content.

And, without this understanding, they cannot become expert users of AI, able to guide the tool, discern the qualities of its outputs and plan how to either improve it or to then integrate the outputs back expertly into the broader project.

No number of generic “AI skills” lessons could ever make up for this lack of genuine understanding of the material being discussed.

‘No shortcut’ to literacy skills

Consider this in two areas: writing stories and writing computer code.

To help students write stories, we need to help them think about story structure, narrative, character and how the use of different words and sentence structures adds or changes meaning. They need significant experience of reading and writing different stories to get a sense of how these elements interact with each other.

Doing this with an AI tool introduces new demands: understanding how and when to engage with AI and understanding how to be precise about the task and how to break it down into steps or turn it into a prompt.

Should they ask for a whole story draft, or will they perhaps give it a character prompt and ask for five variations? Should they ask the tool for critique or maybe ask it to generate five images based on a description and use that to reflect on the descriptions or even use it as a seed of imagination to develop a storyline?

If we want the future generation to write great stories, then there is no shortcut to giving a good grounding in doing it themselves before then extending the learning to maintaining or even increasing quality with the use of AI.

‘We need every citizen to remain a critical, thoughtful and capable user of these tools’

As a second example, when we teach computer coding without AI tools, we teach students to break down a coding project into smaller tasks. We teach them to consider different algorithmic designs and approaches for efficiency, security and reliability.

Once AI is introduced, they need to work out how and when to outsource different stages and levels of this design and how and when to work with those elements.

Even if AI can produce a decent first attempt in a single go, it’s hard to build on this, integrate it into a wider project, be confident of its safety, security and efficiency or to further improve it when the student can’t understand what’s been produced.

“One-shot” coding by AI may be fine for the occasional hobbyist or quick-fire DIY coder but is unlikely to be sufficient in any complex work setting.

Planning for the future

In both cases, future AI tools will improve in quality and reliability and will be able to take on larger parts of tasks. But the underlying requirement remains: humans will need to work expertly alongside such tools, thinking both independently of them and collaboratively with them.

The idea of integrating AI tools into lessons in a seamless way throws up a huge number of challenges.

Our school system has historically under-invested in reliable digital infrastructure and support, which means that too many teachers and children experience unreliable and slow devices and connectivity.

They are often also too uncontrolled, with too many opportunities to distract the attention of students and teachers rather than focus it on key learning opportunities.

School leaders also receive very little training on planning, implementing and maintaining a reliable digital infrastructure.

Even if we get this necessary infrastructure and capacity right, it won’t be sufficient without getting an equitable balance of tools and support for all students.

On the one hand, we’d have to worry if the most advanced AI tools were only available to the schools with the wealth and capacity to buy and maintain the software and hardware necessary, leaving a generation of children in less well-off areas with experience only of outdated technology.

But on the other hand, we would also be alarmed if students in the wealthier schools were able to get a high-quality diet of specialist teacher interaction alongside AI tools while those in more disadvantaged areas were told that there wasn’t enough money for human experts and that they had to rely only on tech-based tutors.

The AI wave is coming

Finally, even with the right infrastructure and balance of tools, we can’t expect that every teacher will be able to work out themselves how to make this happen in the classroom.

We need to invest in research into best practice, carefully evaluating whether each approach increases or decreases students’ capability in both content knowledge and tool use.

Then we need to invest in embedding this learning carefully into curriculum plans and resources and also spend time supporting teachers to learn from this with sufficient professional development.

The AI wave is coming, no matter how unpredictable, and there is more to do to be ready. But above all, for me, the greatest risk of this AI technological wave is that we heed the siren calls to remove things from the curriculum if “AI can do it” - this feels precisely the opposite of the right approach.

We need every citizen to remain a critical, thoughtful and capable user of these tools, able to use them to enhance, and not replace, our core human qualities and humanity.

David Monis-Weston is AI lead at Purposeful Ventures. You can read his recent report, Exploring edtech and AI in maths teaching.
