AI chatbots are a fact for pupils - so how do teachers respond?

Artificial intelligence has huge potential to ease teacher workload, but pupils are already using chatbots to cheat and government guidance is desperately needed, MSPs were told
14th December 2023, 11:20am

It took physics teacher Chris Ranson less than half an hour to create a chatbot to mark S1 investigations.

Artificial intelligence, he says, has huge potential to ease teacher workload - from helping teachers generate ideas for lessons to supporting them with administrative tasks like writing reports on pupils’ progress for parents.

But while there are “massive opportunities”, there are also “significant threats”, he told MSPs yesterday.

For example, Mr Ranson said he had also used AI to tell him how to make a weapon using the chemicals in the chemistry department.

And if he could create a chatbot, so too could pupils - but they could use it to complete coursework and homework by asking it to mimic the answers a teacher might expect to get from a Scottish pupil at their particular age and stage.

AI in education

Creating a chatbot sounded like a complex task, Mr Ranson said, but in reality it was “very simple”, and in the coming months it would likely become even more straightforward as apps were developed.

Mr Ranson, who is also the AI lead in his school, Dunblane High, said: “I built a chatbot. It took me about 20 minutes. I’m not a tech guru - I don’t have an IT degree or anything like that. I just went through a simple process.”

Already, students were using AI to do work that they then passed off as their own.

“Before the summer I had pupils telling me: ‘Yep, I’ve been writing essays for English. I typed the question into ChatGPT, it spat something out, I handed it in, and I got it marked and it was fine.’”

The upshot, he said, was that assessment needed to change to adapt to the presence of AI.

He suggested that vivas - the oral exams used by universities, in which students talk about their work - could be brought in as an assessment technique in schools to check pupils’ understanding.

Mr Ranson made his comments in a session on AI and education at the Scottish Parliament’s Education, Children and Young People Committee. Among those giving evidence, there also appeared to be a consensus that getting rid of coursework and relying increasingly on exams was not the solution to the challenges posed by AI.

The Liberal Democrats’ education spokesperson, Willie Rennie, highlighted that, while the final report of Professor Louise Hayward’s review of assessment and qualifications, published in June, had recommended a reduction in external exams, the rise of AI had led some to suggest that they were the safest form of assessment as they were “isolated away from the technology”.

However, Professor Judy Robertson, chair of digital learning at the University of Edinburgh, said that would be “a retrograde step”.

She argued that upping the use of invigilated exams in a bid to block the use of AI would be “a panic step”.

“It’s not what we want to do, educationally speaking; it might be convenient, but it’s not going to take us where we need to be,” she said.

How can AI support assessment?

Ollie Bray - a strategic director at Education Scotland and a former secondary headteacher - told yesterday’s committee meeting that the notion of handwritten exams was “not palatable in 2023”.

He said more thought should be given to how technology could support assessment and “completely reimagining what assessment might be” using AI.

Helena Good - director of Daydream Believers, an organisation seeking to put creativity at the heart of education, with a presence in more than 30 Scottish schools - thought that placing even more emphasis on exams would be “a backward step”.

She said that “sitting in an exam hall regurgitating a set of facts, on a certain date, at a certain time in the year” was not giving students the “uniquely human” skills they needed - like creativity, problem solving and critical thinking.

To prepare pupils for the future, the panel called for the curriculum to be updated, for professional development for teachers and for government guidance on the use of AI in education.

Mr Bray called for more emphasis on the use of technology, including AI, in the curriculum.

He said: “We have to use curriculum reform as a springboard to try and get some of this important stuff into the curriculum.”

Understanding technology

The expert panellists highlighted that generative AI suffered from “hallucinations” - sometimes, the technology just makes things up - and also from bias.

Tools such as ChatGPT, said Mr Bray, draw their responses from the internet, where most of the articles tend to be in English - so written from a European or North American perspective - and also “male driven”.

Professor Robertson argued that pupils needed to understand how AI works and be “aware of the limitations of AI and why they shouldn’t trust it”.

She said children tended to “over attribute intelligence” to AI such as Alexa - Amazon’s voice-controlled virtual assistant - and that there were things they should be worried about but were “simply unaware of”.

“We need to work out what is an age-appropriate level of understanding about how AI works for the different ages and stages,” she said.

However, she also said a lot of schools’ work on digital literacy was relevant and just needed to be adapted for “the AI context”, including being able to “evaluate information and sources” and to gauge the importance of “responsibility and respect” online.

She said there was a real need for guidance from the government on AI in education. There was currently a “vacuum”, which was “slightly worrying because things are moving so fast”.

Government guidance on AI

Mr Bray meanwhile called for the recommendations in the review of assessment and qualifications in relation to AI to be taken forward.

The final report said the Scottish government should “as a matter of urgency…convene and lead a cross-sector commission to develop a shared value position on the future of AI in education and a set of guiding principles for the use of AI”.

Mr Ranson - who would not recommend pupils using generative AI in the classroom because he “did not trust the source material” - said he had assumed, when he started exploring AI tools before the summer, that government guidance on AI in education would be imminent, but nothing had transpired.

He said mandatory in-service training for teachers in AI was needed “even just so that teachers are aware” of its potential impact.

“You’ll still have loads of teachers who just have no idea,” he said.

However, when teachers realised the potential of AI to “truly change the way that we do education”, they would embrace it, he said.
