Nude and semi-nude images and AI risks: safeguarding guidance

Updated advice for schools from the UK Council for Internet Safety covers artificial intelligence risks, as well as device seizures and the use of external speakers
14th March 2024, 11:19am

Earlier this week the UK Council for Internet Safety updated its document, Sharing nudes and semi-nudes: advice for education settings working with children and young people.

Within it is some notable information on the scale of the issue and new guidance around artificial intelligence.

Given the rapidly rising concerns about this issue in particular, this is an invaluable document for school leaders and designated safeguarding leads to be aware of. Here are the updates you need to know:

1. The scale of the problem

Insight into the prevalence of nude image sharing is now included in the document, drawing on research.

Most notably, it finds that 20 per cent of girls and 13 per cent of boys aged 15-18 have sent a nude picture or video of themselves, while 60 per cent of girls and 31 per cent of boys have been asked to share a nude or semi-nude image.

To put it into context for a teacher, this means that on average, one or two students in each class have shared a nude picture at some point and even more have been asked for one. Clearly, then, this is an issue that must be taken seriously by schools, especially safeguarding leads.

2. The dangers of new AI-generated material

In recent months, the proliferation of AI content-generating software has made it increasingly easy to create sexually explicit material.

In light of this, a new paragraph has been added on page 15 of the document, making it clear that generating such images is illegal under the offence of creating indecent images:

“The term ‘indecent images’ also includes pseudo-images, which are computer-generated images that otherwise appear to be a photograph or video. These may be created using tools such as photo/video editing software, deepfake apps and generators…and AI text-to-image generators.”

There is also helpful advice for schools about what to do in cases of AI-generated images and on making it clear to pupils that cases involving images like this will be treated in the same way as real images. There is also a useful link to the Take It Down tool, which can help to remove or limit the spread of nudes or semi-nudes shared online.

The new document also contains updated case studies on page 54 that include AI-related incidents, which could be helpful to get DSLs thinking about the challenges they may face.

3. Device seizures

The document has new information on what will happen to a young person’s phone if images are found and the police are called.

It explains that in lower-risk cases (if sharing is consensual, for instance), young people “should not have their devices seized as a matter of routine”.

In more serious cases though, “officers will consider the least intrusive way to acquire material - considering the impact of removing a child’s device balanced with ensuring that evidential integrity is maintained”.

This is particularly useful information for school leaders, so that they know what to expect from the police.

The document also notes that schools should make young people aware that if a device is seized, they may not get it back.

“It is unlikely that it could be returned unless the images were solely of the owner of the device. This is because it is difficult to remove all image traces from a device and returning a device could possibly be interpreted as supplying an indecent image.”

4. External speaker checklist

Many schools are using external speakers to deliver lessons on aspects of the PSHE/RSE curriculum, including the sharing of nude and semi-nude images.

The guidance contains a new checklist of questions that school leaders should be asking themselves when using external speakers:

  • Are you clear about why an external visitor is going to benefit your education approach?
  • What will they be able to deliver that cannot be achieved by staff alone?
  • What long-term impact will the input have on your setting’s community?
  • Are you confident the external speaker has the required skills and knowledge to present the information to the selected audience?
  • Where do they get their information from, what is their evidence base, and have you checked to ensure the person delivering the session is not planning to use fear/scare tactics and/or victim-blaming approaches?

While the new additions are clearly of note, overall this is a document that anyone in a designated safeguarding lead role, or with wider safeguarding responsibilities, should read and keep close to hand, not least because concerns around AI mean that cases where indecent images are produced and shared are likely to increase.

Luke Ramsden is deputy head of an independent school and chair of trustees for the Schools Consent Project
