‘Until you see someone go through this, you can’t connect with it’
Dan fiddles with the cast on his left arm. It’s there for arthritis and he claws at its end, where the blue material meets his hand. It’s partly a nervous tic, the clawing. Mostly, he does it when he thinks I am uncomfortable - or when he thinks I am about to be uncomfortable - with what he has said or is about to say. Over the course of our conversation, he journeys to the end of the cast with the fingers of his right hand again, and again, and again.
You see, our discussions are very uncomfortable. Dan - we can’t use his real name for security reasons - is one of the most experienced officers working within the Metropolitan Police (Met) Predatory Offenders Unit (POU), or the paedophile unit, as you or I might know it. What he sees every day is what most people would never want to see, what most people could not see without a severe and lasting impact on their mental health.
“I have to see the baby rapes, I have to see the abuse of children. Every day,” he says. “We get psychological support. But everyone has a limit. Some may break along the way.”
Over the past five years, Dan has witnessed a change in the pictures he sees. Where once it was clear that the children in the images were under duress, or adults were in the frame abusing them, suddenly there were self-generated images: children’s selfies. And not just pictures, but videos.
“These children are exposing themselves,” Dan explains. “There are no adults in the picture [and in some cases] there are children touching children. The only time you see an adult is when you see the little square in the corner of the screen - that’s the person they are talking to via the webcam, and that is usually a man masturbating.”
The internet changed things. Then broadband changed things even more. Not just how children could be abused, but the nature of the abused and the abuser. Now, anyone can become an abuser; now anyone can be abused; and now, often, neither sees themselves in those categories.
For some, online sexual abuse is now just “being a teenager” - it’s even, as one victim put it, “a bit of a laugh”.
This is why I was sat in an office with Dan, on the sixteenth floor of the Met building near Earl’s Court in central London. It’s why I spent three days shadowing the POU and its sister unit, the Sexual Exploitation Team (SET). It’s why I am writing this article.
The Met - indeed, every police force in the UK - is worried about online sexual abuse and the fact that it’s often a pathway to physical sexual abuse. The messages of warning to children are not getting through. The seriousness and the prevalence of the offences are not getting through.
The police want help.
They want teachers to help.
Dangers of Snapchat
Detective inspector Dave Kennett reads through the case sheet that his colleagues have prepared. On it are 11 crimes reported overnight within the Met’s jurisdiction, selected as cases that may fall under his team’s remit to investigate rather than that of the borough police. As head of the SET, he’s looking for a particular type of case.
“Exploitation: is there a power imbalance in the relationship?” he explains. “That could be because of age, because there are drugs or money involved, because the boy is captain of the school football team, because the boy is a gang member. The definition is quite subjective and it is up to us to try to interpret the law.”
Every single case on this list has an online element to it - the police include anything involving social media and the internet in this category. Nine out of the 11 cases originated on, or were facilitated by, messages on Snapchat. Several involve rape. Kennett ends up accepting all but one as his team’s responsibility.
This is normal, he says. The number of cases that make it past the borough police to his door averages about 10 per day. Often, it’s teenage girls being groomed by older men, but it’s just as likely to be peer-to-peer, where offender and victim are of similar age. Almost every case has an online element to it.
The victims in these online cases are getting younger, Kennett says. They are now regularly as young as eight years old.
To be clear: these are primary-aged children who are either groomed by strangers via their phones or in chatrooms, and who then send explicit pictures and videos of themselves to those strangers; or children who send each other explicit images and are exploited as a result.
Kennett reels off examples from memory. He has plenty of them. A nine-year-old girl groomed via Instagram, who sent naked pictures of herself to an adult male; an 11-year-old boy who was groomed in less than 20 minutes via Instagram, and sent explicit images of himself to a “girl” of 13 (in reality, a suspected adult male paedophile, though the case is unsolved); a 12-year-old girl who sent explicit images to another 12-year-old, which were then passed around the school.
These cases, and more like them, add to the numerous examples of older children who have been groomed, who have shared explicit images and video via social channels or chatrooms. There are also the cases in which children have sent images and videos voluntarily.
And then there are the cases where it is children - those under the age of 18 - doing the grooming.
“You get young offenders,” explains detective inspector Philip Royan, head of the POU. “In a couple of cases, I have found a vast array of imagery of young girls on a young person’s computer.”
It may surprise you that a child of 8 or 9 - maybe even younger - could be groomed via the internet in less than the time it takes for you to read this magazine. And it may surprise you that a predatory offender (ie, someone who seeks out such images and manipulates others to get them) can be a child. This does not fit with the stereotypes of older teens simply making mistakes or old men in flasher macs with bags of sweets in their pockets. But that’s the thing with online sexual abuse, says Kennett: you have to forget everything you thought you knew about it.
“Online is different [to old-style contact abuse],” he explains. “We need to get rid of those perceptions of contact abuse when we are talking about online exploitation.”
To do that, you don’t just have to start from scratch with how you envisage an offender, but also with how you imagine a victim. It’s about resetting what you thought was safe and what you feared. It’s about recognising just how messed up things have become - and taking some collective blame for it.
Becoming a victim
Every creation or distribution of a sexually explicit image, or video, of a child is a crime, but there are many different circumstances in which such a crime can take place.
At one end of the spectrum, you have two teenagers in a relationship who share explicit material of themselves exclusively with each other. That is a crime, but not one that the police will typically pursue.
“If they share images between them, and they are not shared to others and there is no exploitation, there is a crime, but who do we arrest? Who do we give a sexual offences record to?” asks Kennett.
“We are not in the business of criminalising teenage sexual discovery.”
A step up from that is a child sending an explicit image of themselves to another person with whom they are not in a relationship, and where there has been no grooming or exploitation - the receiver of that image is usually a member of their peer group. If the person receiving that image does nothing with it, then again it is unlikely to arrive on a crime sheet. But if that image is shared, you may begin to see some police involvement.
Then you have those cases in which someone is actively seeking explicit material. This could be between children of the same age. Kennett says this is often about blackmail, where the offender offers not to distribute the images in return for the victim (usually a girl) doing something she does not want to do: have sex or engage in some other sexual practice with the holder of the image; hold drugs or weapons for a gang; or provide access to other girls. It could simply be about bullying, too, he says.
The offenders may groom their victim, or they may get the images through other means (coercion, hacking, stealing from friends’ phones, etc).
Finally, on the far end of the spectrum are predatory offenders. Here, an adult offender will groom a victim to get an image or video, and then use that as leverage for further explicit material generation or, in the most extreme cases, to engineer a meeting for sexual contact with the child.
You may not be shocked by the first two categories. The idea that young people send each other explicit images of themselves is now almost accepted as part of growing up. Many adults do it, too. But Dan talks extensively about how the desensitisation to sexual imagery or acts leads to the problems we see at the more extreme end of exploitation - the latter two categories mentioned above. Porn is to blame, he says.
“Social attitudes have changed,” he explains. “In the past, if I wanted to see some porn, I would have to buy a magazine and I would have to use those images time and time again. Those images, which were very ‘vanilla’, would then make an imprint on how I viewed women as I grew up and how I viewed sex.
“Now, as a teenager, you have porn sites with hundreds of not just images, but full videos, and those videos are in different categories of sex. They may include threesomes, anal sex, bukkake, scat. That makes those things mainstream.”
Kennett explains that this leads to replication in teenagers’ own lives. “It is now the norm that sexual activity will take place between two young people and they will film it,” says Kennett. “It is the norm to take explicit pictures of yourself.”
Dan adds: “We see a lot of bathroom shots, guys and girls exposing themselves - these kids share them with each other.”
The omnipresence of porn and society’s reaction to it - along with the technology to make videos and images being in the hands of most children - has not just made the creation of sexual images and video normal, but made niche types of sex mainstream.
Even schools have stopped being shocked. Dan cites an example in which parents raised concerns at a school about their daughter sending explicit images to another student, who in turn distributed them to others in the year group. The response was: this is normal behaviour, don’t worry, it’s teenage experimentation, it will blow over, we have some great PSHE resources we can share…
Just think about the message that all of this conveys, he says. Think about how this acceptance, this normalisation, influences the behaviour of teenagers.
From a victim’s perspective
For example, let’s look at peer-to-peer abuse from the perspective of the victim, where the offender and victim are the same age or very close in age. You are 13 years old and a boy in your year (it is almost always a male offender) asks for an image of you exposing yourself. You do it, because everyone does it, right? He says you are pretty, he says he likes you. It’s just a laugh, just one picture.
And then you send it. And then he says he is going to send the image to your friends, to your parents. Unless…
Another example, and this time there’s a predatory offender: you’re on a social media platform or in a chatroom, or you are playing games online, and a message pops up from what seems to be a young girl or boy, and she or he is saying you look great, really pretty, and they want to see more of you - can you send them a picture? They’ll send you some pictures back.
You don’t know them, but everyone does this. This is normal, right? You send it. It’s just a bit of fun. You send more, and then they ask for videos and you send those, too.
If you’re lucky, your parents might spot the messages at this stage and report them, or a friend will. But when the SET officers intervene, they say the children are embarrassed, but they rarely perceive the seriousness of the situation.
“Because it is not physical, because it is all online, it is not real to them,” says Danielle Power, acting detective sergeant in the SET. “They just do not see the danger.”
She says she has dealt with many cases in which the victim even says they think the situation is funny.
If this activity does not come to the attention of the police, then it goes one of two ways, says Kennett: “The [offender] will admit to being a horrible 50-year-old and will say I am going to send these pictures to your mum or friends, and now I want you to do x or y. Or they just keep going, they just want the images, and they keep going.”
In one case the team worked on, the offender sent an 11-year-old child a video of what he wanted her to do. It was a sexually explicit video. And it was a video featuring another 10- or 11-year-old child.
Yet another scenario: you are 15 years old and someone contacts you to tell you they are a modelling agent. They tell you that you could be a model. They ask you to send them some images. They tell you it needs to start with some glamour stuff. And you agree, it can’t do any harm. Then they tell you it needs to be porn - it’s great money, a great place to start out, everyone does it. No one will see it, it will be distributed abroad. And you do that, too, because those videos - you’ve seen plenty of them - seem harmless. Maybe even glamorous. You’re safe in your room - they can’t get you. Everyone does it.
Power had a case like that. The “modelling agent”, posing as a female, was a man in his twenties living with his parents. He’d tricked countless girls into creating explicit material.
Sometimes the approach is less subtle, she adds. “A man was getting girls to strip on a website called AdultWork - he handled everything, including the payment from the website that was meant to go to the girls,” she recalls. “Rather than the girls being paid after 28 days from the website, he would pay them himself in advance. And he would give them extra money, so saying something like, ‘Here is 500 quid for a holiday.’
“He would then say ‘You now owe me £700’ and tell them they had to be recorded having sex with him, so they could upload the video in order to earn the money back to pay him. One victim was 16. [The offender] put a deposit down on a flat, he gave her drugs, and the videos just got worse and worse. It was violent, extremely explicit.”
One final example: you are in a chatroom and a message pops up. You open it to find a video of a man masturbating, asking you to expose yourself. (This is what some offenders do now: it’s a numbers game, and if they don’t get a hit, they move on. After all, there are plenty of other girls on social media.) It’s no longer shocking to see a man masturbating on screen because you’ve seen material like that before. You may have even seen classmates doing it. And it’s normal to expose yourself, too, right? It’s only online. It’s harmless. It’s only a bit of fun.
“Sometimes the kids watch and think it is a laugh, and the person will ask them to expose themselves, and often they do,” says Dan.
Talking about the risk
Then he tells me about large groups of children creating explicit material together. He saw something recently that shows just how bad things have become.
“The most I have seen is nine girls in the same room,” says Dan. “And they are kissing each other, doing things to each other, while a man masturbates on his webcam. And the girls are saying things like ‘Yeah, go on, look, he’s wanking’, and then they are saying ‘wank over me’, and they are taking their clothes off and touching each other. These are girls aged 9, 10, 11, 12.”
He tries to find the words to describe just how horrific this was, but gives up.
And then, after a long pause, he says: “We really need to get a grip on this.”
We’ve told children to be careful. We’ve told them more than once. Some children may even parrot the warnings back to you, those tales of danger, snippets of advice, those stories of when things go wrong. But how much have we really communicated? How much detail have we really gone into - as parents, as teachers, as a society?
Because children don’t seem to realise that the pictures and videos they send to a boyfriend - or a would-be boyfriend, an online groomer or a random man on a social media platform - could end up all over the internet, shared between paedophiles on specialist websites, on bestiality websites, on gore websites.
Dan, Kennett, Power and the others have to sit through graphic videos of people having sex with animals, and they have to trawl through videos of things such as Mexican drug cartels decapitating a rival with a chainsaw. Because that’s the sort of site on which paedophiles hang out. That’s the sort of place where they swap selfies and videos. It’s where that sort of thing is acceptable.
Children don’t seem to realise you can never delete the images, that they exist everywhere and anywhere simultaneously - that they may resurface at any time. Dan explains that the same images crop up over and over again. The police delete them, they resurface - it’s a cycle that never ends.
Children don’t seem to realise that it is not all normal, the things they see in porn, the things they do to each other, the things they send to each other. They’ll realise later, when they get into a real relationship. They’ll recognise how serious it really was. But now?
Children don’t seem to realise how much information about themselves they are giving to an offender. “They will talk about who their friends are, where they go to school, where they have been. They even send pictures of themselves in their school uniforms,” says Kennett. And they don’t know that when they send an image or video from one phone to another while the location settings are on, some simple software can tell the offender exactly where they are located. They don’t know that all of this might mean that the offender knows where their bedroom is, that the offender can watch the bedroom, that the offender can see when they leave, when they arrive, when they are alone.
Finally, children don’t seem to realise that simply sending a naked photo of themselves to a stranger, or to someone they thought was a friend, someone they thought they loved, might eventually result in a rape. It might end in a situation where they want to take their own life. It might, in the most extreme cases, eventually result in murder. It’s rare, but it’s not as rare as you might think - or hope.
“Self-harm and suicide are a risk. There are many documented cases of this,” says Kennett. He urges people to watch a video message by Canadian teenager Amanda Todd (bit.ly/AmandaToddYouTube), who went on to commit suicide after being exploited online.
But maybe children do realise it. Maybe they are aware of everything that has been documented in this feature. And maybe they still do it because they don’t believe they will become one of Kennett’s cases, one of Dan’s cases, one of Power’s cases.
So some blame the kids. The attitude that we as adults did all we could, that this was unstoppable, even that “she was asking for it”, is far too common, says Kennett. “The victim-blaming can happen across the board - police, social services, schools - and we guard against that, we watch for it, we make sure that does not happen here,” he says.
Taking responsibility
But Power still sees it in the eyes of juries, when a girl is kicking off while on the stand, full of bravado, full of anger, full of “this has not affected me, I don’t care”. The jurors don’t see beyond that.
And we are quick to blame the social media companies, too: they let this happen. They provide the link. If it wasn’t for them, our children would be safe.
But Dan says it’s not the fault of Facebook, Snapchat, Instagram or anything else.
“Yes, there is a moral obligation [for social media companies to help], and many do actually help us, but are they the problem? In reality, how can Google really control it?” he asks. “We have to get out of the blame culture. We are blaming a commercial organisation for human behaviour. We are taking aim at the wrong place.”
And it’s definitely not the fault of the children, he says: “Something has gone wrong. And it is not the children’s fault. We have to fix this, not blame them.”
It’s no one’s fault but our own, he stresses. So what can we do?
First we need to recognise that there is no profile of a victim: they could be as young as 5 or 6, and they may not be a vulnerable child at all, but the top-set girl with all the friends and all the confidence.
And then we have to recognise that we have opened the door to an environment in which children think all of this is OK, and we need to find a way to close it. Or at least to manage that environment.
“You would not send a child to the park on their own with no advice,” says Dan. “You would not let them cross a road without advice. We warn them about strangers. And for all these things, we drill them on the rules, on safety until we are satisfied they are safe. And even if the parents do not do that, there is extensive advice about all that in school. We have all this covered. But the internet? Smartphones? We just let them do it.”
He says we all need to talk more openly about the dangers, as well as the need to scaffold internet access; and we should restrict use of phones until we know children are as safe as they can be.
We also need to be unafraid to talk about sex and porn, highlighting the myths, explaining that all those videos on all those porn sites are not reality, and that this thing you watch can create multiple issues, that you can even become addicted.
“We need to get over the embarrassment,” says Dan, “to talk to children about pornography and say to them, ‘You are going to see things that are not normal - threesomes, being tied up etc - and these are not things everyone does, but porn sites make it seem like they are.’ We can’t do that if we are too embarrassed to talk about sex.”
And we should not hide away from what our children can be exposed to.
“The information has to be age-appropriate, age-relevant,” says Dan. “We should be warning children at all ages, but doing it in the right way.”
You might argue that this is not the job of a teacher. The police have a lot of time for that opinion.
The officers at the Met have all sat their own children down - they have restricted their internet access, scaffolded their knowledge of the net and talked to them about porn.
“My son would definitely say I am overprotective,” admits Kennett. But he says sometimes teachers are the only ones young people will listen to.
“The teachers, I think they are in a very difficult situation,” he says. “It takes time to learn all this stuff - time they don’t have. This is not their job, this is the job of parents. But the problem - or the reality - is that it is often the teacher that is the most trusted person in a young person’s life.”
He adds that they may be the only person willing to have this conversation, too.
But it’s just words, right? After all, schools have tried. Teachers have got great PSHE resources, great safety advice in the computing curriculum. They’ve done their absolute best.
It has no effect.
Well, ask yourself, says Dan, did you really mean it? Did you really understand it? Did you go far enough? Did you feel it?
“It is not about sex education, about internet-safety advice, about firewalls on the school wi-fi,” he says.
It’s not about a new SRE curriculum.
The officers stress that words are not enough on their own.
An emotional connection to what you are talking about is key to children understanding how serious this is, as well as keeping them safe.
This is why this feature has detailed so many cases and why the details have been so graphic. It is why we have given Lorin LaFave four pages to tell the story of how her son, Breck, was groomed and murdered. And it’s why the Met has opened up to us.
If teachers and parents - society as a whole - do not understand how serious the danger is, if they do not understand what really happens and if they don’t get scared - if they don’t fear the pain of this happening to someone they love - then how can we expect children to take it seriously?
“Is it put across in schools in a procedural way? Do [teachers or parents] do it just because they have to? Without that emotional connection - without that deeper understanding - I don’t know if it will have an impact,” says Power.
“Until you see someone go through this - until you see the videos, read the grooming messages, view the videos that are extorted out of these girls, see what can happen next - you can’t really understand it, you can’t connect with it and you cannot be passionate about it.
“That’s the problem: as a society we can be very similar in our reactions, as the girls are - it’s online, no harm done. But it is harmful. It ruins lives.”
Jon Severs is commissioning editor of Tes. With thanks to the Metropolitan Police - and DI Kennett, acting detective sergeant Power and ‘Dan’ in particular - for their time, cooperation and for enabling this feature to take place