“My search for the boy in a child abuse video”

Give One For Love

By antiplondon

One day, just after I had dropped my son off at school, I was sent a horrific video on WhatsApp. It made me question how images and videos of child sex abuse come to be made, and how they can be openly circulated on social media. And I wanted one answer above all – what happened to the boy in the video?

It may sound strange, but the woman who sent me the video was a fellow mum at the school gates. A group of us had set up a WhatsApp group to discuss term dates, uniforms, illnesses.

Then one morning, out of the blue, one of these mums sent a video to the group, with two crying-face emojis underneath it.

It was just a black box, no thumbnail, and we all pressed play without thinking. Maybe it would be a meme or a news story. Maybe one of the “stranger danger” videos some of the mums had started to share.

The video starts with a shot of a man and a baby, about 18 months old, sitting on a sofa. The baby smiles at the man.

I can’t describe the rest.

If I tell you what I saw in the 10 seconds it took to grasp what was happening, and press stop, you’ll have the image in your head too. And you don’t want it. It’s a video of child sex abuse. It’s nine minutes long.

I screamed, and threw my phone across the room. It was pinging with messages from distraught members of the group.

I took my phone to the police station. I told them what had happened. I told them I believed the woman had sent it to us as a warning, and that I hoped they would investigate where the video came from. Was it new, or one they’d already come across? Was this little boy still in danger? Could this evidence help save him, or catch the abuser?

The police had my phone for two weeks. I found out the next day that they arrested the woman who sent it and visited other members of the group. And then I didn’t hear anything else about it.

But one question stayed with me. What happened to the boy in the video? And so, a few months later, once I could read my own kids a bedtime story without thinking of him and life had got back to normal, I began to look for answers.

I started by trying to speak to the police officer investigating the video on my phone. But every time I called Wembley CID to speak to him, he’d just gone out.

He didn’t want anything to do with me.

I checked with Alan Collins, a lawyer who specialises in child sex abuse, to see if any of the things I might normally do to track down people would work. Could I, for example, send former police officers a copy of the video to see if they recognised it?

“You could be looking at a prison sentence of 10 years,” he told me. Same goes for taking a still and sending that. Just possessing an image like this on my phone could land me in jail.

So I called a friend of a friend who used to work for the police. He told me Wembley CID would have sent my phone off to one of the digital forensics labs spread across the city. The labs list all the illegal content, and when it’s child sex abuse they grade it: Category A for the most serious, Cat B, Cat C. This WhatsApp video was Cat A.

Next, the file goes to victim identification; my case was passed to the Metropolitan Police’s Online Child Sexual Exploitation and Abuse Command. An officer there, Det Sgt Lindsay Dick, agreed to talk to me, but he didn’t want to say much about the techniques used in case it helped offenders work out how to evade capture.

He did tell me about one case, where an officer had got hold of a phone that had images of a boy being abused on it, along with images of the same boy not being abused. In one, he’s standing at a bus stop in school uniform. An officer recognised the bus stop as a Mersey Transport sign, and put a call in to the Merseyside team. They recognised the school uniform. The boy was identified, his parents arrested, and social services took over. Victim-identification police all over the world rely on little clues like this.

Lindsay Dick wouldn’t discuss the details of what I’d been sent, even though he had investigated the case. Then, when I asked him about a suggestion from an editor to take a still from the video of the perpetrator’s face, to help identify him, I started to feel some heat.

“Do you still have a copy of that video?” he asked me, sternly. “No,” I replied. But it was still sitting somewhere on WhatsApp’s server, and because I was still a member of the group, it was still showing on my phone. Even though I’d done nothing wrong, I realised how seriously the police took this kind of thing.

This hit home late last year when a senior Metropolitan Police officer, Supt Novlett Robyn Williams, was sentenced to 200 hours’ unpaid community work and threatened with losing her job for failing to report a video of child sex abuse her sister had sent her on WhatsApp. (She is now appealing against the conviction.)

The Metropolitan Police refused to help me any further in my search for the boy in the video. At one point they even told officers in another part of the country, incorrectly, that I’d been cautioned for sharing the video.

I found out later from the woman who sent the video to me that she had been given three years on the sex offenders register. But the investigating officers at Wembley CID took the case no further – they didn’t arrest the friend who had sent it to her, and they didn’t even try to find out who had sent it to her friend. Further up that chain of people sharing the video must be some dangerous people, perhaps an abuser. But nothing was done to follow the trail.

The Metropolitan Police says: “The scale of child abuse and sexual exploitation offending online has grown in recent years. This increased demand on police, coupled with the need to keep up with advancement of technology and adapt our methods to detect and identify offenders, means it is a challenging area for the Met and police forces nationally. However, we remain committed to bringing those who commit child abuse offences online to justice, and safeguarding victims and young people at risk.

“We encourage anyone concerned about a child at risk of abuse or a possible victim, to contact police immediately. Anyone who receives an unsolicited message which depicts child abuse should report it to police immediately so action can be taken. Images of this nature should not be shared under any circumstances.”

I needed someone who wasn’t involved with the case to give me some more clues about where this file I’d been sent might have come from. So I started searching, and I came across news articles about a team in Queensland, Australia, with a reputation for infiltrating child abuse video-sharing sites.

Their head of victim identification, former Greater Manchester Police detective Paul Griffiths, told me the file I’d been sent had probably started life on one of these sites.

“What tends to happen is that when a file gets produced like that, it generally stays under cover, under wraps, circulating amongst a fairly small, tight network. Very often people who would know that they need to keep it safe and not distribute it widely,” he said.

These networks of paedophiles use the dark web, a part of the internet that isn’t readily indexed by search engines such as Google. They access sites through software called Tor, short for The Onion Router. Tor bounces their traffic through several relay servers dotted around the globe, masking their real IP address and making their location extremely hard to trace.

Members of these dark web sites are like sick stamp collectors – they post thumbnails of what they have on dedicated online forums, and look to complete series, usually of a particular child.

Some of them are “producers” – they abuse the children, or film them being abused.

A couple of years ago Paul Griffiths’ team was watching one site called Child’s Play. They had intelligence that two of the site’s leaders were meeting up in the US. Officers intercepted them, arrested them, and got their passwords.

Now they could see everything – each and every video – and they could get to work finding children and perpetrators. They made hundreds of arrests worldwide, and 200 children have been saved so far.

“It’s Sherlock Holmes stuff, it’s following little clues and seeing what you can piece together to try and find a needle in a haystack,” says Griffiths.

The big worry now is live-streaming, where adults can pay to watch children being abused in real time. It’s even harder to detect, because no file containing clues circulates, and the platforms are all encrypted. Just as the police and technology get better at finding victims in stills or videos, another threat emerges.

“There’s a famous story and it often gets told in relation to this area of crime, in relation to the young girl walking on the beach and there’s starfish all over the beach and she’s picking the starfish up and putting them back into the sea and a guy says to her, ‘Little girl, what are you doing? You’re never gonna be able to save all of these starfish.’ And she says, ‘No, but I’ll save that one.’ And that’s really what we’re doing,” says Griffiths.

“You know, we’re saving the ones we can save. And if some magical solution appears somewhere in the future that’s going to save all of them, that’s going to stop this happening, then that’ll be wonderful. But in the meantime, we can’t just sit back and ignore what we know is happening.”

Paul Griffiths is part of a small network of people who travel the globe for meetings and conferences on what to do about the huge numbers of videos and images circulating online.

He told me to contact Maggie Brennan, a lecturer in clinical forensic psychology at the University of Plymouth, who has been studying child-sex-abuse material for years. Between 2016 and 2018 she combed through the child-abuse images in a database run by Interpol, to build up a profile of victims.

She found a chilling pattern that suggested the age of the boy in the video I saw is not that unusual.

“Concerningly, there is a substantive, small, but important proportion of those images that do depict infants and toddlers. And we found a significant result in terms of the association between very extreme forms of sexual violence and very young children.”

Like the boy in the video I was sent, most children on the database are white – most likely a reflection of the fact that the police forces contributing to it are from majority-white countries.

There’s constant pressure, Brennan says, to quantify the numbers of images or videos that are in existence, and the numbers of victims who are being sexually exploited. But it’s impossible. Databases only hold the images that have been found, through police raids or reports. Who knows how many are circulating out there?

Paul Griffiths says it only takes one person to bring a video out of the depths of the dark web and unleash it on the general population.

“Sooner or later it comes into the possession of someone who either doesn’t know how to keep it safe and hidden, or doesn’t really care. And they spread it wider. It can take a few hours, and it’s all over the internet.”

I spoke to one offender who served seven months in prison for viewing child abuse images. He had been offered the files on Skype during an adult online sexual meet-up. He’d opened the first file, seen it was of a child – and carried on opening all 20. Then he tried to share them with someone else. Eventually, the man who sent him the files sent them to someone who told the police. But it’s a telling example of how easily files like the one I was sent spread, from the depths of the dark web, on to platforms like Skype, and then to people’s phones.

Despite the lack of action taken on my case, the UK policing response to child sex abuse images is one of the most robust in the world.

The Child Abuse Image Database (CAID) has seen huge investment over the last five years. When detectives receive the phone or laptop of a suspect, they can run images on it through state-of-the-art software that checks whether images are new, or already known to police. All police forces are linked up, and the database talks to others around the world.

In the 1990s the Home Office undertook a study of the proliferation of indecent imagery of children. There were fewer than 10,000 images in circulation then. Now there are almost 14 million images on the UK database.

The levels of depravity in videos and images are getting worse, Chief Constable Simon Bailey tells me. He’s been the National Police Chiefs’ Council’s lead for child protection and abuse investigations for the last five years.

I am expecting a forbidding character when I go to interview him at his Norfolk HQ. What I find is a man at the end of his tether.

“It just keeps growing, and growing, and growing,” he says. “And there is an element of, ‘These figures are just so huge that just can’t be right.’ Well trust me, it is right. And if I have one really significant regret around my leadership and our response to this it’s that we have struggled to land with the public the true scale of what we are dealing with, the horrors of what we are dealing with. Most people, I would like to think, would be mortified that this type of abuse is taking place.”

Lucy Proctor, BBC website, full article here (and I would recommend reading the whole thing, even if it is tough going).