The Story Behind the Numbers

One of the strengths of the project is the direct involvement of Dutch and Belgian law enforcement as project partners. Yves Goethals, the Judicial Commissioner of the Belgian Federal Police, talks about the challenges they face when processing NCMEC reports.

What role do NCMEC reports play for the Belgian Federal Police?

NCMEC sends us a high number of reports, which means an increased workload for law enforcement. It is important to keep in mind that the social media platforms which create the NCMEC reports are not legally liable for the content if they report it. This means that the platforms often report anything that might be illegal, just to stay on the safe side. Therefore, not all NCMEC reports lead to a police investigation. In Belgium, for instance, only 40% of the NCMEC reports we receive contain content that is against Belgian law. But if you take into consideration that we receive roughly 15,000 NCMEC reports a year, that is still quite a high amount of illegal content. The consequence of this high volume is that about 70% of our capacity goes towards processing NCMEC reports.

Does the way the CSAM is reported have a significant impact on how law enforcement can process it?

Yes, absolutely. The high quantity of reports makes formatting a crucial element of the processing time. If we only received 10 reports a year, the format would not matter as much. But with 15,000 NCMEC reports every year, we are unable to go through each of them manually and quickly, and we must rely on tools that help with prioritisation. These tools, however, will not work properly if reports are formatted inconsistently.

Isn’t there a requirement for ESPs to use the same format when they create a report?

Not at the moment. ESPs report whatever they think is illegal and include as much information as they have, or want to give, about the uploader. Some reports contain only minimal information and a notice from the ESP to contact them if more is needed, while other reports contain much more information than necessary. This is an issue that will hopefully be solved by the new European legislation, which should lead to more uniformity between reports and make clear to ESPs what needs to be reported and how it should be reported.

Have you seen an increase in CSAM, in general, this year?

To me the number is not important, it’s the content that counts. We can’t really control the number of reports, and it’s hard to say whether there was an actual increase. Before the derogation law came into effect in October 2021, for example, we saw a noticeable decrease in NCMEC reports coming from Meta. Once the derogation law passed, Meta sent all of the reports it had held back in one go. This led to a large increase in reports received, but that doesn’t mean there was an overall increase.

Did the global COVID-19 pandemic have any effect on the rise of these numbers?

The pandemic definitely had an effect. Child Focus, the Belgian hotline, noticed an increase of at least 30% in reports within the first 5 months of the pandemic alone. Children were at home more and spent more time online. During the pandemic we received more notifications from schools worried about vulnerable children who were being contacted from all over the world, often through VPNs, and asked to send sexual images of themselves. This is the unfortunate consequence of children spending more time behind a computer. But now that the pandemic is more or less under control, we don’t see a decrease.

Did the Belgian police experience a high increase in NCMEC reports in 2021?

Not really; for us the peak of incoming CSAM reports was 2 years ago. But I do want to stress here that these statistics do not always represent the actual number of cases. For instance, we recently received over 500 reports on the same suspect within a 5-month period. This means that 500 reports led to only one police investigation. Thankfully, it is a truly extraordinary case and the highest number of reports we have ever received for one suspect. But it clearly shows how statistics do not always show the full situation. We often receive between 3 and 15 reports on one suspect where the only thing that differs between the reports is the image, while date, time and place are the same. This also has a clear impact on statistics. I can imagine that if this happens in Belgium, it happens everywhere else. Therefore, those 29.3 million reports can easily be reduced to a much smaller number of actionable reports for the police. But no matter how you look at it, the total number of reports is very high.

Have you identified any noticeable trends in CSAM in 2021?

I am very worried about the increase in self-generated content by young people. We also clearly see that the victims are becoming younger and younger and the content itself is getting more violent and extreme. 20 years ago, if you went to a fair, that was exciting. But nowadays, unless an attraction can turn you upside down and move really fast, it is considered boring. Our society nowadays wants more excitement, more extremes and this translates back to child abuse as well. The victims becoming younger could be caused by the need for control. Sexual gratification is not always the reason for child sexual abuse, control plays an important role. Our children are becoming more vocal and a 12-year-old will not always be controlled easily anymore. This could be part of the explanation for why abused children are becoming younger, but we can’t be sure.

The European Commission proposed new legislation that would require social networks to report and remove CSAM from their platforms themselves. What impact would that have on law enforcement?

We will have to wait and see how the new legislation will be implemented before we know its true effect and impact. I hope the social networks won’t get to decide how they handle things, because their decisions can be strongly influenced by financial considerations. What we need are strict and honest rules that apply to everyone. Not only European-based companies, but any company doing business in Europe should have to comply. Once the new legislation has been adopted, it will also take time for each country to implement it into its own legislation. We have seen this take years in the past, so let us hope it goes a lot faster this time. And then we have to wait and see how the smaller platforms will handle the new legislation. Everybody has to stop at a red light, but unfortunately, that doesn’t mean everybody does.

What are the main challenges when it comes to processing CSAM reports?

Police capacity, the balance between privacy legislation and crime-fighting, and the enormous number of CSAM reports are the main challenges. A crime is being reported and we have the obligation to investigate and report to the prosecutor’s office. But this takes a lot of our capacity, up to 70% in the Child Abuse Unit right now. Restoring the balance between privacy and crime-fighting is a hot topic. We can’t stand still in the development of tools such as AI, because that would leave us so far behind criminals that we would never be able to catch up. Most people on online platforms are doing nothing wrong, but we need to be able to protect them. If a store gets robbed, the police receive a description of the perpetrator and start an investigation. Nowadays, crimes increasingly happen online, and we need the tools to look for perpetrators there as well. The police have no interest in private pictures of your children. But if somebody hurts your children, we need to be there. The only way to handle the ever-growing number of CSAM reports at the moment is accurate prioritisation.

What is a case that you consider a success?

In 2017 a man was caught posting an illegal picture on Facebook. NCMEC sent the report to Europol, who passed it on to us. When we investigated, we found a large number of encrypted hard drives. This led to an investigation that, in Belgium alone, ended with 5 arrests. One of those arrested was a man acting as an administrator on the dark web, who was offering his son for sexual abuse to others. This case led to 300 intelligence reports worldwide, all because one man posted that picture on Facebook. This is why you can’t prioritise on the basis of ‘this is only one picture.’ It can lead to much more.

What is data retention time and how does it influence the processing of CSAM?

Data retention time is the amount of time an ESP is obliged to retain data and make it available to the justice system or the police. In Belgium, we are quite lucky: the retention time for possession is 6 months and for distribution it is 9 months. But there are countries where the data retention time is almost down to zero. If you take into consideration that some NCMEC reports concern events that happened 6 months earlier, you can see how that can be troubling. That’s why it is so important to consider the remaining data retention time when prioritising reports.

What role does AviaTor play for the Belgian Federal Police at the moment?

AviaTor is very efficient when it comes to prioritisation, which is the main reason we use it. You can put your own parameters into AviaTor and have it prioritise based on those, and I think it does that really well. I don’t want a tool that does 100 things but none of them very well; I prefer a tool like AviaTor that is really well built and does a couple of things very well. The AviaTor project is a unique, limited-scale project with a small group of people on board. In my opinion, this is the best way to work. Instead of having meetings with up to 100 people at a time, as you would on bigger projects, we can come together in smaller groups and discuss items much more easily and quickly. If I need someone, I can call them directly. Personal contact and flexibility are huge advantages.

Yves Goethals is the Judicial Commissioner of the Belgian Federal Police with over 21 years of experience in the fight against CSAM. The Belgian Federal Police is one of the partners in the AviaTor Project.

05.12.2022 Article by INHOPE
