How the EU plans to combat online child sexual abuse
The problem of online child sexual abuse imagery is growing, and during the pandemic online platforms reported a surge in cases, with a rise of up to 25% in some EU countries.
Under new European Union plans, which are expected to be announced in the coming months, online platforms could face new laws that would force them to get tougher when dealing with online child sexual abuse.
In recent years, the EU has promised to step up its efforts against illegal content. This includes child sexual abuse and child pornography, as well as other types of content, like illegal hate speech, incitement to terrorism, and terrorist propaganda.
These new rules would replace the interim legislation currently in place, which makes reporting child abuse material voluntary for social media companies.
Instead, these companies would have a legal obligation to detect illegal material, report it to the authorities, and remove it from their platforms as quickly as possible. Voluntary reporting would no longer be sufficient, and platforms would have a duty to take swift action.
In particular, Meta, which is the parent company of Instagram, Facebook, and WhatsApp, would need to change its practices to keep up with this change in the law, as these platforms currently account for approximately 94% of notifications of online child abuse material.
Currently, like other digital platforms, Meta can decide whether to follow up on reports of illegal material, including child sexual abuse. However, in 2020, a number of companies stopped reporting content to the authorities over fears of breaching online privacy rules.
A report released last year counted nearly 22 million reports of child abuse images online, with many of the web pages hosting the content located in Europe.
As this could be just a small percentage of the actual number of cases, it’s essential that the EU commits to improving prevention, law enforcement, and victim support.