The Commission is proposing new EU legislation to prevent and combat child sexual abuse online. With 85 million pictures and videos depicting child sexual abuse reported worldwide in 2021 alone, and many more going unreported, child sexual abuse is pervasive. The COVID-19 pandemic has exacerbated the issue, with the Internet Watch Foundation noting a 64% increase in reports of confirmed child sexual abuse in 2021 compared to the previous year. The current system based on voluntary detection and reporting by companies has proven to be insufficient to adequately protect children and, in any case, will no longer be possible once the interim solution currently in place expires. Up to 95% of all reports of child sexual abuse received in 2020 came from one company, despite clear evidence that the problem does not only exist on one platform.
To effectively address the misuse of online services for the purposes of child sexual abuse, clear rules are needed, with robust conditions and safeguards. The proposed rules will oblige providers to detect, report and remove child sexual abuse material on their services. Providers will need to assess and mitigate the risk of misuse of their services and the measures taken must be proportionate to that risk and subject to robust conditions and safeguards.
A new independent EU Center on Child Sexual Abuse (EU Center) will facilitate the efforts of service providers by acting as a hub of expertise, providing reliable information on identified material, receiving and analyzing reports from providers to identify erroneous reports and prevent them from reaching law enforcement, swiftly forwarding relevant reports for law enforcement action, and providing support to victims.
The new rules will help rescue children from further abuse, prevent material from reappearing online, and bring offenders to justice. Those rules will include:
- Mandatory risk assessment and risk mitigation measures: Providers of hosting or interpersonal communication services will have to assess the risk that their services are misused to disseminate child sexual abuse material or for the solicitation of children, known as grooming. Providers will also have to propose risk mitigation measures.
- Targeted detection obligations, based on a detection order: Member States will need to designate national authorities in charge of reviewing the risk assessment. Where such authorities determine that a significant risk remains, they can ask a court or an independent national authority to issue a detection order for known or new child sexual abuse material or grooming. Detection orders are limited in time, targeting a specific type of content on a specific service.
- Strong safeguards on detection: Companies having received a detection order will only be able to detect content using indicators of child sexual abuse verified and provided by the EU Center. Detection technologies must only be used for the purpose of detecting child sexual abuse. Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate of false positives to the maximum extent possible.
- Clear reporting obligations: Providers that have detected online child sexual abuse will have to report it to the EU Center.
- Effective removal: National authorities can issue removal orders if the child sexual abuse material is not swiftly taken down. Internet access providers will also be required to disable access to images and videos that cannot be taken down, e.g. because they are hosted outside the EU in non-cooperative jurisdictions.
- Reducing exposure to grooming: The rules require app stores to ensure that children cannot download apps that may expose them to a high risk of solicitation.
- Solid oversight mechanisms and judicial redress: Detection orders will be issued by courts or independent national authorities. To minimize the risk of erroneous detection and reporting, the EU Center will verify reports of potential online child sexual abuse made by providers before sharing them with law enforcement authorities and Europol. Both providers and users will have the right to challenge any measure affecting them in court.
The new EU Center will support:
- Online service providers, in particular in complying with their new obligations to carry out risk assessments, detect, report, remove and disable access to child sexual abuse online, by providing indicators to detect child sexual abuse and receiving the reports from the providers.
- National law enforcement and Europol, by reviewing the reports from the providers to ensure that they are not submitted in error, and channelling them quickly to law enforcement. This will help rescue children from situations of abuse and bring perpetrators to justice.
- Member States, by serving as a knowledge hub for best practices on prevention and assistance to victims, fostering an evidence-based approach.
- Victims, by helping them to take down the materials depicting their abuse.
Together with today’s proposal, the Commission is also putting forward a European strategy for a better internet for kids.
It is now for the European Parliament and the Council to agree on the proposal. Once adopted, the new Regulation will replace the current interim Regulation. Feedback from members of the public on the proposals is open for a minimum of 8 weeks.
Members of the College said:
Vice-President for Democracy and Demography, Dubravka Šuica, said: “Upholding and protecting children’s rights online as well as offline is essential to the well-being of our societies. Online child sexual abuse material is a product of the manifested physical sexual abuse of children. It is highly criminal. Online child sexual abuse has wide-ranging, long-term consequences for children and leaves a deep trauma. Some may, and do, never recover. Child sexual abuse is preventable if we work together to protect children. We do not allow child sexual abuse offline, so we must not allow it online.”
Vice-President for promoting our European Way of Life, Margaritis Schinas, said: “The sheer amount of child sexual abuse material circulating on the web is dumbfounding. And shamefully, Europe is the global hub for most of this material. So it is really very much a question of if we do not act, then who will? The rules we are proposing set clear, targeted and proportionate obligations for service providers to detect and remove illegal child sexual abuse content. What services will be allowed to do will be very tightly ringfenced with strong safeguards in place – we are only talking about a program scanning for markers of illegal content in the same way cybersecurity programs run constant checks for security breaches.”
Commissioner for Home Affairs, Ylva Johansson, said: “As adults, it is our duty to protect children. Child sexual abuse is a real and growing danger: not only is the number of reports growing, but these reports today concern younger children. These reports are instrumental to starting investigations and rescuing children from ongoing abuse in real time. For example, a Europol-supported investigation based on a report from an online service provider led to saving 146 children worldwide with over 100 suspects identified across the EU. Detection, reporting and removal of child sexual abuse online is also urgently needed to prevent the sharing of images and videos of the sexual abuse of children, which retraumatizes the victims often years after the sexual abuse has ended. Today’s proposal sets clear obligations for companies to detect and report the abuse of children, with strong safeguards guaranteeing privacy of all, including children.”
The fight against child sexual abuse is a priority for the Commission. Nowadays, photos and videos of children being sexually abused are shared online on a massive scale. In 2021, there were 29 million reports submitted to the US National Center for Missing and Exploited Children.
In the absence of harmonized rules at EU level, social media platforms, gaming services, other hosting and online service providers face divergent rules. Certain providers voluntarily use technology to detect, report and remove child sexual abuse material on their services. Measures taken, however, vary widely and voluntary action has proven insufficient to address the issue. This proposal builds on the Digital Services Act and complements it with provisions to address the specific challenges posed by child sexual abuse online.
Today’s proposal follows from the July 2020 EU strategy for a More Effective Fight Against Child Sexual Abuse, which set out a comprehensive response to the growing threat of child sexual abuse both offline and online, by improving prevention, investigation and assistance to victims. It also comes after the Commission presented its March EU Strategy on the Rights of the Child, which proposed reinforced measures to protect children against all forms of violence, including abuse online.
Source: European Commission, press release and platform.