Europe is ramping up efforts to combat the rise of online child sexual abuse material (CSAM) hosted in its member states. As demand for stricter regulations grows, the European Union (EU) is moving aggressively to protect children from online dangers. With the EU’s Digital Services Act (DSA) in place and growing concern over AI-generated abuse content, the region is advancing legislative changes that hold tech platforms accountable and introduce harsher penalties for those who host harmful material.
What’s Happening & Why This Matters
In 2024, the Internet Watch Foundation (IWF) reported a rise in CSAM hosted in European countries, with 62% of all CSAM webpages traced to EU member states. The Netherlands was the most common hosting location, with more than 83,000 URLs linked to the country. Other European nations, including Bulgaria, Romania, Lithuania, and Poland, also saw increases in the amount of such content hosted within their borders.
This surge in harmful content has prompted widespread concern. According to Derek Ray-Hill, Interim CEO of the IWF, the situation demands immediate attention. He called for swift EU legislation and urged member states to unite to address the spread of CSAM. The IWF’s report shows that the majority of victims are children between the ages of 7 and 10, with girls depicted almost four times more frequently than boys.

Another disturbing trend is emerging: nearly 39% of the content is now generated using artificial intelligence (AI), and some of it is described as extreme, further compounding the crisis. As Lori Cohen, CEO of Protect All Children from Trafficking (PACT), points out, this type of abuse is driven mainly by demand from consumers in the U.S., U.K., Australia, Canada, and Europe. She stressed that tech platforms must be held accountable for effectively monitoring and removing CSAM.

Meanwhile, the EU is preparing to update its existing legislation to tackle online child sexual abuse more effectively. The current framework, established in 2011, is outdated and does not address the rise of AI-generated content. A new proposal, introduced in February 2024, seeks to expand the scope of offenses and impose stricter penalties. However, the proposal has raised concerns over the balance between child protection and digital rights, particularly regarding privacy.
As the European Parliament and Council continue their discussions, Ray-Hill is calling for swift passage of the Child Sexual Abuse Regulation and the Child Sexual Abuse Directive, which would criminalize the production and dissemination of AI-generated CSAM. The urgency is apparent: European digital spaces cannot afford to become havens for online abuse perpetrators.
TF Summary: What’s Next
As Europe tackles child abuse material hosted within its borders, the push for stronger legislation becomes more critical. The upcoming updates to the DSA and the proposed Child Sexual Abuse Regulation could meaningfully strengthen child protection online. Platforms must also enhance CSAM prevention efforts through tools like image hashing and AI-based detection, as sketched below.
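To illustrate what "image hashing" means in practice, here is a minimal sketch in Python of hash-list matching. The file names and blocklist format are hypothetical assumptions for illustration only; real deployments typically use perceptual hashing (such as Microsoft's PhotoDNA) and hash lists supplied by hotlines like the IWF rather than exact cryptographic matches, since exact hashes miss re-encoded or cropped copies.

```python
import hashlib
from pathlib import Path


def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def load_blocklist(path: Path) -> set[str]:
    """Load known-bad hashes (one hex digest per line) into a set."""
    return {line.strip().lower() for line in path.read_text().splitlines() if line.strip()}


def is_flagged(upload: Path, blocklist: set[str]) -> bool:
    """Return True if the uploaded file's hash matches a known-bad hash."""
    return sha256_of_file(upload) in blocklist


if __name__ == "__main__":
    # Hypothetical paths: in practice the hash list would come from a
    # hotline or clearinghouse, not a local text file.
    blocklist = load_blocklist(Path("known_csam_hashes.txt"))
    if is_flagged(Path("incoming_upload.jpg"), blocklist):
        print("Match found: block the upload and escalate for human review.")
    else:
        print("No match: continue normal moderation checks.")
```

Exact matching like this is cheap to run at upload time, which is why it is often the first filter, but perceptual hashing and AI classifiers are layered on top to catch modified or previously unseen material.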
TF Europe predicts vigorous discussions in the coming months on balancing privacy and child safety. The EU plays a pivotal role in establishing global standards for online protection. Collaboration in creating a safer online environment for children is paramount for all parties: children, parents, platforms, and lawmakers.