The U.K. Is Investigating How Reddit, TikTok Handle Children’s Data

Tiff Staff

Online privacy and child protection remain at the center of regulatory concerns as social media platforms continue handling massive amounts of user data. The U.K.’s Information Commissioner’s Office (ICO) has launched an investigation into TikTok, Reddit, and Imgur, focusing on how these platforms collect, store, and use children’s data. With minors spending more time online, questions around privacy, security, and algorithmic influence have intensified. The investigation signals a more assertive stance on enforcing data protection laws, particularly for platforms catering to younger audiences.

What’s Happening & Why This Matters

The ICO is examining whether these platforms properly enforce age verification and whether they protect the data of users aged 13 to 17. TikTok’s algorithmic recommendation system, which tailors content based on user behavior, has raised concerns about data privacy and exposure to harmful material. Regulators question whether these companies comply with U.K. data protection laws and the Children’s Code, which mandates stricter safeguards for minors using digital services.

The investigation also scrutinizes Reddit and Imgur’s age verification methods. Authorities are evaluating whether these platforms effectively distinguish between minors and adults and whether their measures prevent young users from being exposed to inappropriate content. The ICO’s goal is to determine whether these services uphold the highest standards of child safety in a digital space where oversight has historically struggled to keep pace.

Legal Framework & Prior Actions

The U.K. enforces strict online privacy regulations, including the U.K. General Data Protection Regulation (GDPR) and the Children’s Code (formally the Age Appropriate Design Code), designed to prevent data misuse and ensure child safety. TikTok, which has faced previous fines for mishandling children’s data, is again under scrutiny.

The ICO opened investigations into online platforms TikTok, Reddit, and Imgur to assess the steps they take to protect children aged 13 to 17. (CREDIT: THN)

In April 2023, the ICO fined TikTok £12.7 million for unlawfully processing data from 1.4 million underage users. In September 2023, the Irish Data Protection Commission issued a €345 million penalty against TikTok for failing to protect children’s privacy settings. These actions set a precedent, showing that regulators are prepared to hold tech companies accountable for failing to protect minors.

Growing Concerns About Social Media Data Practices

Authorities worldwide have raised concerns about TikTok’s data collection, particularly its links to ByteDance, the platform’s China-based parent company. U.S. lawmakers have threatened bans over national security fears, while European regulators focus on privacy risks and child protection policies. The current investigation suggests that scrutiny of TikTok’s data practices is far from over.

U.K. Technology Secretary Peter Kyle has called for greater oversight, warning that social media platforms must prioritize user safety over engagement-driven algorithms. The ICO’s investigation also extends to Reddit and Imgur, signaling that regulators are taking a broader look at the business models behind these platforms.

The ICO has clarified that these investigations are not about singling out any one company but ensuring that social media platforms uphold data privacy standards for minors. The findings could lead to stricter regulatory compliance for digital platforms, requiring them to implement enhanced age verification processes and more stringent data collection policies.

TF Summary: What’s Next

If the ICO finds violations, these platforms could face hefty fines and stricter enforcement actions. The results may also set new standards for data protection laws in the U.K. and beyond, influencing how governments regulate social media companies worldwide. With an increasing focus on child safety, this investigation could push platforms to rethink their privacy policies, recommendation algorithms, and data retention practices.

Regulators are clearly saying that protecting young users online is a top priority. As scrutiny intensifies, social media giants may need to rethink their approaches to user data, content moderation, and compliance to avoid financial penalties and reputational damage.

