EU Cracks Down on Online Child Safety – Snapchat, TikTok, Instagram Targeted

EU Intensifies Scrutiny of Social Media Platforms Over Child Safety

Brussels is increasing pressure on major social media companies, including Snapchat, TikTok, and Instagram, to better protect children and teenagers from online risks. The European Commission has launched formal investigations and issued warnings regarding potentially harmful design features and inadequate safeguards against grooming and access to illegal content. This heightened scrutiny reflects growing concerns about the impact of social media on young people’s well-being and safety, and a broader push for greater accountability from tech giants.


The move signals a shift towards more assertive regulation of the digital space, particularly concerning the protection of vulnerable users. European officials are demanding greater transparency and more effective measures to identify and remove harmful content, as well as to prevent predatory behavior. The focus extends beyond simply removing illegal material to addressing the addictive nature of these platforms and their potential to expose children to inappropriate content and interactions. This is not a new concern, but the level of formal action being taken suggests a growing impatience with self-regulation by the tech industry.

Snapchat Under Investigation for Child Safety Concerns

The European Union has formally launched an investigation into Snapchat, focusing on whether the platform adequately protects children from risks such as grooming and exposure to illegal goods. The investigation, announced by the European Commission, will examine Snapchat’s compliance with the Digital Services Act (DSA), a landmark piece of legislation designed to create a safer digital space for users across the EU. The DSA, which became fully applicable in February 2024, imposes strict obligations on very large online platforms (VLOPs) and very large online search engines (VLOSEs) to address systemic risks, including those affecting minors.

According to the Commission, the investigation will specifically assess whether Snapchat has sufficient measures in place to ensure a safe environment for young users. This includes evaluating the platform’s age verification processes, its content moderation policies, and its response to reports of harmful activity. The DSA requires platforms to design their services with the safety of children in mind, and to take proactive steps to prevent them from being exposed to harmful content or predatory behavior. Failure to comply with the DSA can result in substantial fines – up to 6% of a company’s global annual revenue.
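To put that ceiling in perspective, the arithmetic is straightforward. The sketch below is purely illustrative: the 6% rate is the statutory maximum under the DSA, and the revenue figure is an assumed placeholder, not any company’s actual turnover.

```python
# Illustrative only: computes the statutory maximum DSA fine (6% of global
# annual revenue). The revenue figure below is hypothetical, and regulators
# may impose far smaller penalties in practice.

DSA_MAX_FINE_RATE = 0.06  # 6% cap on global annual revenue under the DSA

def max_dsa_fine(global_annual_revenue_eur: float) -> float:
    """Return the maximum fine the DSA allows for the given annual revenue."""
    return global_annual_revenue_eur * DSA_MAX_FINE_RATE

if __name__ == "__main__":
    hypothetical_revenue = 4_000_000_000  # assumed EUR 4 billion, for illustration
    print(f"Maximum DSA fine: EUR {max_dsa_fine(hypothetical_revenue):,.0f}")
```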

TikTok Faces Pressure to Redesign Addictive Features

Alongside the investigation into Snapchat, the EU is pressing TikTok to address concerns about its addictive design and the potential risks it poses to children. European regulators have warned TikTok that its current design features may violate the DSA by exploiting users’ vulnerabilities and encouraging excessive use. The concerns center on features such as the “For You” page, which uses recommendation algorithms to personalize content and keep users engaged for extended periods.


The Commission has specifically raised concerns that TikTok’s algorithms may expose children to content that is inappropriate or harmful, and that the platform’s design may contribute to addictive behaviors. Regulators are urging TikTok to implement changes to its algorithms and design features to mitigate these risks, including providing users with more control over the content they see and limiting the amount of time they can spend on the platform. The EU’s actions follow similar concerns raised by other governments and advocacy groups around the world regarding the potential negative impacts of TikTok on young people’s mental health and well-being.
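One of the changes regulators mention, limiting time spent on the platform, boils down to a simple usage check. The sketch below is a hypothetical illustration only; the 60-minute cap and the bookkeeping around it are assumptions, not a description of TikTok’s actual systems.

```python
# Minimal sketch of a daily screen-time limit of the kind regulators are urging.
# The 60-minute cap is an assumption made for illustration.

from datetime import timedelta

DAILY_LIMIT = timedelta(minutes=60)  # assumed default daily limit for minors

def remaining_time(time_used_today: timedelta) -> timedelta:
    """Return how much viewing time is left before the daily cap is reached."""
    return max(DAILY_LIMIT - time_used_today, timedelta(0))

used = timedelta(minutes=52)
print(f"Minutes remaining today: {remaining_time(used).total_seconds() / 60:.0f}")
```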

Broader Concerns About Social Media and Child Safety

The EU’s actions against Snapchat and TikTok are part of a broader effort to address the growing concerns about the impact of social media on children and teenagers. Regulators are increasingly focused on the need to protect young people from online risks such as cyberbullying, grooming, and exposure to harmful content. The DSA is a key tool in this effort, providing regulators with the authority to impose strict obligations on social media platforms and to hold them accountable for failing to protect their users.


The concerns extend beyond Snapchat and TikTok to other major platforms such as Instagram and Facebook. These platforms also face scrutiny over their content moderation policies, their age verification processes, and their efforts to prevent harmful content from reaching young users. Advocacy groups and child safety organizations are calling for greater transparency from social media companies and for more effective measures to protect children online. They argue that platforms have a responsibility to prioritize the safety and well-being of their young users, and that they should be held accountable for failing to do so.

The Digital Services Act and its Implications

The Digital Services Act (DSA) represents a significant step forward in the regulation of online platforms in the EU. The DSA introduces a tiered system of obligations, with the most stringent requirements applying to very large online platforms (VLOPs) and very large online search engines (VLOSEs), those with 45 million or more average monthly active users in the EU. These platforms are subject to a range of obligations, including conducting risk assessments, implementing content moderation policies, and providing users with greater transparency and control over their data.
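The VLOP designation itself rests on a simple numerical threshold. The following sketch illustrates that test; the platform names and user counts are invented for the example.

```python
# Sketch of the DSA's very-large-platform threshold: services averaging
# 45 million or more monthly active users in the EU fall under the strictest
# tier of obligations. The figures below are hypothetical examples.

VLOP_THRESHOLD_EU_USERS = 45_000_000

def is_vlop(average_monthly_eu_users: int) -> bool:
    """Check whether a platform meets the DSA's very-large-platform threshold."""
    return average_monthly_eu_users >= VLOP_THRESHOLD_EU_USERS

# Hypothetical user counts for illustration only.
platforms = {
    "ExamplePlatformA": 102_000_000,
    "ExamplePlatformB": 12_500_000,
}

for name, users in platforms.items():
    tier = "VLOP obligations apply" if is_vlop(users) else "standard obligations"
    print(f"{name}: {users:,} monthly EU users -> {tier}")
```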

The DSA also introduces new rules for online advertising, including restrictions on targeted advertising to children. Platforms may not show minors ads based on profiling, and are prohibited from using sensitive personal data, such as religious beliefs or sexual orientation, to target ads to any user. The DSA also includes provisions to address the spread of illegal content online, requiring platforms to remove illegal content promptly and to cooperate with law enforcement authorities. Its implementation is expected to have a significant impact on the way social media platforms operate in the EU, and it could serve as a model for other countries seeking to regulate the digital space.
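As a rough illustration of the advertising restrictions described above, the sketch below encodes the two checks as simple rules. The data categories, field names, and request structure are assumptions made for the example, not taken from any platform’s real ad systems.

```python
# Illustrative sketch of the DSA advertising restrictions described above:
# no profiling-based ad targeting of minors, and no targeting built on
# sensitive personal data. Category names and the request type are hypothetical.

from dataclasses import dataclass

SENSITIVE_CATEGORIES = {"religious_beliefs", "sexual_orientation", "health", "political_opinions"}

@dataclass
class AdTargetingRequest:
    user_is_minor: bool
    uses_profiling: bool
    targeting_categories: set[str]

def targeting_allowed(request: AdTargetingRequest) -> bool:
    """Return False if the request violates either advertising restriction."""
    if request.user_is_minor and request.uses_profiling:
        return False  # profiling-based ads may not be shown to minors
    if request.targeting_categories & SENSITIVE_CATEGORIES:
        return False  # sensitive personal data may not be used for ad targeting
    return True

# Example: a profiling-based ad aimed at a minor is rejected.
print(targeting_allowed(AdTargetingRequest(True, True, {"sports"})))   # False
print(targeting_allowed(AdTargetingRequest(False, True, {"sports"})))  # True
```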

What Happens Next?

The European Commission’s investigation into Snapchat is ongoing, and the platform has been given a deadline to respond to the Commission’s concerns. The Commission has the power to impose substantial fines on Snapchat if it finds that the platform has violated the DSA. TikTok is also under pressure to address the Commission’s concerns about its addictive design, and regulators are monitoring the platform’s progress in implementing changes. The EU is expected to continue to closely monitor social media platforms and to take further action if necessary to protect children and teenagers online.


The next key date to watch is the deadline for Snapchat to respond to the Commission’s formal request for information, which is expected in the coming weeks. The Commission will then assess Snapchat’s response and determine whether further action is necessary. Meanwhile, the EU is also implementing the Cyber Resilience Act, which introduces new cybersecurity requirements for products with digital elements. This legislation is expected to further strengthen the EU’s regulatory framework for the digital space and to enhance the protection of users online.

The ongoing efforts to regulate social media platforms and protect children online are likely to continue for the foreseeable future. As technology evolves and new risks emerge, regulators will need to adapt their strategies and remain vigilant in their efforts to ensure a safe and responsible digital environment for all users. The debate over the appropriate level of regulation for social media is likely to continue, with tech companies arguing that excessive regulation could stifle innovation and limit freedom of expression.

What are your thoughts on the EU’s actions? Share your comments below and let us know how you think social media platforms can better protect young users.
