The EU launched a formal investigation into Meta Platforms, Inc. (META) on Thursday, 16 May 2024. The probe aims to determine whether the parent company of social media platforms Facebook and Instagram breached the Digital Services Act (DSA) as it relates to the protection of children.
Meta To Face Child Safety Probe By The European Commission
The investigation will examine, among other things, whether Meta’s algorithms promote “behavioural addictions” in children and create “so-called rabbit-hole effects”. A study available on ResearchGate described the rabbit-hole effect as the use of algorithms that amplify personalised content to keep users engaged. The European Commission also expressed concerns about Meta’s age verification processes.
This investigation follows a September 2023 preliminary evaluation of the child safety risks associated with Meta’s operations. In an email to CNBC, Meta wrote:
We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them. This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission.
Thierry Breton, the EU Commissioner for the Internal Market, said that the European Commission is not “convinced” that Meta took the necessary steps to protect children on its Facebook and Instagram platforms. He added that this would be an in-depth investigation and commented:
We are sparing no effort to protect our children.
CNBC reported that Meta and other US tech companies have come under increased pressure from the EU since the DSA came into force. Should breaches be found, the EU can impose fines of up to 6% of a firm’s global annual turnover.