The EU Officially Wants to Know if TikTok is Bad for Kids

Regulators have opened “formal proceedings” to examine how well the popular platform is protecting children.

Photo by Solen Feyissa via Unsplash


The EU has a new TikTok challenge. 

The European Commission said Monday it’s opened “formal proceedings” against TikTok. While that sounds like a black-tie event, it actually means the EU thinks TikTok might have broken some of the relatively new rules of the EU’s Digital Services Act (DSA) — specifically, ones that say Big Tech platforms need to be especially careful about protecting minors. This is the latest in a slew of legal actions from regulators that aim to punish platforms for not protecting children — and it’s potentially the most threatening.

The Children Are Our Future

TikTok is only the second company to get bonked with the brand-new DSA regulatory bat, following a formal probe in December into whether Twitter/X had broken laws around disinformation and “deceptive design.” However, this isn’t the first time TikTok has taken some regulatory heat over how well it protects underage users on its platform. Thus far, successful(ish) enforcement has centered around children’s privacy.

In addition to privacy issues, the new DSA investigation will focus on a range of areas where the Commission believes TikTok may have broken the law:

  • The inquiry will examine whether TikTok did enough to mitigate the risk of its algorithm inducing “behavioral addictions” and “rabbit hole effects,” i.e., whether TikTok has been auto-cueing videos for kids that are likely to result in foreseeable negative effects on their well-being, including radicalization.
  • The investigation will also look at transparency: whether TikTok has been assiduously keeping records of ads shown on the platform, and whether it’s been knowingly withholding data from researchers.

A TikTok spokesperson gave The Daily Upside the company’s official response: “TikTok has pioneered features and settings to protect teens and keep under 13s off the platform, issues the whole industry is grappling with. We’ll continue to work with experts and industry to keep young people on TikTok safe, and look forward to now having the opportunity to explain this work in detail to the Commission.” 

Addiction Diction: Social media companies have recently faced lawsuits from regulators over children’s welfare, many from US states. Just on Friday, New York City filed a lawsuit against a collection of Big Tech platforms, including TikTok, accusing them of harming children’s mental health by designing their products to “attract, capture, and addict youth.” So far, US lawmakers’ attempts to prove that tech companies are creating addictive platforms have fallen flat. The EU, however, isn’t trying to prove that TikTok is inherently addictive; rather, it’s arguing the company hasn’t done enough to stop its algorithms from driving vulnerable children toward habit-forming behaviors. It’s also resting on a fresh legislative framework designed to deal with modern tech companies — for now, anyway.