Dear Members of the European Parliament,

We, experts, academics and civil society groups, are writing to express our profound alarm at the social media-driven mental health crisis harming our children and young people. We urge you to take immediate action to rein in the abusive Big Tech business model at the core of this crisis, and to protect all people, consumers and children alike. As an immediate first step, this means voting for the Internal Market and Consumer Protection (IMCO) Committee’s report on addictive design of online services and consumer protection in the EU, in its entirety.

We consider social media’s predatory, addictive business model a public health and democratic threat that should top the agenda of legislators globally. Earlier this year, the US Surgeon General issued a clear warning about the impact of addictive social media design: “Excessive and problematic social media use, such as compulsive or uncontrollable use, has been linked to sleep problems, attention problems, and feelings of exclusion among adolescents… Small studies have shown that people with frequent and problematic social media use can experience changes in brain structure similar to changes seen in individuals with substance use or gambling addictions”.

This is no glitch in the system; addiction is precisely the outcome tech platforms like Instagram, TikTok and YouTube are designed and calibrated for. The platforms make more money the longer people are kept online and scrolling, and their products are therefore built around ‘engagement at all costs’ – leading to potentially devastating outcomes while social media corporations profit. One recent study by Panoptykon Foundation showed that Facebook's recommender system not only exploits users' fears and vulnerabilities to maintain their engagement but also ignores users' explicit feedback, even when they request to stop seeing certain content.

The negative consequences of this business model are particularly acute among those we should be protecting most closely: children and young people, whose developing minds are most vulnerable to social media addiction and the ‘rabbit hole’ effect unleashed by hyper-personalised recommender systems. In October 2023, dozens of US states filed a lawsuit on behalf of children and young people accusing Meta of knowingly and deliberately designing features on Instagram and Facebook that addict children to its platforms, leading to “depression, anxiety, insomnia, interference with education and daily life, and many other negative outcomes”.

Mounting research has revealed the pernicious ways in which social media platforms capitalise on the specific vulnerabilities of the youngest in society. In November 2023, for example, an investigation by Amnesty International found that within 20 minutes of launching a dummy TikTok account posing as a 13-year-old child who interacted with mental health content, more than half of the videos in the ‘For You’ feed were related to mental health struggles. Within an hour, multiple videos romanticising, normalising or encouraging suicide had been recommended.

The real-world ramifications of this predatory targeting can be devastating. In 2017, British 14-year-old Molly Russell took her own life after being bombarded with 2,100 posts discussing and glorifying self-harm and suicide on Instagram and Pinterest over a 6-month period. A coroner’s report found that this material likely “contributed to her death in a more than minimal way”. The words of Molly’s father, Ian Russell, must serve as an urgent message to us all: “It’s time to protect our innocent young people, instead of allowing platforms to prioritise their profits by monetising their misery.”

Across Europe, children and young people, parents, teachers and doctors are facing the devastating consequences of this mental health crisis. But change will not come about from individual action. We urgently need lawmakers and regulators to stand up against a social media business model that is wreaking havoc on the lives of young people. We strongly endorse and echo the IMCO Committee Report’s calls on the European Commission to:

1. ensure strong enforcement of the Digital Services Act (DSA) on this issue, with a focus on its provisions concerning children and special consideration of their specific rights and vulnerabilities. This should include, as a matter of priority:

  • independently assessing the addictive and mental-health effects of hyper-personalised recommender systems;
  • clarifying the additional risk assessment and mitigation obligations of very large online platforms (VLOPs) in relation to potential harms to health caused by the addictive design of their platforms;
  • naming features in recommender systems that contribute to systemic risks;
  • naming design features that are not addictive or manipulative and that enable users to take conscious and informed actions online (see, for example, the People vs Big Tech and Panoptykon report Prototyping user empowerment: Towards DSA-compliant recommender systems).

2. assess and prohibit harmful addictive techniques that are not covered by existing legislation, giving special consideration to vulnerable groups such as children. This should include:

  • assessing and prohibiting the most harmful addictive practices;
  • examining whether an obligation not to use interaction-based recommendation systems ‘by default’ is required in order to protect consumers;
  • putting forward a ‘right not to be disturbed’ to empower consumers by turning all attention-seeking features off by design.

Signed by the following experts and academics,

Dr Bernadka Dubicka BSc MBBS MD FRCPsych, Professor of Child and Adolescent Psychiatry, Hull and York Medical School, University of York

Dr Elvira Perez Vallejos, Professor of Mental Health and Digital Technology, Director RRI, UKRI Trustworthy Autonomous Systems (TAS) Hub, EDI & RRI Lead, Responsible AI UK, Youth Lead, Digital Youth, University of Nottingham

Ian Russell, Chair of Trustees, Molly Rose Foundation

Kyle Taylor, Visiting Digital World and Human Rights Fellow, Tokyo Peace Centre

Dr Marina Jirotka, Professor of Human Centred Computing, Department of Computer Science, University of Oxford

Michael Stora, Psychologist and Psychoanalyst, Founder and Director of Observatoire des Mondes Numériques en Sciences Humaines

Dr Nicole Gross, Associate Professor in Business & Society, School of Business, National College of Ireland

Dr S. Bryn Austin, ScD, Professor, Harvard T.H. Chan School of Public Health, and Director, Strategic Training Initiative for the Prevention of Eating Disorders

Dr Trudi Seneviratne OBE, Consultant Adult & Perinatal Psychiatrist, Registrar, The Royal College of Psychiatrists

Signed by the following civil society organisations,

AI Forensics

Amnesty International

ARTICLE 19

Avaaz Foundation

Civil Liberties Union for Europe (Liberties)

Federación de Consumidores y Usuarios CECU

Defend Democracy

Digital Action

D64 - Center for Digital Progress (Zentrum für Digitalen Fortschritt)

Ekō

Fair Vote UK

Global Action Plan

Global Witness

Health Action International

Institute for Strategic Dialogue (ISD)

Irish Council for Civil Liberties

Mental Health Europe

Panoptykon Foundation

Superbloom (previously known as Simply Secure)

5Rights Foundation

#JeSuisLà