Briefing: protecting children and young people from addictive design
Research has shown the deep harm excessive social media use can do to young brains and bodies. The EU Commission must tackle the root cause.
Social media companies design their platforms to encourage users to spend as much time on them as possible. Addictive design impacts everyone, but children and young people are especially susceptible. Research shows that, given their stage of neural development, young users are particularly prone both to excessive use of social media and to its harmful effects, and young users with pre-existing psychosocial vulnerabilities are at even greater risk.
What is addictive design?
Social media platforms’ business model relies on keeping users online for as long as possible so that they can display more advertising. The platforms are optimised to trigger the release of dopamine, a neurotransmitter the brain produces when it expects a reward, making users crave more and use more.
Young users are far from exempt: leaked documents reveal that Meta has invested significant resources in studying the neurological vulnerabilities of young users, and has even created an internal presentation on how to exploit them.
While more research is needed, the following addictive features have been identified:
- Notifications such as “likes”: both the novelty and the validation of another user’s engagement trigger a dopamine release, reinforcing the desire to post and interact and creating a “social validation feedback loop”.
- Hyper-personalised content algorithms or “recommender systems”: brain scans of students showed that watching a personalised selection of videos triggered stronger activity in addiction-related areas of the brain compared to non-personalised videos.
- Intermittent reinforcement: users receive content they find less interesting, punctuated by unpredictable dopamine hits from likes or a video they really like. This keeps the user scrolling in anticipation of the next dopamine reward. This randomisation of rewards has been compared to “fruit machines” in gambling.
- Autoplay and infinite scroll: automatically showing the next piece of content creates a continuous, endless feed and makes it difficult to find a natural stopping point.
Why is addictive design so harmful?
Excessive screen time and social media use have been shown to cause:
- Neurological harm:
- Reduction in grey matter in the brain, reported in several studies, similar to the effects seen in other addictions.
- Reduced attention span and impulse control are linked to the rapid consumption of content on social media, particularly short-form videos, especially in younger users.
- Possible impairment of prefrontal cortex development, which is responsible for decision-making and impulse control, due to early exposure to social media's fast-paced content. N.B. the prefrontal cortex does not fully develop until around age 25.
- Possible development of ADHD-like symptoms, which early studies link to excessive screen time.
- Temporary decline in task performance identified in children after watching fast-paced videos.
- Psychological harm:
- In November 2023, Amnesty International found that within an hour of launching a dummy TikTok account posing as a 13-year-old child who interacted with mental health content, the platform had recommended multiple videos romanticising, normalising or encouraging suicide. This illustrates the risks both of prolonged screen time and of the hyper-personalisation of content recommender systems.
- Increased anxiety, depression, and feelings of isolation have been linked to prolonged online engagement, as social media can negatively affect self-esteem, body image and overall psychological well-being.
- Risk exposure: The longer children and young people spend online, the more they are exposed to risks such as cyberbullying, abuse, scams, and age-inappropriate content.
- Physical harm:
- “93% of Gen Z have lost sleep because they stayed up to view or participate in social media,” according to the American Academy of Sleep Medicine.
- Reduced sleep and activity: Social media usage can lead to sleep loss and decreased physical activity, which affect weight, school performance and mental health, and distract from real-life experiences.
Gone is the time when the streets were considered the most dangerous place for a child to be - now, for many young people the most dangerous place they can be is alone in their room with their phone.
What’s the solution?
Given the severity of the risks to children online, we need binding rules for platforms. Unfortunately, the very large online platforms (VLOPs) have repeatedly demonstrated that they choose profit over the safety of children, young people and society in general.
The adjustments that some have made have been minor. For example, TikTok no longer allows push notifications after 9 pm for users aged 13 to 15, but those users are still exposed to push notifications (which are linked to addictive behaviour) for most of the day. In March 2023, TikTok introduced a new screen-time management tool that requires under-18s to actively extend their time on the app once they have reached a 60-minute daily limit. However, this measure puts the burden of setting limits on children, who in large numbers describe themselves as “addicted” to TikTok. The prompt can also be easily dismissed and does not include a health warning. Adding to the limitations of the measure, the change only applies to users whom the system identifies as children, and the effectiveness of TikTok’s age verification has been called into question: the UK’s media regulator Ofcom has found that 16% of British three- and four-year-olds have access to TikTok.
Meta’s leaked internal documents reveal that the corporation knowingly retains millions of users under 13 years old and has chosen not to remove them. Notably, Harvard University research published in 2023 estimated that in the US alone, social media platforms, Instagram among them, collectively made nearly $11 billion in advertising revenue from minors in 2022.
Risk of overreliance on age verification
While we welcome norms on an appropriate age to access social media platforms, relying on age-gating and age verification alone will, unfortunately, not adequately protect minors online. Even the most robust age verification can be circumvented.
Age-gating and age verification still assume that parents or guardians have the availability, capacity and interest to monitor internet usage. Frequent monitoring is unrealistic for most families, and this approach in particular risks disadvantaging young people who face additional challenges, such as those living in care, or whose parents work long hours or face language barriers in their country of residence.
To truly protect children and young users, we need safe defaults for all. Please see our whitepaper prepared in collaboration with Panoptykon and other researchers and technologists: Safe by Default: Moving away from engagement-based rankings towards safe, rights-respecting, and human centric recommender systems.
Aside from this, age verification can present its own risks to privacy, security and free speech, as well as costs and inconvenience for businesses.
Establishing binding rules
Fortunately, there has been momentum to tackle addictive design in the EU. In December 2023, the European Parliament adopted by an overwhelming majority a resolution urging the Commission to address addictive design. In its conclusions on the Future of Digital Policy, the Council stressed the need for measures to address issues related to addictive design. In July 2024, Commission President von der Leyen listed this as a priority for the 2024-2029 mandate. The Commission’s recent Digital Fairness Fitness Check also outlined the importance of addressing addictive design.
The Commission must:
- assess and prohibit the most harmful addictive techniques not already covered by existing regulation, with a focus on provisions concerning children and special consideration of their specific rights and vulnerabilities;
- examine whether an obligation not to use profiling/interaction-based content recommender systems ‘by default’ is required in order to protect users from hyper-personalised content algorithms;
- put forward a ‘right not to be disturbed’ to empower consumers by turning off all attention-seeking features;
- ensure strong enforcement of the Digital Services Act on the protection of minors, prioritising:
- clarifying the additional risk assessment and mitigation obligations of very large online platforms (VLOPs) in relation to potential harms to health caused by the addictive design of their platforms;
- independently assessing the addictive and mental-health effects of hyper-personalised recommender systems;
- naming features in recommender systems that contribute to systemic risks.