Executive Summary

What would a healthy social network look and feel like, with recommender systems that show users the content they really want to see, rather than content selected through predatory and addictive design features?

In October 2022, the European Union adopted the Digital Services Act (DSA), introducing transparency and procedural accountability rules for large social media platforms – including giants such as Facebook, Instagram, YouTube and TikTok – for the first time. When it comes to their recommender systems, Very Large Online Platforms (VLOPs) are now required to assess the systemic risks of their products and services (Article 34) and put in place measures that mitigate any negative effects (Article 35). In addition, VLOPs are required to disclose the “main parameters” of their recommender systems (Article 27), provide users with at least one option that is not based on profiling of personal data (Article 38), and refrain from using dark patterns and manipulative design practices to influence user behaviour (Article 25).
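To make Article 38 slightly more concrete: one commonly discussed way to offer an option “not based on profiling” is a reverse-chronological feed limited to accounts a user explicitly follows, since ranking then relies on no inferences about the person. The TypeScript sketch below is purely illustrative – the `Post` type, its fields and the `nonProfiledFeed` function are our own assumptions, not any platform’s actual implementation.

```typescript
// Purely illustrative sketch; hypothetical types, not any platform's real code.
type Post = {
  id: string;
  publishedAt: Date;
  authorIsFollowed: boolean; // follow relationship: an explicit user choice, not an inference
};

// One way to satisfy Article 38: rank only by recency among accounts
// the user explicitly follows, using no profiling-derived signals.
function nonProfiledFeed(posts: Post[]): Post[] {
  return posts
    .filter((p) => p.authorIsFollowed)
    .sort((a, b) => b.publishedAt.getTime() - a.publishedAt.getTime());
}
```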

Many advocates and policy makers are hopeful that the DSA will create the regulatory conditions for a healthier digital public sphere – that is, social media that act as public spaces, sources of quality information and facilitators of meaningful social connection. However, many of the risks and harms linked to recommender system design cannot be mitigated without directly addressing the underlying business model of the dominant social media platforms, which is currently designed to maximise users’ attention in order to generate profit from advertisements and sponsored content. In this respect, changes that would mitigate systemic risks as defined by the DSA are likely to be heavily resisted – and contested – by VLOPs, making independent recommendations all the more urgent and necessary.

It is in this context that a multidisciplinary group of independent researchers, civil society experts, technologists and designers came together in 2023 to explore answers to the question: ‘How can the ambitious principles enshrined in the DSA be operationalised by social media platforms?’ On 25 August 2023, we published the first brief, looking at the relationship between specific design features in recommender systems and specific harms.1 Our hypotheses were accompanied by a list of detailed questions to VLOPs and Very Large Online Search Engines (VLOSEs), which serves as a ‘technical checklist’ for risk assessments as well as for auditing recommender systems.

In this second brief, we explore user experience (UX) and interaction design choices that would give people more meaningful control and choice over the recommender systems that shape the content they see. We propose nine practical UX changes that we believe can facilitate greater user agency, ranging from content feedback features and controls over the signals used to curate feeds to dedicated ‘wellbeing’ features; a sketch of what such controls might look like follows below. We hope this second briefing serves as a starting point for future user research to ground UX changes related to DSA risk mitigation in a better understanding of users’ needs.
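As a loose illustration of the kind of controls we have in mind, the TypeScript sketch below models user-adjustable ranking-signal weights, explicit negative feedback and a simple wellbeing setting. Every name here – `RankingSignal`, `UserFeedControls` and the example signals – is a hypothetical assumption for illustration, not drawn from any platform’s actual API.

```typescript
// Hypothetical sketch: signal names and structure are illustrative only.
type RankingSignal =
  | 'engagementPrediction'
  | 'recency'
  | 'followedAccounts'
  | 'topicInterest';

interface UserFeedControls {
  // Per-signal weights the user can inspect and adjust (0 disables a signal).
  signalWeights: Record<RankingSignal, number>;
  // Explicit negative feedback: topics the user asked to stop seeing.
  mutedTopics: string[];
  // A 'wellbeing' feature, e.g. a break reminder after N minutes.
  sessionReminderMinutes?: number;
}

const defaults: UserFeedControls = {
  signalWeights: {
    engagementPrediction: 1.0,
    recency: 0.5,
    followedAccounts: 1.0,
    topicInterest: 0.7,
  },
  mutedTopics: [],
};

// A user opting out of engagement-optimised ranking in favour of recency,
// muting a topic, and setting a break reminder:
const userChoice: UserFeedControls = {
  ...defaults,
  signalWeights: {
    ...defaults.signalWeights,
    engagementPrediction: 0,
    recency: 1.0,
  },
  mutedTopics: ['diet culture'],
  sessionReminderMinutes: 30,
};
```

The point of the sketch is that such choices become inspectable, persistent settings rather than buried, one-off interactions – exactly the kind of design change user testing should evaluate.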

This briefing concludes with recommendations for VLOPs and the European Commission.

With regard to VLOPs, we would like to see these and other design provocations user-tested, experimented with and iterated upon. This should happen in a transparent manner, so that conflicting design goals are navigated in line with the DSA. Risk assessment and risk mitigation are not a one-time exercise but an ongoing process, which should engage civil society, the ethical design community and a diverse representation of users as consulted stakeholders.

The European Commission should use all of its powers under the DSA, including the power to issue delegated acts and guidelines (e.g., in accordance with Article 35), to ensure that VLOPs:

  • Implement the best UX practices in their recommender systems
  • Modify their interfaces and content ranking algorithms in order to mitigate systemic risks
  • Make transparency disclosures and engage stakeholders in the ways we describe above.

Read the full briefing here.

