The EU’s Golden Opportunity to Turn Off Big Tech’s Manipulation Machine
The Digital Services Act is a chance to address algorithmic harms exposed by Facebook whistleblower Frances Haugen
The extent of the algorithmic harms revealed by Facebook whistleblower Frances Haugen has shocked the world. Beyond sounding the alarm in her evidence to lawmakers, Haugen and her legal team have filed at least eight formal complaints with the US Securities and Exchange Commission alleging that Facebook broke the law through a series of misstatements, omissions, and lies. Together, these complaints demonstrate that Facebook has consistently misled both the public and regulators about the extent to which its platform directly harms society. Yet while legislators in the US and other countries scramble to prepare oversight options to rein in one of Big Tech’s most powerful and toxic companies, it is the European Union that is in pole position to enact rules to mitigate these harms.
Years in the making, the pending Digital Services Act (DSA) is designed to introduce a new set of regulations that would require giant platforms like Facebook to, amongst other things, retool their advertising and recommendation algorithms to account for their impact on society. As Haugen’s testimony and disclosures make plain, Big Tech’s unchecked business of prioritising profits over people has come at an immense real-world cost. Haugen will discuss her explosive revelations with the European Parliament committee responsible for framing the new rules for the massive EU market. The power of democracy to make laws, of whistleblowers to expose the truth, and of citizens to demand change is truly crystallising in the EU.
Haugen’s Disclosures Show Facebook Knows Its Algorithms Are Harmful
Haugen’s well-documented allegations cite internal Facebook documents that confirm what many women -- particularly women of colour -- and members of marginalised communities have long reported: hate speech and online violence are not simply tolerated by Facebook, they are amplified and promoted by the platform’s own algorithms.
(CBS News: “Whistleblower’s SEC Complaint: Facebook Knew Platform Was Used to ‘Promote Human Trafficking and Domestic Servitude’”)
Despite being aware of the company’s “significant” role in enabling hate speech and other toxic content to “flourish” on its platform, Facebook has done little to meaningfully address the problem. Internal documents cited by Haugen reveal that the company takes action against as little as 3% to 5% of hate speech and less than 1% of Violence and Incitement (V&I) content on Facebook.
These troublingly low figures are a direct result of the company’s unwillingness to implement impactful interventions that would lower the prevalence of such content but might also reduce or slow the platform’s astronomical growth. Instead of pursuing solutions that reduce the virality of harmful content through algorithmic modifications and product design changes, Facebook employs an under-resourced content-level enforcement strategy. This approach, which relies on flagging and human review (as well as largely ineffective AI), is inherently slow -- a piecemeal method that lets problematic content go viral and spread its dangerous influence long before its root source is ever reviewed or taken down. Such an approach may make sense for Facebook from a bottom-line perspective -- after all, the more inflammatory the content, the more interactions and ad revenue to be reaped -- but it devastates people’s lives as hate speech, homophobia, sexism, and racism are left to flourish. With no financial incentive to act otherwise, change will only come when the tech giant is required to alter its ways. A DSA that addresses only illegal content will fail to reach the beating heart of what makes the platform so toxic.
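To make the distinction concrete, here is a minimal Python sketch of the two intervention styles contrasted above: content-level enforcement versus a design-level change that caps virality. Every name, weight, and threshold is invented for illustration and does not reflect Facebook’s actual systems.

```python
# Illustrative sketch only -- hypothetical weights and fields, not real code.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    reshares: int
    reshare_depth: int  # hops from the original post in a reshare chain
    flagged: bool = False

def engagement_score(post: Post) -> float:
    """Pure engagement ranking: the more reactions a post provokes,
    the higher it surfaces -- inflammatory content wins by design."""
    return post.likes + 5 * post.comments + 30 * post.reshares

def content_level_enforcement(feed: list[Post]) -> list[Post]:
    """Flag-and-review moderation: drops only posts already flagged.
    Everything else still ranks on raw engagement, so unreviewed toxic
    posts keep spreading while the review queue catches up."""
    return sorted((p for p in feed if not p.flagged),
                  key=engagement_score, reverse=True)

def design_level_intervention(feed: list[Post], max_depth: int = 2) -> list[Post]:
    """Design change: cap how deep a reshare chain can travel, throttling
    virality for every post -- no per-item review required."""
    eligible = [p for p in feed if p.reshare_depth <= max_depth]
    return sorted(eligible, key=engagement_score, reverse=True)
```

The point of the sketch is structural: the first function must examine posts one by one after the fact, while the second changes the mechanics of spread for all posts at once, which is precisely the kind of intervention the disclosures say Facebook declined to make.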
How Two Key Tools in the DSA Toolbox Would Address These Algorithmic Harms
In light of Facebook’s well-documented refusal to protect the public good, EU lawmakers must take action by passing the strongest possible version of the DSA. If designed properly, this landmark legislation will equip European regulators to clean up Facebook’s toxic algorithms and ensure meaningful transparency and accountability of the platform. Two tools in particular are needed to address the issues exposed by Haugen and other whistleblowers: (1) Risk Assessment Obligations, and (2) Recommender System Transparency and Controls.
Risk Assessment Obligations: As Haugen’s disclosures demonstrate, instead of slowing the spread of toxic content, Facebook’s design features in fact facilitate and indirectly encourage hate speech, violence, and disinformation. Given this, Facebook and other large platforms must be obligated to identify, prevent, and mitigate the risk of these types of content being distributed and amplified by their products. Under Article 26 of the DSA, large platforms would be required to take into account the ways in which their design choices and operational approaches influence and increase these risks. Under Article 27, they would also be required to take action to prevent and mitigate the risks identified. Together, these two articles would serve as a counterweight to Facebook’s current refusal to implement impactful interventions that would lower the prevalence of harmful content but might also reduce or slow the platform’s growth. Amending Article 27 to require platforms to provide written justification to independent auditors whenever they fail to put risk-mitigating measures in place would further strengthen the regulatory framework and increase the incidence of responsible design choices.
Recommender System Transparency and Controls: Whistleblower disclosures such as Haugen’s also reveal how little public insight exists into the inner workings of Facebook’s core mechanics. As a result, countless users remain unaware of why they are being shown certain posts and how their experiences on the platform differ from those of other users due to financially motivated -- and societally detrimental -- filter bubbles. In response, Article 29 of the DSA would require platforms to provide users with clear information about the main parameters used in their recommender systems. At present, these invisible systems wield a dangerously high degree of power over everyone who logs on to Facebook -- dictating the content and order of the posts each user sees, which invariably informs and shapes their worldview. As an internal Facebook study disclosed by Haugen reported, test accounts that followed “verified/high quality conservative pages” such as Fox News and Donald Trump “began to include conspiracy recommendations after only 2 days.”
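For readers unfamiliar with what “main parameters” means in practice, the following hedged sketch shows one way such a disclosure could work: a feed-ranking score built from a handful of named, weighted signals, plus a plain-language summary of those weights. All parameter names and values here are hypothetical, not Facebook’s real ones.

```python
# Hypothetical illustration of Article 29-style parameter transparency.
RANKING_PARAMETERS = {
    "predicted_reactions": 1.0,   # chance the user reacts (like, angry, ...)
    "predicted_comments": 15.0,   # active engagement weighted above views
    "predicted_reshares": 30.0,   # weighted highest -- the virality driver
}

def score(signals: dict[str, float]) -> float:
    """Feed-ranking score: a weighted sum of the disclosed signals."""
    return sum(RANKING_PARAMETERS[name] * value
               for name, value in signals.items())

def explain_main_parameters() -> str:
    """The user-facing text such transparency could require: which signals
    order the feed, and their relative importance."""
    lines = ["Your feed is ordered by these signals (highest weight first):"]
    for name, weight in sorted(RANKING_PARAMETERS.items(),
                               key=lambda kv: kv[1], reverse=True):
        lines.append(f"  - {name} (weight {weight})")
    return "\n".join(lines)

print(explain_main_parameters())
```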
To further benefit and protect users, Article 29 should also be strengthened by mandating that recommender systems no longer be based on data profiling by default. Making such profiling opt-in (rather than opt-out) would ensure that recommender systems are not personalised to a user’s online activity and past behaviour without the user being fully aware and explicitly consenting beforehand. This measure would put the brakes on a foundational element of how toxic content currently spreads on Facebook. To increase diversity and user choice over time, Article 29 should additionally be modified to allow third parties to offer alternative recommender systems on very large online platforms. Allowing users to choose recommender systems from outside the four walls of Facebook would let healthier algorithms, optimised for factors beyond mere engagement, flourish over time.
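As a thought experiment, here is a minimal sketch, assuming entirely hypothetical interfaces, of how these two amendments could fit together in software: behavioural profiling data reaches the recommender only after explicit opt-in, and a user may select a third-party ranker in place of the platform default.

```python
# Hypothetical interfaces only -- no platform's real architecture.
from typing import Optional, Protocol

class Recommender(Protocol):
    def rank(self, posts: list[str], history: Optional[list[str]]) -> list[str]: ...

class ChronologicalRecommender:
    """Non-personalised default: ignores behavioural history, newest first."""
    def rank(self, posts: list[str], history: Optional[list[str]]) -> list[str]:
        return list(reversed(posts))  # assume posts arrive oldest-first

class DiversityRecommender:
    """Stand-in for a third-party ranker optimised for something other than
    engagement -- any non-engagement ordering serves the illustration."""
    def rank(self, posts: list[str], history: Optional[list[str]]) -> list[str]:
        return sorted(posts)

def build_feed(posts: list[str], opted_in: bool,
               chosen: Optional[Recommender], history: list[str]) -> list[str]:
    # Behavioural history is passed through only after explicit opt-in.
    recommender = chosen if chosen is not None else ChronologicalRecommender()
    return recommender.rank(posts, history if opted_in else None)

# Without opt-in, the feed is chronological and no profiling data is used.
print(build_feed(["post A", "post B"], opted_in=False, chosen=None,
                 history=["watched outrage clip"]))
```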
The EU Must Seize This Historic Opportunity to Lead and Protect Its Citizens
Frances Haugen’s revelations make it clear that Facebook is incapable of fixing itself. Amidst a global crisis born out of online misinformation and corporate mismanagement, the European Parliament must therefore seize its golden opportunity, put people’s rights before Big Tech profits, and vote for the strongest possible version of the DSA.
In doing so, it is essential for MEPs to bear in mind that the new law will only be as powerful as its enforcement mechanisms. Regulators must accordingly be vested with meaningful powers to supervise the tech titans and take strong action if companies fail to address identified risks. Supervisory powers must be both independent and robust, as must auditing powers. Every auditor should have guaranteed independence, expertise in platform design, and access to all relevant information and data -- including access to the platforms’ algorithms. Relatedly, Article 31 on data scrutiny should be improved by widening the definition of “vetted researchers” to include civil society and investigative journalists. MEPs must also remain watchful and reject any proposal to include a media exemption clause in the DSA. Ring-fencing “media” content would be worse than today’s status quo, because platforms would be actively prevented from taking voluntary remedial action to stem the spread of orchestrated disinformation, violence, and hate.
Frances Haugen will be giving live testimony to different committees of the European Parliament on 8th November 2021 from 16:45 - 19:30 CET. You can watch the livestream here.
To learn more about the essential DSA package, click here.