Who Writes the Rules?

Six campaigners highlight marginalised people’s exclusion from the process of writing the rules that govern the online experience

Over the next few weeks, policymakers in the European Union are gearing up for a crucial round of voting on the Digital Services Act (DSA) - a legislative proposal aimed at governing digital content and laying out the roles and responsibilities of Big Tech platforms. Yet all too often, marginalised groups - often those most affected by the issues the DSA is trying to address - are excluded from these processes.

That’s why six women have come together to form Who Writes The Rules. They want to highlight the fact that marginalised people are routinely excluded from the process of writing the rules that govern the online experience - because the European Commission’s employees are overwhelmingly white and male. And they want to draw attention to the fact that the rules being written often don’t include adequate enforcement to protect those facing the brunt of Big Tech harm.

The Who Writes the Rules campaigners come from a variety of backgrounds: they research and prevent online abuse, run tech companies, support refugee and immigrant women to code, fight gender-based violence and advocate for women’s rights. They have come together to represent some of the people disproportionately impacted by the systemic threats to their online experience.

Below are excerpts from their stories - the full stories can be found at https://www.whowritestherules.online/ourstories

Aina Abiodun: The cost of their enrichment is my continued oppression

As a Black tech entrepreneur, Aina Abiodun has had to curate her online experience. Using digital platforms is a professional necessity, and she feels forced, against her own principles, to participate in (and fund) the continuation of their oppressive practices.

She wants policymakers to pay closer attention to these seeming ‘technicalities’ and intricacies of technological oppression, arguing that ignoring them amounts to complicity.

“In the West, the dismantling of the legacy of colonialism and race-based oppression must include a rigorous investigation into the ways in which white power is perpetuated online and reinforces the never-ending metamorphosis of a vile and immoral anti-Black, anti-femme agenda.” - Aina Abiodun

Read Aina Abiodun’s full story here: https://www.whowritestherules.online/stories/abiodun

Asha Allen: The Brussels Bubble: Advocating for the rights of marginalised women and girls in EU tech policy

As a young Black woman advocate in the Brussels political bubble, Asha Allen has personal experience of the political exclusion of marginalised and racialised communities that continues to characterise the European decision-making space. This is mirrored by the same lack of inclusion in the digital and tech sphere, which remains overwhelmingly male and pale.

“In the case of online violence, the experience of Black women [...] represents not only some of the worst manifestations of the systemic issues regarding harm in the online space, but how our continued exclusion from these decision-making spaces only further exacerbates online violence despite efforts to combat it.” - Asha Allen

Asha Allen collaborates with activists leading the charge for digital citizenship and transformative change - and they will be watching and holding decision-makers and Big Tech to account.

Read Asha’s full story here: https://www.whowritestherules.online/stories/allen

Dr Carolina Are: Bodies have rights just like words do
Dr Carolina Are has found support and education as a survivor in sex-positive networks and spaces online. Those spaces helped her to love her body again after abuse. But now, those spaces and networks are under threat:

“Simply because social media platforms have decided that nudity - aka women’s and marginalised users’ bodies - are inappropriate, risky and worth censoring. I don’t want to lose those networks and the opportunities they provide, and I don’t want people who go through my same experiences to be left unsupported.” - Dr Carolina Are

She wants the Digital Services Act legislative process to include those affected by the policies, and not to see a repeat of previous examples that have led to blanket censorship of bodies.

Read Dr Carolina Are’s full story here: https://www.whowritestherules.online/stories/are

Hera Hussain: Decolonising digital rights

Hera Hussain talks about the need for a decolonised approach to digital rights, because Eurocentricity in digital rights discussions is exclusionary and short-sighted. For example, videos containing disinformation in languages other than English take longer to take down, simply because YouTube hasn’t invested in staff in the relevant countries. And while everyone worldwide was made aware of fact-checking features during elections in the US, disinformation flourishes in Hungary and Myanmar.

“We need radical reform but one that works for everyone. When we talk about reform, let’s not forget that the ripples of policies in Europe can create a tsunami in the rest of the world. Though courts see jurisdictions - the web sees none.” - Hera Hussain

Read Hera Hussain’s full story here: https://www.whowritestherules.online/stories/hussain

Dr Nakeema Stefflbauer: #defundbias in online hiring and listen to the people in Europe whom AI algorithms harm

Having lived and worked in Europe for close to a decade, Dr Nakeema Stefflbauer knows that biased hiring practices are far from unusual. And now that many EU institutions receive thousands of applications per job, more and more employers are moving to artificial intelligence (AI) hiring algorithms.

These algorithms make it far too easy to simply filter out candidates based on criteria such as their religious background, age or education. The algorithms will even select “top candidates” for the employer. But no one knows exactly what information those matches are based on. In this way, AI hiring algorithms may unfairly exclude people from job opportunities - without them ever knowing why.

Dr Nakeema Stefflbauer asks,

“What if we looked at the reality of employment discrimination in Europe and whom it actually harms? What if hiring bias in Europe was addressed with input from people with actual lived experience of the problem?”

Read her full story here: https://www.whowritestherules.online/stories/stefflbauer

Raziye Buse Çetin: The absence of marginalised people in AI policymaking

Raziye Buse Çetin has frequently witnessed how people of colour are almost totally absent from AI policy conversations. But she sees this as more than an issue of representation: AI systems can also inherently contain bias. Yet currently it’s impossible even to measure algorithmic bias related to race, since the EU forbids collecting such data.

For example, people of colour have shared their ongoing and traumatic experiences of not being recognised by AI security machines at the airport, and how this can automatically place them under suspicion. Most of the people involved in AI policymaking simply do not have this lived experience.

Raziye Buse Çetin says there is a reluctance in the EU to acknowledge racism, and to call out its historic roots in European colonialism.

“With the inclusion of racialised people and welcoming policies, the EU needs to adopt a racial equity approach in AI policy and understand how discrimination and inequity manifest in AI in the EU. Algorithmic bias is only one of the visible results of many intertwined forms of inequity; but the problem has deeper roots.” - Raziye Buse Çetin
Read her full story here: https://www.whowritestherules.online/stories/cetin