Two Ethiopians have filed a lawsuit in Kenya’s High Court in Nairobi against US tech giant Meta, the owner of Facebook, for failing to prevent the spread of harmful content which allegedly contributed to the killing of an academic.
Meareg Amare, a professor at Bahir Dar University in northern Ethiopia, was hunted down and killed in November 2021, weeks after posts inciting hatred and violence against him spread on Facebook, according to the suit filed by his son Abraham Meareg and Fisseha Tekle, a legal advisor at Amnesty International.
The case claims that Facebook only removed the hateful posts eight days after Professor Meareg’s killing, more than three weeks after his family had first alerted the company.
According to Amnesty International, which has joined six other human rights and legal organisations as interested parties in the case, the suit alleges that Meta promoted speech that led to ethnic violence and killings in Ethiopia by using an algorithm that prioritises and recommends hateful and violent content on Facebook.
The petitioners seek to stop Facebook's algorithm from recommending such content to users and to compel Meta to create a $1.6bn victims' fund.
The individual petitioners are represented by Mercy Mutemi of Nzili and Sumbi Advocates, supported by Foxglove, the tech-justice nonprofit.
Tekle, who lives in Kenya and fears returning to Ethiopia, said he has been subjected to a stream of hateful posts on Facebook for his work exposing human rights violations in Ethiopia.
“In Ethiopia, the people rely on social media for news and information. Because of the hate and disinformation on Facebook, human rights defenders have also become targets of threats and vitriol. I saw first-hand how the dynamics on Facebook harmed my own human rights work and hope this case will redress the imbalance,” he said.
Oversight Board calls for more action
Content moderation has become a particularly fraught issue for Meta in Ethiopia given the currently frozen civil war between the federal government and its allies on one side and the Tigray People's Liberation Front on the other, a conflict estimated to have killed hundreds of thousands of people and left millions homeless.
In December 2021, Meta's Oversight Board – tasked with making independent decisions regarding content on Facebook and Instagram – recommended that the firm commission an independent human rights assessment of Facebook and Instagram's role in exacerbating the risk of ethnic violence in Ethiopia, and evaluate how well it can moderate content in the country's languages.
But in its recently released third-quarter transparency report, the Oversight Board said that "Meta reported implementation or described [it] as work Meta already does but did not publish information to demonstrate implementation."
If Tekle and Meareg’s action succeeds, the US tech giant – which contracts the services of more than 15,000 content moderators globally – could be forced to make changes to its algorithm in 2023.
The case is not the first time Meta has faced significant scrutiny over its content moderation practices in Africa. In February, Sama AI, the company's Nairobi-based third-party content moderation contractor, was accused by Daniel Motaung, a former employee from South Africa, of providing unsafe and unfair working conditions and of terminating his employment in 2019 while he was attempting to start a union. Motaung has since launched legal action against Meta and the contractor.

This article originally appeared in Tech54, the African Business newsletter that takes an incisive look at the continent's tech scene.