A lawsuit against Facebook’s parent company Meta was filed earlier today in Kenya’s High Court over its alleged role in fueling violence and hate in eastern and southern Africa.
The case claims that Meta has failed to employ enough safety measures on Facebook, which has in turn fueled conflict that has led to deaths, including those of an estimated 500,000 Ethiopians during the recently ended Tigray war.
The petitioners are Kenyan rights group Katiba Institute, Ethiopian researcher Fisseha Tekle, and Abrham Meareg, whose father, Professor Meareg Amare, was killed during the Tigray war. They claim Facebook amplified hateful content and failed to employ enough personnel with an understanding of local languages to moderate it.
The petitioners are demanding that Facebook stop and demote viral hate, employ enough content moderators at its content moderation hub in Kenya, and create a restitution fund of $1.6 billion.
Meta and Sama, its main subcontractor for content moderation in Africa, are already facing another lawsuit in Kenya for forced labor and human trafficking, unfair labor relations, union busting and failure to provide “adequate” mental health and psychosocial support.
“Nairobi has become a Hub for Big Tech, and that’s notable. What we must fight against endlessly is Big Tech’s abuse of Nairobi as a base to export human suffering to Africans. Big Tech must put respect for human rights at the forefront; design AI in a way that puts people first, not profit; and resource that hub properly,” the petitioners’ lawyer, Mercy Mutemi of Nzili and Sumbi Advocates, said in a statement.
“Not investing adequately in the African market has already caused Africans to die from unsafe systems. We know that a better Facebook is possible – because we have seen how preferentially they treat other markets. African Facebook users deserve better. More importantly, Africans deserve to be protected from the havoc caused by underinvesting in protection of human rights,” said Mutemi.
Whistleblower Frances Haugen previously accused Facebook of “literally fanning ethnic violence” in Ethiopia, and a recent Global Witness investigation also found that the social site was “extremely poor at detecting hate speech in the main language of Ethiopia.”
The Global Witness investigation followed Meta’s own stated position that Ethiopia was one of its “highest priorities for country-specific interventions to keep people safe given the risk of conflict.”
Responding to the new claims, a Meta spokesperson told TechCrunch: “We have strict rules which outline what is and isn’t allowed on Facebook and Instagram. Hate speech and incitement to violence are against these rules and we invest heavily in teams and technology to help us find and remove this content.”
“Our safety and integrity work in Ethiopia is guided by feedback from local civil society organizations and international institutions. We employ staff with local knowledge and expertise, and continue to develop our capabilities to catch violating content in the most widely spoken languages in the country, including Amharic, Oromo, Somali and Tigrinya.”
Updated to include Meta’s comment.