Friday, November 15, 2024
  • Ethiopia has been one of our highest priorities for country-specific interventions to keep people safe given the risk of conflict. 
  • For more than two years, we’ve invested in safety and security measures in Ethiopia, including building our capacity to catch hateful and inflammatory content in the languages most widely spoken in the country, even though lower internet adoption means that less than 10% of the population uses Facebook.
  • Since the situation began to deteriorate, we have taken a number of additional steps to address violating content across our platforms and to help keep people safe.

Recent events have focused the world’s attention on the conflict in Ethiopia. Our thoughts are with the people of Ethiopia, both in the country and in the diaspora, during this difficult time. But while the international attention that these events are getting may be new, our work to prevent our platform from being abused in Ethiopia is not. 

For more than two years, we’ve been implementing a comprehensive strategy to keep people in the country safe on our platform given the severe, longstanding risks of conflict.

Longstanding Safety Measures in Ethiopia

Two years ago we moved Ethiopia to the category of countries that we believe are at the highest risk for conflict and violence, enabling the development of both proactive solutions that we can implement when crises arise and a long-term strategy to keep people safe. We’ve been doing this even though the country’s lower internet adoption means that less than 10% of the population uses Facebook. For the millions of Ethiopians who rely on our services as a source of information and communication, our focus is threefold: 

  • Removing content that violates our policies, 
  • Respecting people’s right to free expression, and 
  • Helping to keep people safe both online and offline.

Ethiopia is an especially challenging environment in which to address these issues, in part because many languages are spoken in the country. Over the past two years, we’ve significantly improved our reporting and enforcement tools. We can now review content in the country’s four most widely spoken languages, which are also central to the conflict (Amharic, Oromo, Somali, Tigrinya). We’ve also made it easier for Ethiopians, as well as specialized international and local human rights and civil society organizations, to flag potentially violating content so we can investigate it. We also have technology to identify hate speech in Amharic and Oromo before anyone reports it to us. These efforts are industry-leading.
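
As an illustration of what proactive detection means in practice, the sketch below shows a generic triage loop in which a language-specific classifier scores new posts and queues likely violations for human review before any user report arrives. The `Post` class, `hate_speech_score` stub, language codes, and threshold are hypothetical placeholders for illustration, not a description of Facebook’s actual systems.

```python
# Illustrative sketch only: a generic "proactive detection" loop in which a
# language-specific classifier scores new posts and routes likely violations
# to human review before any user report arrives. The model, threshold, and
# names below are hypothetical, not Facebook's actual systems.

from dataclasses import dataclass

SUPPORTED_LANGUAGES = {"am", "om"}   # Amharic, Oromo (ISO 639-1 codes)
REVIEW_THRESHOLD = 0.8               # hypothetical confidence cut-off

@dataclass
class Post:
    post_id: str
    text: str
    language: str                    # assumed output of a language-ID step

def hate_speech_score(post: Post) -> float:
    """Placeholder for a trained classifier; returns a probability-like score."""
    # A real system would call an ML model here; this stub always returns 0.0.
    return 0.0

def triage(posts: list[Post]) -> list[str]:
    """Return IDs of posts queued for human review, without waiting for reports."""
    queued = []
    for post in posts:
        if post.language in SUPPORTED_LANGUAGES and hate_speech_score(post) >= REVIEW_THRESHOLD:
            queued.append(post.post_id)
    return queued

if __name__ == "__main__":
    sample = [Post("1", "example text", "am"), Post("2", "example text", "en")]
    print(triage(sample))
```

In a real pipeline, the stub scorer would be replaced by a trained multilingual model, and queued items would flow to reviewers with the relevant language expertise.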

As a result of these efforts, between May and October 2021 we took action on more than 92,000 pieces of content on Facebook and Instagram in Ethiopia for violating our Community Standards on hate speech, about 98% of which we detected before anyone reported it to us.

In June 2021, we also removed a network of fake accounts posting commentary critical of opposition politicians and groups in Amharic. The people behind these posts used coordinated, inauthentic accounts as a central part of their efforts to mislead people about who they were and what they were doing. Earlier, in March 2021, we removed a network of accounts in Egypt that targeted Ethiopia, Sudan, and Turkey.

Additional Safety Efforts In Response to Recent Events 

As the local situation deteriorated, and as we approached elections in June and again in September, we took a number of additional steps:

  • Reducing potentially violating content: To limit possibly viral content, we’re continuing to reduce the distribution of content that has been reshared by a chain of two or more people. We’re also continuing to reduce the distribution of content that our proactive detection technology identifies as likely to violate our policies against hate speech, as well as content from accounts that have recently and repeatedly posted violating content (a simplified sketch of this kind of demotion follows this list).
  • New Classification: In line with our violence and incitement policy, we have classified all of Ethiopia as a ‘Temporary High Risk Location’ and will remove content calling for people to bring or carry weapons to specified locations, or to take up arms. We will also remove content containing veiled threats of violence, such as general calls for revenge or action, or statements that a target group will, for example, “pay the price.”
  • New Designations: In recent weeks, in line with our policy banning Violent Non-State Actors, we’ve taken enforcement action against several groups and individuals inciting violence in Ethiopia, most recently the Oromo Liberation Army (OLA) and Abba Torbee. We’re also continuing to remove content that provides material support to these groups or praises the violence they commit.
  • Improving hate speech enforcement: We’ve long had a robust hate speech policy in place, which we use to remove content that attacks people based on protected characteristics like ethnicity and race. Over the past year, we have expanded this enforcement to cover a more extensive list of slurs across the four main Ethiopian languages. We’re also continuing to monitor and address emerging patterns of hateful speech and trending content.
  • Removing harmful misinformation: We have long removed misinformation when there is a risk it may contribute to physical harm. In Ethiopia, we have also identified and are removing a number of persistent harmful false claims and out-of-context imagery that make false allegations about the perpetrators, severity, or targets of violence. This decision was based on guidance from over 50 local partners and independent experts.
  • Violence & Incitement: We’ve temporarily expanded our policies on coordinating harm and will remove content that claims individuals are spies, traitors, or informants, or that encourages others to make such claims.
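
To make the demotion approach mentioned in the first bullet above more concrete, here is a minimal sketch of how a ranking score might be reduced for deep reshare chains, for content a classifier flags as likely violating, and for posts from accounts with recent violations. The depth cut-off, multiplier, and field names are assumptions for illustration, not Facebook’s published ranking parameters.

```python
# Illustrative sketch only: demoting the ranking score of reshared content once
# the reshare chain reaches a given depth, and of content a proactive classifier
# flags as likely violating. The cut-off, weights, and names are hypothetical.

RESHARE_DEPTH_LIMIT = 2      # "shared by a chain of two or more people"
DEMOTION_FACTOR = 0.5        # hypothetical down-ranking multiplier

def adjusted_score(base_score: float,
                   reshare_depth: int,
                   likely_violating: bool,
                   author_recent_violations: int) -> float:
    """Apply soft demotions to a feed-ranking score; removal is handled separately."""
    score = base_score
    if reshare_depth >= RESHARE_DEPTH_LIMIT:
        score *= DEMOTION_FACTOR
    if likely_violating:
        score *= DEMOTION_FACTOR
    if author_recent_violations > 0:
        score *= DEMOTION_FACTOR
    return score

# Example: a deep reshare chain from a repeat violator is demoted heavily.
print(adjusted_score(base_score=1.0, reshare_depth=3,
                     likely_violating=True, author_recent_violations=2))
```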

Then, as now, our teams have been working around the clock, and we’ve activated our Integrity Operation Center, bringing together subject matter experts from across the company to respond to problems and abuse in real time.

Given the rapidly evolving situation, and informed by conversations about security concerns we’ve had with human rights activists, journalists, and civil society groups in Ethiopia and the diaspora, we’ve taken additional steps in recent days. We recently launched a new safety feature in Ethiopia called Lock Profile, which prevents anyone who isn’t a person’s friend from downloading, enlarging, or sharing their profile photo. It also prevents non-friends from seeing posts or other photos on their timeline, regardless of when they were posted. We’ve also put temporary measures in place to restrict views of people’s Friends Lists on their profile pages and to remove results from “Search this profile.” 
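
Below is a minimal sketch of the kind of access check a feature like Lock Profile implies: non-friends lose access to the full-size profile photo, timeline posts, and the Friends List. The data model and function names are hypothetical and greatly simplified relative to any real privacy system.

```python
# Illustrative sketch only: a simplified access check for a locked profile.
# Non-friends are restricted from the full-size profile photo, timeline posts,
# and the friends list. Names and structure are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Profile:
    owner_id: str
    locked: bool = False
    friends: set[str] = field(default_factory=set)

def can_view(profile: Profile, viewer_id: str, item: str) -> bool:
    """item is one of: 'profile_photo_full', 'timeline_post', 'friends_list'."""
    if viewer_id == profile.owner_id or viewer_id in profile.friends:
        return True
    if profile.locked:
        # Non-friends are restricted regardless of when the content was posted.
        return False
    # Unlocked profiles fall back to the item's normal audience settings (omitted);
    # here the friends list is treated as restricted by a temporary measure.
    return item != "friends_list"

# Example: a locked profile hides timeline posts from non-friends.
p = Profile(owner_id="owner", locked=True, friends={"alice"})
print(can_view(p, "bob", "timeline_post"))    # False
print(can_view(p, "alice", "timeline_post"))  # True
```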

While safety work in Ethiopia has been going on for a long time, we know that the risks on the ground right now are higher. And because local context and language-specific expertise are essential for this work, we will remain in close communication with people on the ground, along with partner institutions and non-governmental organizations, in the days and weeks ahead. This will help us take the right actions and make the right calls. We remain vigilant to emerging trends and stand ready to take additional action to meet the demands of this ongoing human rights situation.
