Monday, December 23, 2024

When we launched the Oversight Board, we committed to considering and transparently responding to all of the board's recommendations.

Today, we’re publishing our first quarterly update, covering Q1 2021, which provides 1) information about cases that Facebook has referred to the board and 2) an update on our progress implementing the board’s recommendations. These quarterly updates are designed to provide regular check-ins on the progress of this long-term work, while sharing more about how we approach these challenges. They are meant to hold us accountable to the board and the public.  

Facebook Referrals to the Board

In addition to providing users with the ability to appeal content decisions directly to the board, we regularly and proactively identify some of the most significant and difficult content decisions we've made on our platform and ask the board to review them. While the board notes when Facebook has referred a case, we haven't previously disclosed details about referred cases that the board did not select.

We refer cases involving issues that are severe, large-scale, and/or important for public discourse. We also look for content decisions that raise questions about current policies or their enforcement, with strong arguments both for removing and for leaving up the content under review. We discussed how we prioritize content decisions for referral to the board in our Newsroom.
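To make these criteria concrete, here is a minimal, purely illustrative sketch of how signals like severity, scale, public-discourse relevance, and policy ambiguity could be combined to rank candidate cases. The field names, weights, and scoring function are hypothetical assumptions for illustration, not a description of Facebook's actual referral process.

```python
from dataclasses import dataclass

@dataclass
class CandidateCase:
    """A content decision under consideration for referral (hypothetical fields)."""
    case_id: str
    severity: float          # 0-1: potential real-world harm
    scale: float             # 0-1: how widely the decision affects people on the platform
    public_discourse: float  # 0-1: importance to public debate
    policy_ambiguity: float  # 0-1: strength of the arguments on both sides

def referral_priority(case: CandidateCase) -> float:
    """Combine the signals into a single ranking score (weights are illustrative)."""
    return (
        0.35 * case.severity
        + 0.25 * case.scale
        + 0.20 * case.public_discourse
        + 0.20 * case.policy_ambiguity
    )

# Rank candidates so the most significant and difficult decisions surface first.
candidates = [
    CandidateCase("case-a", severity=0.9, scale=0.6, public_discourse=0.8, policy_ambiguity=0.7),
    CandidateCase("case-b", severity=0.4, scale=0.9, public_discourse=0.3, policy_ambiguity=0.5),
]
ranked = sorted(candidates, key=referral_priority, reverse=True)
print([c.case_id for c in ranked])
```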

Facebook teams with expertise in our content policies, our enforcement processes, and cultural context in regions around the world review the candidate cases and provide feedback on their significance and difficulty. We refer the most significant and difficult content decisions to the board, which has sole discretion to accept or decline them. As with appeals, the board's decisions are binding. From November 2020 through March 31, 2021, we referred 26 content decisions to the board, and the board selected three: a case about supposed COVID-19 cures; a case about a veiled threat based on religious beliefs; and a case about the decision to indefinitely suspend former US President Donald Trump's account.

Our Progress on Non-binding Recommendations

In the first quarter of 2021, the board issued 18 recommendations across six cases. We are implementing 14 recommendations fully or in part, still assessing the feasibility of implementing three, and taking no action on one. The size and scope of the board's recommendations go beyond the policy guidance we anticipated when we set up the board, and several require multi-month or multi-year investments. The recommendations touch on how we enforce our policies, how we inform users of the actions we've taken and what they can do about them, and additional transparency reporting. We welcome these recommendations: the changes they have sparked make Facebook more transparent with users and the public, more consistent in how we apply our policies, and more proportional in our enforcement.

For example, last quarter, in response to the board's recommendations, we launched and continue to test new user experiences that are more specific about why we remove content. We've made our hate speech notifications more specific by using an additional classifier that predicts which kind of hate speech the content contains: violence, dehumanization, mocking hate crimes, visual comparison, inferiority, contempt, cursing, exclusion, and/or slurs. People using Facebook in English now receive more specific messaging when they violate our hate speech policy, and we'll roll out these more specific notifications to other languages in the future. And, as a result of the board's recommendations, we're running tests to assess the impact of telling people whether automation was involved in an enforcement decision. Additionally, we've updated our Dangerous Individuals and Organizations policy, creating three tiers of content enforcement for different designations of severity and adding definitions of key terms.
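As a rough sketch of the mechanism described above, the snippet below shows one way a classifier's predicted hate speech subtype could be mapped to a more specific user notification, with a fallback to a generic message. The label names mirror the categories listed in this update, but the message text, function, and confidence threshold are hypothetical and do not represent Facebook's actual implementation.

```python
# Hypothetical mapping from predicted hate speech subtypes (the categories named
# above) to more specific notification text. Illustrative only.
SPECIFIC_MESSAGES = {
    "violence": "Your post was removed because it appears to threaten violence against a group.",
    "dehumanization": "Your post was removed because it appears to dehumanize a group of people.",
    "mocking_hate_crimes": "Your post was removed because it appears to mock hate crimes or their victims.",
    "visual_comparison": "Your post was removed because it appears to make a demeaning visual comparison.",
    "inferiority": "Your post was removed because it appears to claim a group of people is inferior.",
    "contempt": "Your post was removed because it appears to express contempt toward a group of people.",
    "cursing": "Your post was removed because it appears to curse at a group of people.",
    "exclusion": "Your post was removed because it appears to call for excluding a group of people.",
    "slurs": "Your post was removed because it appears to contain a slur.",
}

GENERIC_MESSAGE = "Your post was removed because it goes against our hate speech policy."

def build_notification(predicted_labels: dict[str, float], confidence_threshold: float = 0.8) -> str:
    """Return the most specific message whose label clears the confidence threshold."""
    best_label = None
    best_score = confidence_threshold
    for label, score in predicted_labels.items():
        if label in SPECIFIC_MESSAGES and score >= best_score:
            best_label, best_score = label, score
    # Fall back to the generic hate speech notification when no subtype is confident enough.
    return SPECIFIC_MESSAGES.get(best_label, GENERIC_MESSAGE)

# Example: the classifier is confident the content contains dehumanizing speech.
print(build_notification({"dehumanization": 0.93, "slurs": 0.41}))
```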

We hope our responses also add to the dialogue around the challenges of moderating content at scale by providing more insight into the tradeoffs involved. Where we disagree in whole or in part with a board recommendation, or where implementation will take a long time, we explain why.

Future Reporting

The board's impact comes not only from its binding decisions and recommendations on our policies and processes, but also from the public discourse surrounding the cases. We welcome feedback and review from the board, as well as from the public, on our implementation of the recommendations and on how we can continue to improve.

See the full update for more information.
