Monday, December 23, 2024

Facebook gives everyone a voice, no matter what your political views may be, which candidates or parties you support, or which causes you advocate. Facebook should be safe for both people and businesses. Whether an ad is about elections or a business, we are focused on ensuring our ads tools aren't misused to discriminate or to polarize society. There has been some debate around our political ad transparency and enforcement in India. Recent representations of our election integrity efforts have willfully ignored critical facts.

We want to be unambiguous: we have always been an open, transparent and non-partisan platform where people can express themselves freely. We apply our policies uniformly, without regard to anyone’s political positions or party affiliations and without devaluing the principles of free expression and democracy. 

We have learnt from over 200 elections globally, and we can now anticipate threats and help prevent election interference better than before. We continue to build new tools, including stronger AI and machine learning systems, and as a result we're now more effective at finding and removing abuse and fake accounts.

We're also bringing unprecedented transparency to political advertising with our labels and Ad Library. We believe every voter deserves transparency as they participate in political discussions and debate. That is why we are clarifying our political ad policies and their enforcement.

  1. Our Ad Policies level the playing field for everyone. We allow political ads on Facebook, and all political advertisers are subject to our Ads Terms, Terms of Service and Community Standards in the same way as businesses, both large and small. We've also put tools into the hands of millions of small businesses around the world that were previously available only to the largest corporations. This holds true for political advertisers as well: our ads policies apply equally to all advertisers.
  2. How much advertisers pay is determined by an auction: Our auction doesn't make pricing decisions based on political viewpoint. Determining how much an advertiser pays through an ad auction is not unique to Meta; this is how much of digital advertising works. The assertion that any political party in India got discounted rates on ads because of their political affiliation is factually inaccurate. All ads, from all advertisers, compete fairly in the same auction. Ad pricing will vary based on the parameters set by the advertiser, such as their targeting and bid strategy. You can read more about our ad platforms and how they work; a simplified auction sketch also appears after this list.
  3. Our political advertising transparency efforts: We require anyone running political ads on Facebook and Instagram to be authorized and to include disclaimers on their ads, so people can see the name of the person or organization running them. In India, we tightened the disclaimer options available to advertisers and require additional credentials to increase their accountability. For example, if we discover that the phone number, email address or website listed in a disclaimer is no longer active or valid, we ask the advertiser to update it. If they do not, they will no longer be able to use that disclaimer to run ads about elections or politics.
    Our Ad Library stores all ads about elections or politics for seven years, along with information such as the budget associated with an individual ad, a range of impressions, and the demographics of who saw the ad; an example of querying this library appears after this list. This effort has set new benchmarks in how the industry deals with political advertising.
    Anyone willing to advertise can run ads about elections or politics, provided the advertiser complies with all applicable laws and the authorization process required by Meta on its platforms. We don’t allow ads that violate our Ad Policies, and we disable ads flagged to us by the Election Commission of India which we find in violation of local election laws.
  4. Election Laws and Social Media Intermediaries: While we comply with the election-related obligations that apply to social media intermediaries, advertising laws in this area are still evolving. To address this gap, industry players came together under the banner of IAMAI, a leading trade body, and volunteered to follow a set of best practices approved by the Election Commission of India (ECI). The code applies to all the major social media intermediaries that signed up to it, to ensure transparency for political advertising. Participants have followed this self-regulatory code since 2019, and new members have joined since then. The transparency offered by Meta's ads transparency tools is the basis on which the ECI acts against violators on the platform.
  5. Coordinated Inauthentic Behavior enforcement is linked to behavior, not content: Some recent media reports conflate enforcement against coordinated inauthentic behavior with action to limit the virality and reach of political advertising. This is not how our enforcement against coordinated inauthentic behavior works. When we enforce against coordinated inauthentic behavior, we focus on behavior, not content. In fact, in many cases the content shared by influence operations isn't verifiably false and may in fact be copied from the authentic communities these deceptive campaigns are trying to mimic or reach.
    Over the past several years, we’ve worked to make it harder for people who misrepresent themselves to operate on Facebook. We’ve improved our technologies so that we can more effectively detect and block fake accounts, which are the source of a lot of the inauthentic activity. When we remove accounts or Pages for engaging in coordinated inauthentic behavior, we are looking for signs of groups of people working together in a coordinated way, while centrally relying on fake accounts, to mislead people about who they are and what they’re doing.
  6. Our algorithms cannot be tweaked to push the virality of a piece of content: Feed is personalized and shaped heavily by the choices and actions of the individual user. It is made up primarily of content from the friends and family they choose to connect with on the platform, the Pages they choose to follow, and the Groups they choose to join. Ranking is then the process of using algorithms to order that content for each person, as the sketch after this list illustrates. We do not tweak our algorithms to suit a particular user.
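To make the auction point in item 2 concrete, here is a deliberately simplified Python sketch of an ad auction of the kind we describe, in which each candidate ad is scored on its total value: the advertiser's bid weighted by an estimated action rate, plus an ad quality term. The class, weights and numbers below are illustrative assumptions, not production code; the point is that the advertiser's identity never enters the score.

```python
from dataclasses import dataclass

@dataclass
class CandidateAd:
    advertiser: str          # who is running the ad (never used in scoring)
    bid: float               # how much the advertiser is willing to pay
    est_action_rate: float   # estimated chance the viewer takes the desired action
    ad_quality: float        # quality/relevance signal for this viewer

def total_value(ad: CandidateAd) -> float:
    # Simplified "total value" score: bid weighted by estimated action rate,
    # plus an ad quality term. Political affiliation plays no role here.
    return ad.bid * ad.est_action_rate + ad.ad_quality

def run_auction(candidates: list[CandidateAd]) -> CandidateAd:
    # The ad with the highest total value wins the impression.
    return max(candidates, key=total_value)

if __name__ == "__main__":
    candidates = [
        CandidateAd("party_a", bid=2.00, est_action_rate=0.010, ad_quality=0.005),
        CandidateAd("party_b", bid=1.50, est_action_rate=0.020, ad_quality=0.004),
        CandidateAd("retailer", bid=3.00, est_action_rate=0.004, ad_quality=0.006),
    ]
    winner = run_auction(candidates)
    print(f"Winning ad: {winner.advertiser}, total value = {total_value(winner):.4f}")
```

Because the price an advertiser ends up paying depends on their own bid strategy, targeting and the competition for the same audience, effective rates can differ between campaigns without any party receiving preferential pricing.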
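The Ad Library mentioned in item 3 can also be queried programmatically. The sketch below, assuming a valid Ad Library API access token, shows how anyone could retrieve political ads that ran in India from the public ads_archive endpoint. Parameter and field names follow the publicly documented Ad Library API, but exact names and availability can vary by API version, so treat this as an illustration rather than a reference.

```python
import requests

# Placeholder token; a real token is obtained through the identity-confirmation
# process required for the Ad Library API.
ACCESS_TOKEN = "YOUR_AD_LIBRARY_API_TOKEN"

AD_ARCHIVE_URL = "https://graph.facebook.com/v19.0/ads_archive"

params = {
    "access_token": ACCESS_TOKEN,
    "ad_type": "POLITICAL_AND_ISSUE_ADS",  # restrict to political and issue ads
    "ad_reached_countries": "IN",          # ads delivered in India
    "search_terms": "election",            # free-text search
    "fields": ",".join([
        "page_name",                 # who ran the ad, from its disclaimer
        "ad_delivery_start_time",
        "spend",                     # reported as a range
        "impressions",               # reported as a range
        "demographic_distribution",  # who saw the ad
    ]),
    "limit": 25,
}

response = requests.get(AD_ARCHIVE_URL, params=params, timeout=30)
response.raise_for_status()

for ad in response.json().get("data", []):
    spend = ad.get("spend", {})
    impressions = ad.get("impressions", {})
    print(ad.get("page_name"),
          f"spend {spend.get('lower_bound')}-{spend.get('upper_bound')}",
          f"impressions {impressions.get('lower_bound')}-{impressions.get('upper_bound')}")
```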
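Finally, to make the personalization point in item 6 concrete, here is an intentionally simplified sketch of per-viewer ranking: the candidate posts come only from sources the viewer has chosen to connect with, and the ordering is computed from that viewer's own signals. The signal names and weights are invented for illustration and are not our ranking model, which uses many more signals.

```python
from dataclasses import dataclass

@dataclass
class Post:
    source: str                  # a friend, followed Page, or joined Group
    author_affinity: float       # how often this viewer interacts with the source
    predicted_engagement: float  # how likely this viewer is to engage with the post
    recency: float               # freshness of the post, 0..1

def score_for_viewer(post: Post) -> float:
    # Illustrative weights: the score depends on this viewer's own signals,
    # not on any global "virality" dial that could be turned up for one post.
    return 0.5 * post.author_affinity + 0.4 * post.predicted_engagement + 0.1 * post.recency

def rank_feed(candidate_posts: list[Post]) -> list[Post]:
    # Candidates are limited to sources the viewer chose to connect with;
    # ranking only orders that inventory for this one viewer.
    return sorted(candidate_posts, key=score_for_viewer, reverse=True)
```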

Our work around elections is ongoing and will never be done. People trying to disrupt elections are always changing their tactics and often those tactics can impact people in historically marginalized communities. That’s why we are working closely with the civil rights community, law enforcement, and other technology companies, to stay ahead and keep adapting to trends, abuse and threats.
