
Why Claims of Bias in Our Content Review Process Are Wrong

A number of recent public reports have claimed that our content moderation related to the war in Ukraine is biased. At the heart of these accusations is the allegation that those who review content for Meta deliberately censor Ukrainian content whilst freely allowing pro-Russian propaganda.

These accusations are false, and there is no evidence to support them. Preventing bias and making sure we apply our policies accurately and consistently is fundamental to our approach and something we take seriously.

That is why we have dedicated teams that collectively review content on Facebook and Instagram in more than 70 languages, as we recognise that it often takes a local to understand the specific meaning of a word or the political climate in which a post is shared. For example, content from Ukraine requiring language and cultural expertise is checked by Ukrainian reviewers, not Russian ones.

It’s also why we have policies, called our Community Standards on Facebook and Community Guidelines on Instagram, which are by design comprehensive and detailed, so that any two people looking at the same piece of content would reach the same decision. These policies are also the only criteria these teams are allowed to use when reviewing content. They are global and apply equally to everyone who uses our technologies, be they Ukrainians, Russians, Americans or anyone else. Every reviewer undergoes a thorough onboarding process and regular re-training to make sure they properly understand our policies and how to apply them.

Finally, it’s also why we allocate content to reviewers randomly, so they can’t determine what content they will receive for review. We also conduct weekly audits across all review teams, again based on a random sample, meaning it wouldn’t be possible for a reviewer to evade this system. Our regular audits of sites that review Ukrainian- and Russian-language content, as well as other languages across Central and Eastern Europe, have consistently shown high levels of accuracy, in line with our results across other review sites and languages.
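To make the ideas of random allocation and random-sample auditing concrete, here is a minimal, purely illustrative sketch in Python. It is not a description of Meta’s actual systems; the reviewer names, the review queue and the audit sample size are hypothetical, and the only point it demonstrates is that uniform random assignment and uniform random sampling leave a reviewer no way to predict what they will handle or which of their decisions will be re-checked.

```python
import random

# Illustrative sketch only (not Meta's system): generic random assignment of
# queued items to reviewers, plus a uniform random sample of past decisions
# for a weekly audit. All names and sizes below are hypothetical.

_rng = random.SystemRandom()  # OS-backed randomness, not seedable by a reviewer


def assign_randomly(items, reviewers):
    """Assign each queued item to a reviewer chosen uniformly at random,
    so no reviewer can predict or choose what they will receive."""
    return {item: _rng.choice(reviewers) for item in items}


def weekly_audit_sample(decisions, sample_size):
    """Draw a uniform random sample of past decisions for re-review,
    so every decision has the same chance of being audited."""
    return _rng.sample(decisions, min(sample_size, len(decisions)))


if __name__ == "__main__":
    reviewers = ["reviewer_a", "reviewer_b", "reviewer_c"]   # hypothetical
    queue = [f"post_{i}" for i in range(10)]                 # hypothetical
    assignments = assign_randomly(queue, reviewers)
    audited = weekly_audit_sample(sorted(assignments.items()), sample_size=3)
    print(assignments)
    print(audited)
```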

Of course, there will always be things we miss and things we take down by mistake. There is no ‘perfect’ here, as both humans and technology make mistakes. Wars are also complex and fast-moving events where the scope for errors is greater. That’s why we have teams that work around the clock to rectify any errors and provide channels for people to appeal decisions they disagree with.

False assertions designed to undermine trust in both public and private institutions are not new, and are to be expected during a time of conflict. However, to suggest that our content moderation processes, and those who review content for Meta, are deliberately biased in favour of anyone is simply wrong.

For more information about our ongoing efforts regarding Russia’s invasion of Ukraine, including how we address Russian propaganda, information operations and misinformation, please read our newsroom post on this topic. 
