
Facebook’s Ever-Changing Secret Rulebook for Political Speech


Facebook has become an open global platform for sharing ideas, news and information. With the company’s exponential growth has come growing responsibility. Facebook is attempting to tackle propaganda and hate speech through its secret rulebook, which it updates regularly for its global content moderators.

Last Thursday night, the New York Times reported that it had obtained a nearly 1,400-page document that is, in essence, the company’s speech-policing rulebook, leaked by a Facebook employee. The employee said he “feared that the company was exercising too much power, with too little oversight — and making too many mistakes.” In its review of the complex rulebook, the Times revealed a range of biases, gaps and outright mistakes in the policing of posts. In particular, the paper noted contradictions that allow extremist posts in some countries while stifling mainstream speech in others.

The online social media giant, launched by CEO Mark Zuckerberg in 2004, has come under the spotlight as it struggles to monitor billions of posts daily. To do this accurately, it must decipher the complicated context of everyday language, slang, text shortcuts, images and even emojis, in over 100 languages. As per Fox News, a group of Facebook employees meets every other Tuesday to update the rules, trying to distill everything into standardized guidelines that can be translated into a quick allow or delete.

After updating the rules, the company outsources the actual content moderation to other firms, which are likely to hire unskilled, cheap workers. As per the Times report, the roughly 7,500 moderators “have mere seconds to recall countless rules and apply them to the hundreds of posts that dash across their screens each day. When is a reference to ‘jihad,’ for example, forbidden? When is a ‘crying laughter’ emoji a warning sign?”

Some of these moderators spoke to the Times under a shield of anonymity because of their nondisclosure agreements. They said they are pressed to assess a thousand pieces of content daily, with only eight to 10 seconds to spend evaluating each post. This is despite the importance of the job, which has the potential to stop a violent uprising or to favor one political party over another.

The Times probe suggests that Facebook may have too much power over the dissemination of information, permitting some speech while banning other speech. “Facebook’s role has become so hegemonic, so monopolistic, that it has become a force unto itself,” said Jasmin Mujanovic, an expert on the Balkans. “No one entity, especially not a for-profit venture like Facebook, should have that kind of power to influence public debate and policy.”

Monika Bickert, Facebook’s head of global policy management, maintains that the company is doing its best to prevent harm. “We have billions of posts every day, we’re identifying more and more potential violations using our technical systems,” said Bickert. “At that scale, even if you’re 99 percent accurate, you’re going to have a lot of mistakes.”

