Meta oversight board finds censoring of word ‘shaheed’ discriminatory


The board advises Meta to end its ‘blanket ban’ on the Arabic word for ‘martyr’ across Facebook, Instagram and Threads

Nader Durgham

A blue verification badge and the logos of Facebook and Instagram are seen in this picture illustration taken 19 January 2023 (Reuters/Dado Ruvic)

Meta’s Oversight Board, the body in charge of content moderation decisions for the company’s social media platforms, found that censoring the Arabic word “shaheed” has had a “discriminatory impact on expression and news reporting”.

In an investigation conducted at Meta’s request, the board found that the company’s highly restrictive approach to “shaheed”, the most censored word on Facebook and Instagram, has led to “widespread and unnecessary censorship affecting the freedom of expression of millions of users”.

“Shaheed” has several meanings but roughly translates to “martyr” in English. The board found that Meta has struggled to grapple with the linguistic complexities and religious significance attached to the word.

As the word is also used as a loanword in other languages, many non-Arabic speakers, most of them Muslim, have had their posts censored on Meta’s platforms.

Prior to the release of the board’s advisory opinion, Human Rights Watch found that Meta was guilty of “systemic censorship of Palestine content” amidst the Gaza war, which it attributed to “flawed Meta policies and their inconsistent and erroneous implementation, over-reliance on automated tools to moderate content, and undue government influence over content removals.”


The company has also previously removed the accounts of several Palestinian and pro-Palestinian individuals and advocacy groups, which has led to activists accusing it of “taking a side” in the conflict.

‘Discriminatory and disproportionate’

According to the board, the “discriminatory and disproportionate” impact Meta’s restrictive policy has had on information sharing outweighs the company’s concern over the word being used to promote terrorism.

Examples listed by the board include a government sharing a press release confirming the death of an individual, a human rights defender using the word “shaheed” to decry an execution, and a user criticising the condition of a local road whose name includes the honorific “shaheed”.


Meta would remove all of these posts, as it considers the term “shaheed” to violate its policies.

“Meta has been operating under the assumption that censorship can and will improve safety, but the evidence suggests that censorship can marginalise whole populations while not improving safety at all,” said oversight board co-chair Helle Thorning-Schmidt. 

“The reality is that communities worst hit by the current policy, such as those living in conflict zones like Gaza and Sudan, also live in contexts where censorship is rife,” she added.

“The Board is especially concerned that Meta’s approach impacts journalism and civic discourse because media organisations and commentators might shy away from reporting on designated entities to avoid content removals.”  

With many users saying they have been censored on Facebook and Instagram during Israel’s ongoing war on Gaza, the board considered it important to address the targeting of posts containing the word “shaheed”.

The board concluded that Meta should “end the blanket ban on ‘shaheed’ when used in reference to people Meta designates as terrorists” and instead focus on only removing posts that are linked to clear signs of violence (such as imagery of weapons) or when they break the company rules (for example, glorifying an individual designated as a terrorist).
