Facebook’s quasi-independent oversight board (FOB) said Thursday that it had overturned the decision to take down a post shared by a Facebook user in Myanmar that was initially deemed “pejorative or offensive” toward Muslims, concluding that it did not advocate hatred or incite imminent harm.
The post, published in October, includes two widely disseminated pictures of a Syrian toddler who drowned and washed up in the Mediterranean Sea in September 2015, together with text stating that “there is something wrong with Muslims (or Muslim men) psychologically or with their mindset.”
The post also questioned the lack of response from the Muslim population regarding the situation of the Uyghur Muslims in China, compared with the outrage over cartoons of Prophet Muhammad in France, which have been cited as a motive for terrorist attacks. The author argued that the deadly attacks in France had diminished sympathy for the Syrian toddler in the picture, suggesting that the child might have grown up to become an extremist.
The board, dubbed the social media giant’s “supreme court,” argued that while the post might appear insulting to Muslims, it is better read as “a commentary” on the differing reactions of Muslims to the events in France and China. “That expression of opinion is protected under Facebook’s community standards and does not reach the level of hate speech,” it said.
“Facebook’s sensitivity to anti-Muslim hate speech was understandable,” it continued, “particularly given the history of violence and discrimination against Muslims in Myanmar and the increased risk ahead of the country’s general election in November 2020.”
The social network’s oversight board, created in May last year, consists of 20 journalists, politicians, and judges from around the globe. Endy Bayuni, a senior journalist at the English-language media outlet The Jakarta Post, represents Southeast Asia. Bayuni told KrASIA in October that the board takes on cases with “huge impact and real-world consequences,” and that it must reach a decision within a maximum of 90 days.
Initially proposed by Facebook founder Mark Zuckerberg in 2018, the board can reverse or uphold content takedown decisions made by moderators and executives on Facebook and Instagram. Direct messages on Instagram, Facebook Messenger, and WhatsApp are excluded from review. Users who disagree with the platform’s decisions can file an appeal with the FOB.
There have been other rulings around the world. In Brazil, the board ordered the restoration of Instagram photos that showed female nipples and aimed to raise awareness of breast cancer. A Facebook post containing an alleged quote from Joseph Goebbels, Reich Minister of Propaganda in Nazi Germany, was also reinstated, as the author had intended it as a comparison with former US president Donald Trump.
In France, the board overturned the takedown of a post criticizing the lack of a public health strategy, while it upheld Facebook’s decision to remove a post containing a racial slur against Azerbaijanis.