Facebook’s quasi-independent Oversight Board said last Wednesday that the platform’s content moderation mechanism lacks consideration for local context. The board reversed Facebook’s decision to take down a post shared by a user in Myanmar that moderators initially deemed insulting to Chinese people.
This is the second Myanmar-related decision the Oversight Board has overturned, and the ruling typifies issues ingrained in Facebook’s content moderation mechanism that can only be addressed by changing its policies.
The post, published in April, includes political commentary about the Chinese government’s policy in Hong Kong as part of a discussion on its position in Myanmar. With around 500,000 views and 6,000 shares, the post also covers ways to restrict financing to the Myanmar military in the wake of the February 1 coup.
The Oversight Board, dubbed by the media as Facebook’s own “Supreme Court,” ruled that the post was aimed at a state rather than at people targeted for their race or ethnicity. “This case highlights the importance of context when enforcing content policies designed to protect users from hate speech while also respecting political speech,” the board said in its decision.
However, public comments from local rights groups point to problems in the evaluation process that go beyond missing context. “It is not possible to meaningfully assess the decision without seeing the actual content,” two representatives of the Tech Accountability Initiatives (TAI) said. Although the board invites comments from stakeholders to gather more “diverse perspectives,” it does not supply enough information for them to assess the posts. For example, the board does not provide the full text of the post, the author’s background, where the post was published, or the comments and reactions the post generated, according to the TAI.
“We also note concerns over the way Facebook leverages automation in its moderation, pointing in particular to the lack of transparency and consistency in Facebook’s use of slur lists as well as limitations with Facebook’s appeal process,” said the representatives.
Facebook’s decision to automate its moderation of Burmese-language content from 2020 onward has hampered freedom of expression in the country. The platform has used a country-specific list of slurs for moderation for several years. However, the list is not available to the public, raising questions about how it is compiled and how regularly the company reviews it, according to the TAI.
Although the junta started blocking social media platforms including Facebook, Twitter, and Instagram on February 3, local residents have managed to equip themselves with an array of digital tools to skirt the ban. For example, many have downloaded encrypted messaging apps and virtual private networks, or VPNs.
“Facebook has a history of using over-simplistic keyword searches for moderation, with no consideration for the context of the post or human assessments of the intent,” said the TAI representatives, who also pointed to the limitations of the platform’s process for letting users challenge moderation decisions.
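To illustrate why the kind of context-free keyword matching the TAI describes is problematic, here is a minimal hypothetical sketch. The term list, function name, and logic are illustrative assumptions, not Facebook’s actual pipeline, which is not public:

```python
# Hypothetical sketch of context-free keyword moderation, as criticized by the
# TAI. The term list and logic are illustrative assumptions; Facebook's actual
# Burmese-language moderation pipeline is not public.
BLOCKED_TERMS = {"exampleslur"}  # placeholder entry; the real slur list is confidential

def naive_flag(post_text: str) -> bool:
    """Flag a post if any blocked term appears as a word, ignoring context or intent."""
    words = post_text.lower().split()
    return any(term in words for term in BLOCKED_TERMS)

# Critical or quoted uses are flagged the same as attacks, since no context is weighed:
commentary = "activists condemned the use of exampleslur against the community"
print(naive_flag(commentary))  # True: flagged despite the critical framing
```

A human reviewer or a context-aware model could distinguish condemnation from attack; a bare keyword match cannot.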
Platforms such as Facebook and Twitter remain a digital battleground where local users galvanize international and local support for the country’s anti-coup movement.
TAI’s representatives recommend that the board publish posts in full, with the consent of users, provide the content’s metadata, and hire translators well-versed in local context and internet culture, so that a human is involved in the review process.
The social network’s Oversight Board was created in May 2020. It consists of 20 journalists, politicians, and judges from around the globe. Endy Bayuni, a senior journalist from the English-language media outlet The Jakarta Post, covers Southeast Asia. Bayuni told KrASIA last October that the board takes on cases with “huge impact and real-world consequences” and that it must decide on cases within 90 days.
Initially proposed by Facebook founder Mark Zuckerberg in 2018, the board can reverse or uphold content takedown decisions made by moderators and executives of Facebook and Instagram. Direct messages on Instagram, Facebook Messenger, and WhatsApp are excluded from review. Users who disagree with the platform’s decisions can file an appeal to the board.
“The Oversight Board should encourage Facebook to implement a more sophisticated, proactive, and educational response with targeted sanctions incorporating international human rights standards on free expression,” said Oliver Spencer, a representative from Free Expression Myanmar, a local rights group advocating freedom of expression.