
Facebook Oversight Board

Facebook's Oversight Board has officially chimed in on its first five cases, and the rulings are certainly interesting. The Oversight Board chose to overturn Facebook's decision to remove content in four out of the five cases. As a result, Facebook must restore those four posts.

The most bizarre decision from the board involved a post that was flagged as anti-Muslim hate speech. A user from Myanmar posted a picture of a Syrian toddler who drowned while trying to reach Europe in 2015. Along with the photo, they included a comment which Facebook translated as "...something wrong with Muslims psychologically." While Facebook removed this post under its Hate Speech Community Standard, the board ruled to reverse this decision and restore the content. According to the board, its own translators claimed the phrase more accurately translated to "those male Muslims have something wrong in their mindset."

Experts have previously put some blame on Facebook for the spread of anti-Muslim rhetoric in Myanmar. However, according to the Oversight Board, "...while hate speech against Muslim minority groups is common and sometimes severe in Myanmar, statements referring to Muslims as mentally unwell or psychologically unstable are not a strong part of this rhetoric."

It seems like the Oversight Board ignores the photo of the child, other than to recognize it will be reposted with a warning label as per Facebook's Violent and Graphic Content Community Standard policy. When the text is put into context alongside the picture, the post does appear to be dehumanizing a group of people for the crime of... trying to escape the civil war in Syria and ISIS.

Eric Naing, a spokesperson for the civil rights group Muslim Advocates, provided an emailed statement on the ruling to Mashable: "Facebook's Oversight Board bent over backwards to excuse hate in Myanmar, a country where Facebook has been complicit in a genocide against Muslims. It's impossible to square Mark Zuckerberg's claim that Facebook does not profit from hate with the board's decision to protect a post showing images of a dead Muslim child with a caption stating that 'Muslims have something wrong in their mindset.' It is clear that the Oversight Board is here to launder responsibility for Zuckerberg and Sheryl Sandberg. Instead of taking meaningful action to curb dangerous hate speech on the platform, Facebook punted responsibility to a third-party board that used laughable technicalities to protect anti-Muslim hate content that contributes to genocide."

The other decisions made by the Oversight Board appear fairly straightforward. Take, for example, the one U.S.-based case the board reviewed. In October 2020, a user shared a quote falsely attributed to Nazi Germany's Minister of Propaganda Joseph Goebbels. However, the user argued that they posted the quote in order to criticize then-President Donald Trump, not to disseminate hateful material. The board ruled in favor of the user, ordering Facebook to restore the post. The decision was based mostly on two findings: the user was telling the truth about the quote being used to criticize Trump, not to promote a Nazi, and Facebook did not make its policies about who qualifies as a "dangerous individual" clear.

Another interesting piece of evidence the Oversight Board used: comments made on the post by the user's friends. According to the board, those comments made it clear that the quote was being used to criticize Trump. The board also admonished Facebook for not providing users with examples of what falls under its Dangerous Individuals and Organizations Community Standards policy. While the board's ruling can only make Facebook restore the post, it also suggested that Facebook update this policy so users know who and what is designated as "dangerous."

In addition to that case, the Oversight Board overturned a Facebook decision to remove a post in France that the company claimed fell under its COVID-19 misinformation policy. The board ruled that the post was more of a critique of government policy than a call for Facebook users to take a possibly harmful medication.

The Oversight Board also ruled on a case in which Facebook had already reversed itself: a Brazilian user's breast cancer awareness post that was removed from Instagram for showing female nipples. Although Facebook restored the image, which had been taken down by its automated moderation system, before the case made its way to the board, the Oversight Board still wanted to make a ruling on it. Oversight Board rulings provide the user with an explanation of what happened, which the board thought was important.










