Facebook unveils nudity, terrorism, revenge porn policy updates

Facebook has released new policy updates that govern terrorism, nudity, revenge porn and other objectionable content for ordinary users. Facebook does not want users posting or sharing naked butts or women’s nipples on the social network, according to its newly clarified Community Standards. The company also outlined what constitutes hate speech, revenge porn and terrorism. Facebook did this just days after Twitter announced its own anti-revenge porn campaign to stop revenge porn on its microblogging platform.

In a detailed explanation Monday, the company outlined its new Community Standards, which replace the previous version. The updated policy, at nearly 2,500 words, is almost three times longer than the old guidelines, the BBC reported. The guidelines are designed to respond to criticism the company has faced and to questions raised over how content on the site is moderated.

Image: Screenshot (Facebook)

Monika Bickert, Facebook’s head of global policy management, told the BBC that the rewrite of the policy was intended to address confusion about why some takedown requests were rejected. She stressed that the changes were meant as a clarification of the current guidelines, rather than a change in policy.

The company also banned “revenge porn,” or sexually explicit content posted without the subject’s permission. A Texas woman has sued Facebook for failing to delete falsified, lewd images of her after repeated requests. Google, Twitter and Reddit have also banned the sharing of sexual imagery without permission.

Members of the five independent organizations that comprise Facebook’s safety advisory board applauded the move, though with some reservations.

“I think it’s great that Facebook has revamped its community standards page to make it both more readable and accessible,” Family Online Safety Institute (FOSI) chief executive Stephen Balkam told the BBC. “I wish more social media sites and apps would follow suit.”

However, Balkam noted that the site still gives members no way to prevent young users from seeing graphic videos, which play automatically until Facebook receives a complaint. Only at that point can the company’s staff add an interstitial image warning.
