It appears Facebook’s internal rulebook on sex, violence and terrorism has been leaked to The Guardian.
The social media giant’s secret rules and guidelines for deciding what may and may not be posted by Facebook’s 2 billion users have been revealed for the first time. The investigation and the leaked documents will fuel a worldwide debate about the role and ethics of the social media heavyweight.
With the internet widely regarded as the last bastion of free speech and social freedom, Facebook is in a tough spot. It is effectively being asked to regulate speech across much of the Western world and many other countries. If it does nothing, Facebook will become a cesspool of moral filth, blamed for cyber-bullying, racism, stalking and every other form of hate people inflict on each other. At the same time, any censorship rule it makes will be hard to enforce and will surely be subject to intense scrutiny.
The leaked rules centre on whether to allow or reject content involving violence, racism, hate speech, terrorism, pornography and the streaming of self-harm.
In its investigation, The Guardian has seen more than 100 internal training manuals, flowcharts and spreadsheets that lay bare, for the first time, how Facebook moderates content for its 2 billion users.
These files give the first view of the codes and rules formulated by the site, which is under huge political pressure in the United States and Europe. They also show the challenges executives and moderators face in reacting to new phenomena such as so-called “revenge porn”. Moderators, meanwhile, are overwhelmed by the volume of work, often having just 10 seconds to decide whether a piece of content stays up.
One inside source even told The Guardian that Facebook cannot keep control of its content because it has grown too big, too quickly. Unlike Google, which uses algorithms and crawlers to identify low-quality or spam websites, Facebook still depends on human moderators to do the job. Many moderators have also raised concerns about the inconsistency and peculiar nature of some of the policies.
The same source said that sexual content, for example, is the most confusing and complex area. Under the violence policy, moderators attempt to distinguish between credible physical threats, self-harm and mere verbal threats.
Using thousands of pictures and slides, the company sets out guidelines that may worry critics who say the social media site is now a publisher and must do more to remove hurtful, hateful and violent content. On the other hand, these guidelines may also alarm free speech advocates concerned about Facebook’s role as a de facto global censor.
The investigation viewed documents relating to moderation policies from within the last year and found that remarks such as “to snap a hoe’s neck, make sure to apply pressure to the middle of the throat” were potentially permissible, as Facebook does not see them as credible threats.
On the other hand, a threat against a leader, president or any head of state should be deleted. Anyone who posts comments like “kill Barack Obama” or “Someone shoot fuc*&ng Trump” will and should be censored, as stated in the documents.
Violent videos, including those showing self-harm, will not always be deleted but can be marked as disturbing. Videos that raise awareness will not be deleted and can be shared.
Bullying videos can be permitted unless they are brutal or sadistic in nature.
Live-streamed self-harm is also permitted, because Facebook does not want to censor or punish people in distress.
Photos and videos of animal abuse can be shared, though extremely upsetting images will be marked as disturbing.
Handmade artwork showing nudity or sexual activity is permitted, but digitally made art showing sexual activity is not allowed.
Anyone with more than 100,000 followers on Facebook is classed as a public figure and is denied the full protection given to private users.
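Taken together, the leaked rules read like a rough decision procedure. A minimal sketch of how two of them might be encoded is below; the function names, the follower threshold constant and the simplified rule set are assumptions for illustration only — Facebook’s actual moderation is done by humans working from manuals, not by code like this.

```python
# Hypothetical sketch of two rules described in the leaked documents.
# All names here are invented for illustration.

FOLLOWER_THRESHOLD = 100_000  # figure cited in the leaked documents


def is_public_figure(followers: int) -> bool:
    """Users above the threshold lose the full protection given to private users."""
    return followers > FOLLOWER_THRESHOLD


def moderate_threat(text: str, targets_head_of_state: bool) -> str:
    """Per the documents: threats against heads of state are deleted,
    while generic violent venting is treated as non-credible and allowed."""
    if targets_head_of_state:
        return "delete"
    return "allow"  # non-credible expression of frustration


print(is_public_figure(250_000))                        # True
print(moderate_threat("kill Barack Obama", True))       # delete
print(moderate_threat("go fuck yourself and die", False))  # allow
```

The point of the sketch is how coarse such rules are in practice: a single follower count flips a user’s protections, and credibility hinges on who the target is rather than on the words themselves — which is exactly the inconsistency moderators complained about.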
In one of the leaked documents obtained by The Guardian, Facebook acknowledges that users employ violent language to express frustration. Communicating through a device rather than face to face creates a lack of empathy: users feel indifferent towards the person they are cursing and believe the issue will never come back to them.
Cursing and threatening out of frustration is simply an expression of emotion, not a transition to a plot or dangerous design. Expressions like “sod off”, “go fuck yourself and die” or “I’m going to fucking kill you” are not credible threats, just violent expressions of frustration and dislike. Facebook also said that not all disagreeable or disturbing content violates the community standards.
Monika Bickert, Facebook’s head of global policy management, told The Guardian it was hard to reach consensus on what should and should not be posted. Even once a line is drawn, grey areas remain.
As an example, the line between humour, satire and inappropriate content is sometimes blurred.
Marketers and Publishers
Many marketers, companies and publishers can’t live without Facebook. They may have their own website or blog, but Facebook lets them reach a far wider audience. Yet after reading The Guardian’s report, many said they were even more confused about whether they should continue using the service.
Some have stopped using the service altogether because of the rules and Facebook’s silence about how they are enforced. Many of the rules are hard to make sense of as a publisher or marketer, but if you’re looking for effective ways to connect with your Facebook audience, the leaked guidelines can help you navigate issues like sensationalism, clickbait and misinformation.
According to an article by Marketing Land, Facebook encourages marketers and publishers to focus on these points in what they publish. The content should be:
Respectful – Facebook understands that users hold different opinions, so it shields certain audiences from sensitive content such as violence, hate, racism, nudity and graphic images.
Informative – people value content that brings relevant, high-quality information to their lives, and each user has different content preferences.
Safe – distributed content should keep readers safe. Facebook works with law enforcement to disable accounts and remove content whenever threatening messages are sent or received.
Meaningful – the site’s ranking takes into account personal factors such as how close someone is to a person and how long a user reads a story or watches a video.
Authentic – real, authentic stories are the ones that resonate most deeply with readers. Authenticity also means clear headlines with no misleading expectations or sensationalism.
Accurate – over the years a great deal of fake news and misinformation has spread on Facebook, much of it going viral. This should be taken seriously, because spreading false information can lead to distrust, misled expectations or, worse, chaos.
In the long run, some experts believe Facebook may be better off taking a hands-off approach. It could claim to be like a phone company, not fully responsible for the content of any individual user. In essence, though, it is not a phone company, and there should be some form of accountability to remove or block trash content, fake accounts and anything that degrades a person’s humanity.
Whether you are a private user, public figure, marketer or publisher, it is worth weighing the leaked guidelines to get the best response from your audience and followers.