Facebook community standards: negotiating a tricky path
Anything that educates users and encourages greater online safety is to be welcomed. Facebook’s recently announced initiative to provide guidance on acceptable content is a further example of the move towards greater transparency.
We now know the criteria used to decide whether content is taken down or not. A lot still depends on reporting by individual users on this social media platform, particularly where personal privacy and safety are concerned, but these new guidelines suggest that a great deal of the regulation of the site is managed by human intervention.
Facebook’s new community standards are broken down into navigable sections rather than being presented as a single impenetrable screed. The standards use language that is accessible and without the all-too-common lawyers’ footprints on the text.
After a short introduction explaining the reason for the community standards policy, there are five sections:
* Keeping you safe
* Encouraging respectful behaviour
* Violence and graphic content
* Keeping your account and personal information secure
* Protecting your intellectual property
There then follows a section on reporting abuse and some guidance on controlling the content that you see. The section ‘Keeping you safe’ identifies the risks that users face, and this is a good starting point for deciding on what measures are needed to improve personal safety and to create an environment where users feel secure. What is refreshing is the empowerment of users: the suggestion that they are not just passive participants but that they have some responsibility for their own safety. They can change privacy settings and block content or material from unwanted sources. Previous attacks on public figures, as well as tragic cases of online bullying resulting in self-harm or suicide, have created an unwelcome atmosphere for many people, who consequently exclude themselves from social media.
Social networks such as Facebook offer enormous potential benefits to groups that would otherwise be marginalised or excluded. For instance, elderly parents separated from family, friends wanting to stay in touch with schoolmates, those who want to network with like-minded people, and patients who want to share experiences with others who have similar medical conditions are all groups that already benefit.
A recent analysis of social networks categorised the risks to users in terms of their consequences.
Facebook’s approach is a mixture of consequences, agents and activities – lacking coherence but instinctive and meaningful to many users. For instance, it distinguishes between risks to public figures and risks to private individuals, even though the risks they face may have similar consequences, such as intimidation or physical violence. It treats ‘Dangerous Organizations’ and ‘Regulated Goods’ as separate categories, and it identifies ‘Direct Threats’, ‘Self-Injury’, ‘Bullying and Harassment’, and ‘Sexual Violence and Exploitation’, which conform to the categorisation of risk by consequence to the user.
The reference to ‘Criminal Activity’ raises the question of ‘whose laws?’. The wording on criminal activity is interesting because it has a get-out clause to allow for the activities of (peaceful) protest groups that might otherwise contravene draconian national laws:
We prohibit the use of Facebook to facilitate or organize criminal activity that causes physical harm to people, businesses or animals, or financial damage to people or businesses.
So, for instance, one would expect that human rights activity promoting LGBT rights in Russia, which falls foul of the Russian offence of “promoting non-traditional sexual relations”, would not be taken down by Facebook. Similarly, promotion of multi-party democracy in one-party states could be seen as a criminal act, but might not fulfil the criteria for takedown in the way that violent revolution probably would.
This new policy demonstrates the importance of careful judgement in many cases and is a significant advance on the crude filters that social media platforms have previously deployed to block offending content.
It also puts out an important message that we, as users, have some power (and responsibility) to protect our own online safety. All in all this is a welcome development.