Facebook has taken a new approach towards making the platform better and more user-friendly by removing hate speech, violence and nudity from its website.
The social networking giant has published a massive new set of guidelines, spanning 27 pages, that details how its moderators evaluate such content. Facebook employs about 7,600 moderators who closely monitor what is posted on the site.
These new guidelines follow just over a month of criticism sparked by Facebook's admission that it had exposed millions of users' data to a third party, a revelation that prompted an extensive U.S. Senate hearing for Facebook founder Mark Zuckerberg.
The team has now issued a detailed description of the content that is considered inappropriate for the world’s most popular social media platform.
Type of Content That Gets Banned
According to the newly published guidelines, abusive comments, violent threats against a person or community, and content promoting terrorism will be removed from the site immediately.
The list is extensive, going on to outline the types of subjects not allowed on the site: poaching animals, instructions on how to inflict self-injury, hate speech, sexual content involving minors, buying or selling firearms and drugs, and plans to commit homicide.
This is the gist of what Facebook wants users to know; the finer points are spelled out across the 27-page set of guidelines.
Legal Action Against Select Comments
With over 7,600 moderators manually identifying and removing such comments to create a safe community for everyone else, the people doing this work also have the authority to inform law enforcement when required.
When people discuss committing homicide, making a bomb, or using drugs, moderators are expected to report them to the authorities so that harmful incidents can be prevented before they occur.
Facebook already had a detailed community standards page covering the topics mentioned above, but this is the first time the company has explicitly laid out all the information users should know.
Being a moderator can be an extremely difficult task, as not everything counts as nudity or violence: the social network still allows breasts to be displayed if they are depicted in a painting, and a news story about violence in a particular region is considered legitimate and will not be removed.
Quick Action Against Objectionable Content
Facebook's content moderators are spread across the globe, with thousands of them working around the clock. Their aim is to take down offending comments and content within 24 hours of being posted.
The process may sound tedious, but the company believes it is important to ensure obscene, racist, or provocative content does not stay on the site for long.
The newly introduced guidelines page has been translated into 40 languages so that Facebook's massive user base can read and understand it.
The team is also working to stop people from finding ways around the safeguards in place, and to ensure moderators get the support they need to keep the site free of hate speech and other illegal content.
Educating Users to Keep Facebook Clean
The ultimate goal of releasing these guidelines in such detail is to educate users about what is not allowed on the website. By making the policies clear, Facebook hopes to gradually reduce the amount of inappropriate comments and content posted on the platform.
In a blog post announcing the changes, Facebook’s Monika Bickert noted that the new community guidelines also offer an expanded appeals process if users think Facebook has made a mistake in removing their content.
Facebook has been at the center of some major data privacy scandals in the recent past and has had to answer to millions of users around the globe. The company also had to appear before federal authorities to show it is doing its part to keep the site clean, and these guidelines are expected to make things better for Facebook users.