Here’s How Facebook Decides What You Can and Can’t Post

Facebook has updated the policies that govern what users can and can’t post on the social network. While the update doesn’t change the rules themselves, it offers more clarity and specificity about what is and isn’t allowed. The update also provides some insight into how the company handles the imperfect process of moderating its community of more than 1 billion users worldwide.

Facebook clarified its community standards and its rules on the types of sharing that are allowed, the kinds of content that can be reported, and the sort of content that’s subject to removal by the social network. Facebook’s updated community standards are aimed at four objectives: “keeping you safe,” “encouraging respectful behavior,” “keeping your account and personal information secure,” and “protecting your intellectual property.”

While Facebook notes on the community standards page that the “conversations that happen on Facebook reflect the diversity of a community of more than one billion people,” Vindu Goel, writing for The New York Times, points out that the company walks a delicate line when it tries to ban violent or offensive content without suppressing the free sharing of information. Facebook’s user base is huge, with wide variances in age and cultural values, and is subject to different laws around the world. And while Facebook has published guidelines for all of those users, the reasoning behind the social network’s decision to block or allow content has often seemed opaque or inconsistent.

For example, Goel notes that the company flip-flopped on whether to allow the beheading videos that circulate with news stories of acts of terrorism and human rights abuses. It blocked a page in Russia that promoted an antigovernment protest, but allowed copycat pages to remain online. And while it allowed San Francisco’s drag queens to continue to use their stage names on the social network, it continued to enforce its real-name policy among other groups of users.

So a post on Facebook’s Newsroom, written by head of global policy management Monika Bickert and deputy general counsel Chris Sonderby, attempts to provide more clarity on what is and is not allowed on the social network. Bickert and Sonderby write, “While our policies and standards themselves are not changing, we have heard from people that it would be helpful to provide more clarity and examples, so we are doing so with today’s update.”

They note that the updated community standards provide more guidance on the company’s policies related to self-injury, dangerous organizations, bullying and harassment, criminal activity, sexual violence and exploitation, nudity, hate speech, violence, and graphic content. Facebook says that while some of the specific guidance is new, it is “consistent with how we’ve applied our standards in the past.”

Keeping you safe

The first principle of the social network’s community standards is to keep users safe. To that end, Facebook removes content, disables accounts, and works with law enforcement when it believes that there’s a genuine risk of physical harm or direct threats to public safety contained in a variety of types of abusive content, including:

  • Direct threats
  • Self-injury
  • Dangerous organizations
  • Bullying and harassment
  • Attacks on public figures
  • Criminal activity
  • Sexual violence and exploitation
  • Regulated goods

Threatening any individual or group with physical harm, or with financial harm such as theft or vandalism, is prohibited, as is posting content that promotes suicide, self-mutilation, eating disorders, or other forms of self-harm. Users also can’t bully or harass others, or target private individuals — defined as “people who have neither gained news attention nor the interest of the public, by way of their actions or public profession” — with the intention of degrading or shaming them. And while the social network permits open and critical discussion of people who are featured in the news or who have a large public audience, it removes all credible threats or hate speech directed at them.

Terrorist organizations like the Islamic State have always been banned from having a presence on the social network. The updated community standards also say that Facebook will remove content that supports terrorist or organized crime groups or condones their activity.

Facebook is for the first time explicitly banning content that promotes sexual violence or exploitation — which includes “solicitation of sexual material, any sexual content involving minors, threats to share intimate images, and offers of sexual services,” as well as so-called revenge porn — and removes media depicting incidents of sexual violence or images shared without permission from the people shown.

The company also specifies that users aren’t permitted to organize criminal activity through the social network or to use it to celebrate crimes they’ve committed. It prohibits attempts to purchase, sell, or trade prescription drugs and marijuana on its platform, and expects users who post offers to purchase or sell firearms, alcohol, tobacco, or adult products “to comply with all applicable laws and carefully consider the audience for that content.”

Encouraging respectful behavior

Facebook encourages people to use the social network to share their experiences and raise awareness about the issues that matter to them, and as such, its community standards warn users that they may encounter opinions different from their own. The social network aims to balance the safety and interests of the community by limiting the audience for, or removing, some types of sensitive content, including:

  • Nudity
  • Hate speech
  • Violence and graphic content

Facebook has always banned pornography and most forms of nudity, but is, for the first time, going into the specifics, though the updated community standards offer the caveat that “our policies can sometimes be more blunt than we would like and restrict content shared for legitimate purposes. We are always working to get better at evaluating this content and enforcing our standards.”

The social network says that it removes photographs of people displaying genitals or focusing in on fully exposed buttocks. It also restricts “some images” of female breasts if they include the nipple, but always allows images of women engaged in breastfeeding or showing post-mastectomy scarring. Explicit images of sexual intercourse are prohibited, and descriptions of sexual acts “that go into vivid detail” may be removed.

One category of content that Facebook has no tolerance for is hate speech. The social network removes hate speech that directly attacks people based on their race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity, or disability or disease, and organizations or people who promote hate against protected groups are not allowed on Facebook. While Facebook has always listed the categories of content that qualify as hate speech, the community standards now address in more detail its policy on satire and on using content from others to draw attention to an issue.

While Facebook permits users to share graphic and violent images when sharing their experiences or raising awareness of important issues, it removes graphic images “when they are shared for sadistic pleasure or to celebrate or glorify violence.” Additionally, the company asks users to warn their audience if they are sharing content with graphic violence.

Keeping your account and personal information secure

Facebook aims to keep users’ account data and personal information secure. It requires members to use their authentic names and identities and to refrain from posting other people’s personal information. This section of the community standards covers:

  • Using your authentic identity
  • Fraud and spam
  • Accounts of friends or family who have passed away

Facebook wants users to stand behind their opinions and actions with their authentic identities so that the “community is more accountable.” It will ask users with multiple profiles to close the additional profiles, and may remove profiles that impersonate other people.

The clarification that users can sign up with their “authentic identity,” or the name they choose to go by, seems aimed at clearing up some of the controversy caused by Facebook’s enforcement of its so-called “real name” policy. Bickert tells Re/Code, “There has been a lot of confusion from people who thought we were asking them to use what’s on their driver’s license,” and explains, “That’s not an accurate interpretation. We want people communicating using the name they actually use in real life.”

Facebook says that it investigates any suspected breach of security, and the updated community standards warn that it may refer attempts to compromise the security of a profile, including fraud, to law enforcement. Additionally, people aren’t allowed to use misleading information to collect likes, followers, or shares.

The update also clarifies what happens to a profile when a Facebook user passes away. The company will secure and memorialize accounts after receiving proof of death, and immediate family members can also request that the company remove and delete a loved one’s profile. This works in tandem with the recently added option for users to assign a legacy contact who will manage their account once they pass away.

Protecting your intellectual property

Facebook enables users to share the information that’s important to them. Users own all of the content that they post on the social network, and they can control how it’s shared through privacy and application settings. But before sharing information on Facebook, users need to ensure that they have the rights to do so, and make sure that they are respecting any copyrights, trademarks, or other legal rights. In its Help Center, the company provides users with information on how people and organizations can protect their intellectual property rights.

Bickert tells The New York Times that Facebook has no plans to automatically scan for and remove potentially offensive content. Instead, the company will still rely on users to report violations of the community standards. Facebook explains in its Newsroom post:

If people believe Pages, profiles or individual pieces of content violate our Community Standards, they can report it to us by clicking the “Report” link at the top, right-hand corner. Our reviewers look to the person reporting the content for information about why they think the content violates our standards. People can also unfollow, block or hide content and people they don’t want to see, or reach out to people who post things that they don’t like or disagree with.

Bickert says that Facebook has review teams working at all hours of the day around the world, and that every report is examined before a decision is made. The process typically takes about 48 hours on matters of safety, though Goel notes that that may not be fast enough in an era where graphic content can go viral in a matter of minutes.

Through the review process, Facebook aims to take into account the context of a post, and users can also appeal Facebook’s “rulings.” Bickert said that clarifying the rules helps both Facebook users and the reviewers who determine what’s permissible to post. “We can only do this if we have objective rules,” she explained. All reported items are reviewed the same way regardless of the number of people who report them.

Along with providing additional clarity on what users can share via the social network, Facebook also shared its Global Government Requests Report, which details the number of government requests to restrict content that contravenes local laws. Facebook says that it challenges requests “that appear to be unreasonable or overbroad.” Additionally, if a country’s government requests that the company remove content that is illegal in that country, Facebook will “not necessarily” remove it from the social network altogether, but may simply restrict access to it in the country where it’s illegal.

Facebook reports that in the second half of 2014, it restricted 9,707 pieces of content for violating local laws, up 11% over the first half of the year. India requested the most takedowns, at 5,832, and Turkey followed with 3,624. According to the report, no content was restricted in the United States based on government requests. The total number of government requests for account data increased slightly, to 35,051 compared with 34,946 in the first half of the year. The United States topped this list, logging 14,274 requests for information on 21,731 Facebook accounts. The company agreed to hand over information in 79% of the cases.

Sonderby said in a statement, “Moving forward, we will continue to scrutinize each government request and push back when we find deficiencies.” He added, “We will also continue to push governments around the world to reform their surveillance practices in a way that maintains the safety and security of their people while ensuring their rights and freedoms are protected.”
