Facebook bans more than 4,000 people and groups the company considers dangerous, including white supremacists, militarized social movements, and alleged terrorists.
The Intercept on Tuesday published a leaked list of dangerous individuals and organizations that Facebook doesn’t allow on its platform, revealing how the social network polices content that could lead to offline violence.
More than half the list consists of alleged foreign terrorists who are predominantly Middle Eastern, South Asian and Muslim. Experts told The Intercept that the list, along with Facebook’s policy, suggests the company imposes harsher restrictions on marginalized groups.
Facebook has a three-tier system that determines how strictly content related to a group is moderated. Terrorist groups, hate groups and criminal organizations fall under the most restrictive Tier 1. The least restrictive Tier 3 consists of militarized social movements, which The Intercept said are “mostly right-wing anti-American militias, who are almost entirely white.”
Brian Fishman, Facebook’s policy director for counterterrorism and dangerous organizations, said in a series of tweets that the version of the list published by The Intercept is not comprehensive. He added that the list is constantly updated.
“It is extremely difficult to define and identify dangerous organizations globally. There is no hard and fast definition agreed upon by everyone,” he said. Fishman also pointed out that terrorist groups such as ISIS and al-Qaeda comprise hundreds of individual entities, many of which are listed as separate entries to “facilitate enforcement,” which inflates the number of entries attributed to a particular region. He said the Tier 1 list includes more than 250 white supremacist organizations.
Facebook has faced pressure to be more transparent about its policy against dangerous individuals and organizations. In January, the Oversight Board, the body responsible for reviewing the social network’s content moderation decisions, reversed the removal of a post the company claimed violated this policy, noting that the rules were not clear enough to users. The board also recommended that Facebook publish a list, or examples, of dangerous organizations and individuals.
Fishman said Facebook did not share the list in order “to limit legal exposure, limit security risks, and reduce opportunities for groups to circumvent regulations,” but added that the company is trying to improve the policy.