Throughout 2016, Facebook took a hands-on approach to curating the digital environment in which its users interact. Although privacy groups have long fretted about how the site tracks users and sells that data to advertisers, in 2016 the company increasingly used its power to censor opinions and images it deemed undesirable.
Facebook content is monitored to maintain “community standards,” which, in principle, should allow for cultural diversity and encourage respectful discourse. “Our Community Standards aim to find the right balance between giving people a place to express themselves and promoting a welcoming and safe environment for everyone,” the company explains in a letter to users.
Images that don’t fit those standards can be removed at the site’s discretion, but the standards increasingly seem arbitrary. For example, one bug on the site locked out anyone who shared a particular image of a cat’s head photoshopped into a suit. Even sending the silly shot in a private message was enough to have a user’s page disabled.
Other bans point to problems with how the site treats controversial themes, and many worry about its tendency toward censorship. One of the most visible of these spats occurred after Norwegian author and journalist Tom Egeland was banned for posting a Pulitzer Prize-winning photograph of a young girl running naked from a napalm attack. Facebook said the famous image violated its “community standards,” but many in Norway saw the removal as an attempt to censor a historical image.
When a Norwegian newspaper published an editorial criticizing Facebook, the site banned the editor who wrote it for posting a link to the story on his personal Facebook page. It even deleted a post by the Norwegian prime minister defending the journalists.
“When Facebook removes an editorial from a Norwegian newspaper, it shows the online community a lack of respect for editorial freedom unlike anything I have ever seen,” said editor Gunnar Stavrum in a follow-up.
Still, the bans did not end there. In Sweden, pictures of a man who had suffered severe facial burns were removed. So was a post depicting topless Aboriginal women performing a traditional dance.
The site also censored a post by the French paper Le Monde featuring a woman having a mammogram, citing its anti-nudity rules. While the site later apologized and reposted the story, critics noted that the apology was, word for word, the same one offered for a series of other accidental deletions.
“The post was removed in error and restored as soon as we were able to investigate,” a Facebook spokesperson said. “Our team processes millions of reports each week, and we sometimes get things wrong. We’re very sorry about this mistake.”
The same apology was offered after Facebook disabled the accounts of two Palestinian journalists, after it blocked a post from a Black Lives Matter activist, and after protesters in North Dakota claimed that the site had cut off their live stream.
The site also removed posts of historical artworks, such as Gustave Courbet’s painting The Origin of the World and a sketch of a human hand by Hans Holbein. Facebook blamed human error, rather than an algorithm malfunction, for the removals.
Facebook also blocked entire pages. In February, it shut down the brand page for Viz, a British satirical comic magazine known for its often crude humor. The page was abruptly taken down for an unexplained breach of the rules.
“The question is what is, and isn’t acceptable to Facebook,” said Ian Westwood, group managing director for Viz’s publisher. “We have had correspondence with them before about stuff they haven’t liked and we’ve taken it down. This time they have just blocked the page and won’t tell us what we’ve violated.”
Viz took to Twitter with the news of its ban and raised enough attention that Facebook reinstated the page. Public outcry was often sufficient to force the company to change its mind and restore images that had been removed. Still, many free-speech advocates worry that the blocks have become too frequent.