Facebook may soon relax its censorship standards and display more graphic content, including violence and nudity, when it is deemed “newsworthy” or important enough to its millions of users. This loosening runs contrary to Facebook’s existing censorship policies. Even so, Facebook assures users that the intent is to allow images and stories that pose no safety risk and that shield minors and others who do not want to see graphic material, but that may be automatically and unintentionally censored by Facebook’s system. The move comes after Facebook faced widespread criticism for temporarily censoring Nick Ut’s famous “Napalm Girl” photo, which depicts a nude child fleeing a napalm strike during the Vietnam War. Facebook began censoring the photo after Espen Egil Hansen, a journalist and editor at the Norwegian newspaper Aftenposten, shared it with others and with the newspaper. Facebook eventually restored the photo and all of its shares after heavy media backlash made the takedown public.
A similar issue arose when Facebook temporarily took down a video showing the final moments of Philando Castile before he was shot by police. Facebook at first attributed the video’s disappearance to a “technical glitch,” but later admitted its automated censorship system had removed the video after detecting content that violated its standards and policies. The company also called the takedown a “miscategorization” caused by a wrongly applied algorithm. Then, on October 21, Facebook censored a Swedish breast cancer awareness video, though it has since apologized and restored it. Although Facebook has long pointed to its clearly defined censorship policy, these recent episodes of mis-censorship suggest the company may need to rethink and better define how that policy is applied. The new approach to content in its news feeds may be its response.
More recently, Facebook has been gathering feedback from users about what they do and do not want to see. This led Facebook to announce: “In the weeks ahead, we’re going to begin allowing more items that people find newsworthy, significant, or important to the public interest, even if they might otherwise violate our standards.” Facebook has repeatedly reminded users and the public that it is not a media company and therefore does not carry a media company’s editorial responsibilities, even when content offends viewers. It insists it operates as a technology platform that gives users what they want. While the company acknowledges the meaningful role it plays in media, it maintains that its responsibility is to remain a social media platform open to all ideas, not to be in the business of deciding which ideas people should read about.