Facebook COO Sheryl Sandberg has acknowledged that the company was wrong to delete posts showing an iconic image of a naked girl fleeing a napalm attack during the Vietnam War, according to a Reuters report. The admission came in a letter to Norwegian Prime Minister Erna Solberg, which Reuters obtained on Monday.
Facebook last week repeatedly deleted the Pulitzer Prize-winning photo, “The Terror of War,” on grounds that it violated its nudity restrictions.
The company’s reversal of course came in response to a global firestorm over its censorship. The controversy began when Norwegian author Tom Egeland posted the image as one of seven war-related photographs accompanying an entry on the history of warfare. The photo shows Phan Thi Kim Phuc, age 9 — naked because her clothes had been burned off her body — fleeing in terror with several other children.
The editor of Aftenposten, Norway’s largest daily newspaper, wrote an open letter to CEO Mark Zuckerberg, criticizing Facebook’s handling of the issue. Facebook’s Hamburg office had sent his publication an email demanding that it remove the photo from its page. Before the paper’s editor, Espen Egil Hansen, could respond, Facebook deleted the image and the accompanying article from the paper’s social media page.
Hansen blasted the action in his letter, telling Zuckerberg that Facebook, which functions as a powerful media platform, was engaging in censorship.
“I think you are abusing your power and I find it hard to believe that you have thought it through thoroughly,” Hansen wrote.
Support for the original post quickly became so widespread that Solberg reposted the image to her own page, and Facebook deleted that post as well.
“What Facebook does by removing images of this kind, good as the intentions may be, is to edit our common history,” Solberg wrote in a Facebook post. “I hope that Facebook uses this opportunity to review its editing policy, and assumes the responsibility a large company that manages a broad communications platform should take.”
Mounting Backlash
Before Facebook reversed course, free speech advocates, journalists and other critics called it out for ignoring the historical context of the photograph and relying far too heavily on its automated algorithms.
The image of a naked child normally would violate Facebook’s community standards and might be classified as pornographic in some countries, noted Facebook spokesperson Andrea Saul. However, the company realized after the outcry that the specific image in question had great historical significance.
“Because of its status as an iconic image of historical importance, the value of permitting sharing outweighs the value of protecting the community by removal, so we have decided to reinstate the image on Facebook where we are aware it has been removed,” Saul told TechNewsWorld.
Facebook would adjust its review mechanisms to permit sharing of the image going forward, she added.
The company will engage with publishers and other members of Facebook’s global community to improve its policies in order to promote free expression, Saul said.
Algorithm Overkill
The controversy follows a series of moves by Facebook and other social media companies to address the growing problems of online harassment, hate speech, and the use of social media for terrorist activities. Those companies have changed their service agreements and algorithms to monitor more closely what many consider abusive behavior online.
Facebook came under fire in May after a report accused the company of biasing trending topics against conservatives. In response, it made several changes to the way it curates content, in an effort to take the human bias out of the picture.
It’s possible Facebook went too far in relying on its technology to make common-sense decisions about the content appearing on its pages, with censorship of legitimate imagery and speech as an unintended consequence.
“I think it’s one more demonstration that Facebook has taken a pivotal role in publishing,” said Rick Edmonds, media business analyst at the Poynter Institute for Media Studies.
“Its standards bear scrutiny, whether they are algorithmic or not,” he told TechNewsWorld. “Human editors make blunders too, but this action is egregiously illogical, as the affected Norwegian journalists say.”