Privacy

Manipulative User Research Earns Facebook a Shiner

Once again, Facebook is embroiled in a controversy over privacy. This time, hackles have been raised by publication of a study for which the company manipulated the News Feeds of nearly 700,000 subscribers.

The study concluded that, yes, negative messages on social networks make people sad, and positive ones make them happy — and those feelings can spread through a social network to third parties.

Led by Adam Kramer of Facebook’s Core Data Science team, the study was published in Proceedings of the National Academy of Sciences of the United States of America, or PNAS.

Both Facebook and Kramer have issued statements in response to the outcry, which in turn have been criticized as being dismissive of the issues raised.

What the Fuss Is About

The experiment “manipulated the extent to which people were exposed to emotional expressions in their News Feed,” the published paper states.

Nearly 690,000 subscribers to the English-language version of the social network were selected randomly as guinea pigs without their knowledge, and divided into two groups.

One group had some negative posts omitted from their News Feeds, while the other had some positive posts cut out.

The experiments were conducted between January 11 and 18, 2012.

The researchers contended that they never saw any of the text, and that the project therefore was compliant with Facebook’s Data Use Policy, to which all users must agree as a prerequisite to creating a Facebook account.

Agreement to the policy constitutes informed consent for the research, they argued.

“Facebook’s ToS, like those of most Internet companies, protects Facebook but in no way provides the level of informed consent that is expected when doing research with human subjects,” John Simpson, privacy advocate at Consumer Watchdog, told TechNewsWorld.

It’s All Right, Ma, I’m Only Bleeding

“Facebook and Google look at people only in the abstract,” Rob Enderle, principal analyst at the Enderle Group, told TechNewsWorld.

“They don’t ask for permission because they don’t care — any more than they’d poll ants before they drove over an ant hill,” he observed.

“Sleazy, unethical behavior is nothing new for Facebook, so I’m not surprised they would do this,” Simpson fumed. “The academic researchers involved with the project and the National Academy of Sciences, which published the results, should be ashamed of themselves.”

The study’s editor, Princeton University psychology professor Susan Fiske, reportedly had concerns that were allayed when the authors said their local institutional review board had approved it on the grounds that Facebook manipulates subscribers’ News Feeds all the time.

It Wasn’t All That Bad: Kramer

However, study lead Kramer did not mention an IRB in his response to the outcry.

Kramer said he and “several other” researchers at Facebook “have begun working on improving our internal review practices.”

Those practices “will also incorporate what we’ve learned from the reaction to this paper,” he added.

“This is lame back-pedaling that completely avoids the fact that they violated basic research ethics for dealing with human subjects,” Simpson said.

As for the dropped posts, “they just didn’t show up on some loads of Feed,” Kramer noted. “Those posts were always visible on friends’ timelines and could have shown up on subsequent News Feed loads.”

That’s “just noise,” Enderle scoffed. “The deep issue for Facebook is that it exists because people believe it will help them make and manage friends, but it exists to take monetary advantage of weaknesses.”

So Whaddayado?

Neither Kramer nor Facebook has apologized for not first obtaining their subjects’ consent for the study.

“Facebook do what they want and what is expedient until their fingers are caught in the cookie jar,” Simpson said.

“I don’t get why anybody would be surprised at this latest hullaballoo,” Laura DiDio, principal at ITIC, told TechNewsWorld.

“There is no such thing as privacy online,” DiDio explained. “Once you put it out there, it’s out there.”

Further, Facebook’s privacy rules “have been a moving target from Day One.”

Richard Adhikari

Richard Adhikari has written about high-tech for leading industry publications since the 1990s and wonders where it's all leading to. Will implanted RFID chips in humans be the Mark of the Beast? Will nanotech solve our coming food crisis? Does Sturgeon's Law still hold true?
