Facebook on Thursday announced it had developed a framework for conducting research on its 1.3 billion or so users.
Although Facebook so far has revealed only the general outlines, the framework clearly is a response to the onslaught of criticism the company received this summer, when it blithely reported the findings of a study on how News Feed content affected users' moods.
In carrying out that research, Facebook withheld certain posts and promoted others to see how users would react.
When its methodology became public, reactions were immediate and harsh.
“While all businesses of this scale constantly experiment with the factors that influence their customers and users, there’s something especially spooky about the idea of being experimented on in our digital lives,” said Will McInnes, CMO of Brandwatch.
Facebook apparently was caught off guard by the vitriol.
A Look at the Framework
The new framework includes giving researchers clearer guidelines and, in certain cases — such as when dealing with content that might be considered deeply personal — putting the project through an enhanced review process before the research begins.
Further review is required if the work involves collaboration with someone in the academic community.
Toward that end, Facebook has created a panel composed of its most senior subject-area researchers, along with people from its engineering, research, legal, privacy and policy teams, to review projects.
Facebook also will put its new engineers through a six-week training boot camp on privacy- and research-related issues.
Informed Consent?
Facebook’s new guidelines appear to be missing some fundamental ingredients, starting with actual policies on what will or won’t be permissible.
For instance, it is unclear whether Facebook would repeat this summer’s study under the new guidelines.
The company has not exactly said that it shouldn’t have tinkered with users’ News Feeds — just that it should have considered other, perhaps nonexperimental, ways to conduct its research. Facebook acknowledged that its study would have benefited from review by a more senior group of people. It also owned up to having failed to communicate the purpose of its research.
Facebook has not promised to inform users the next time it conducts a research project, noted Lance Strate, professor of communications and media studies at Fordham University.
“Instead, Facebook is in effect saying, ‘I’m sorry, I made a mistake, I won’t do it again, I can change, I promise — just trust me,’ while giving their users absolutely no concrete reason why they should be trusted,” he told TechNewsWorld.
The irony is that Americans usually are very willing to participate in consumer research and divulge all sorts of personal, private information in focus groups, interviews, surveys and opinion polls, as long as they are first asked whether they are willing to take part in the study, Strate pointed out.
Indeed, asking permission to conduct such studies goes beyond privacy and business ethics to common courtesy and basic human decency, he said. “It’s the sort of thing we teach children long before they enter kindergarten — to ask for permission, to say, ‘Mother, may I’ and ‘please’ and ‘thank you.'”
Facebook’s apparent sense of entitlement regarding the collection of user data and the violation of user privacy is one reason for the extraordinary amount of buzz surrounding the launch of Ello as an alternative social network, Strate added.
That thought surely has occurred to Facebook's executive team, and it might have been one factor behind the release of the guidelines, McInnes told TechNewsWorld.
“Facebook’s greatest fear and business risk is a user exodus, and so it knows that the trust of users is crucial,” he said. “This move represents Facebook stepping up and looking to close down the risk of such a backlash again.”