“This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated,” Sandberg, Facebook’s chief operating officer, told the Wall Street Journal while traveling in New Delhi. “And for that communication we apologize. We never meant to upset you.”
Sandberg’s statement was the first public comment by a Facebook executive on the controversy since it erupted over the weekend, prompting anger from many Facebook users and criticism from some academics who said it was unethical to manipulate users’ emotions without informed consent.
In the study, researchers at Facebook tweaked what hundreds of thousands of users saw in their news feeds, skewing content to be more positive or negative than normal in an attempt to manipulate their moods. Then they checked users’ status updates to see if the content affected what they wrote. They found that, yes, Facebook users’ moods are affected by what they see in their news feeds. Users who saw more negative posts would write more negative things on their own walls, and likewise for positive posts.
Sandberg’s apology is not likely to appease some, such as Robert Klitzman, a psychiatrist and ethics professor critical of the study, who said in a column for CNN that “the problem is not only how the study was described, but how it was conducted.”
It seems that until now, Facebook data scientists have been pretty much free to do as they please.  “There’s no review process, per se,” Andrew Ledvina, who worked at Facebook as a data scientist from 2012 to 2013, told the Journal. “Anyone on that team could run a test,” he said. “They’re always trying to alter people’s behavior.” Ledvina told the Journal that tests were so frequent that some data scientists worried that the same users might be used in different studies, tainting the results.
Facebook has since implemented stricter guidelines, the Journal reported. Research other than routine product testing is reviewed by a panel of 50 internal experts in fields such as privacy and data security. Company research intended for publication in academic journals goes through a second round of review, again by in-house experts.
The upset over Facebook’s mood study is “a glimpse into a wide-ranging practice,” Kate Crawford, a visiting professor at the Massachusetts Institute of Technology’s Center for Civic Media and a principal researcher at Microsoft Research, told the Journal. Companies “really do see users as a willing experimental test bed” to be used at the companies’ discretion.
Plenty of companies may do this sort of testing. But Facebook is different, John Gapper argued in the Financial Times. Here’s why, he said:
  • “Facebook holds more intimate information about its users than other internet companies.”
  • Unlike testing products to see what appeals to users, which many companies do, with Facebook, “we are the product” being tested.
  • “Facebook wields incredible power over the behavior of users. This is partly because of its size.” He points to another Facebook study of 235 million users, noting that the sample size is four times the population of France.
  • Facebook “focuses its judgments on personal material,” unlike Google, which uses algorithms to analyze material across the Web. “An algorithm that selects from thousands of links about, say, Buckingham Palace feels like a service; one that weeds out the posts of friends and family feels like a moral guardian.”
  • “Facebook has demonstrated that it can alter behavior,” he writes, citing one study showing that users who see more status updates write more themselves and another that encouraged users to become organ donors by allowing existing donors to display that status.
(Washington Post)