When Facebook performed an experiment to see if it could secretly affect users' emotions, were you really surprised?
Academics were galled, knowing they would need explicit permission from research subjects and strict ethical oversight to perform such an experiment. A privacy group filed a complaint with the Federal Trade Commission, the journal that published the study later expressed concern, and some European nations began an investigation.
But many Facebook users responded with a shrug, having long accepted that targeted ads and extensive data collection are permanent features of life online.
In the 21st century, user participation has come to equal user consent, a social contract governed by massive terms-of-service agreements that few users fully read or understand.
However, that social contract has come under scrutiny this week as news spread of Facebook's emotion-manipulation experiment in 2012, which was carried out on nearly 700,000 users without their knowledge.
Facebook data scientist Adam D.I. Kramer, who would later say he was concerned that users might leave the network if using Facebook made them sad, carried out the weeklong experiment with input from two outside academic researchers from Cornell University.
The researchers wanted to see whether emotions were contagious on Facebook. If users saw a greater proportion of happy statuses from friends in their newsfeeds, would they feel happier? And if they saw more sad updates, would they be sadder?
Kramer and the Cornell researchers, Jamie E. Guillory and Jeffrey T. Hancock, found that emotions did indeed appear to be contagious on Facebook.
Since then, Facebook and the researchers have been barraged with criticism. The editor of the Proceedings of the National Academy of Sciences, which published the study in June, noted that researchers' failure to obtain participants' informed consent was "a matter of concern."
Facebook officials and other industry insiders argue that Facebook has a right to conduct testing on its own service to improve its product, and that users implicitly agree to the testing when they accept the company's sweeping terms of service.
"This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated," Sheryl Sandberg, Facebook's chief operating officer, told the Wall Street Journal. "And for that communication we apologize. We never meant to upset you."
Similar large-scale user testing goes on all the time, internally, as a normal course of business, officials said. "They're always trying to alter people's behavior," one former Facebook Data Science team member told the Journal this week.
© 2014 Los Angeles Times (CA) under contract with NewsEdge. All rights reserved.