Facebook has been playing with its users' emotions, and now a lot of people are upset. For one week in 2012, hundreds of thousands of Facebook users were unknowingly subjected to an experiment in which their news feeds were altered to see whether certain kinds of content made them happier or sadder.
The research that resulted from that experiment, which was published in an academic journal this month, said emotions appeared to be contagious: If users saw happier posts from friends in their Facebook news feed, they were more likely to post their own happy updates. Sad updates appeared to have a comparable effect.
In other words, the study seems to show you are what you eat, as the saying goes -- except in that metaphor, you usually get to choose what you put in your mouth.
Now, Facebook, which uses a secret algorithm to control what it shows users on its popular news feed, faces another round of allegations that the world's largest social-media network is being a little too creepy and manipulative.
After the study started to receive widespread scrutiny on the Web, Adam D.I. Kramer, a data scientist at Facebook and one of the study's authors, wrote in a post Sunday: "In hindsight, the research benefits of the paper may not have justified all of this anxiety."
Kramer added that he and the paper's coauthors were "very sorry for the way the paper described the research and any anxiety it caused."
The other two researchers involved were Jamie E. Guillory of UC San Francisco and Jeffrey T. Hancock of Cornell University.
The research, which was published in the June 17 issue of Proceedings of the National Academy of Sciences, has also drawn complaints that the academics involved strayed from typical standards for participant consent, especially in a study that could have had negative effects.
The research paper's editor, Susan T. Fiske of Princeton, told the Los Angeles Times that the authors said the research had been approved by Cornell's Institutional Review Board on the grounds that Facebook had already performed the study. However, she said, she did not confirm that approval herself. A spokesman for Cornell could not immediately confirm or deny its review board's involvement late Sunday.
Such boards, which are common at many universities, exist to ensure that researchers don't harm their subjects and that they obtain what's known as "informed consent" from them.
The Facebook study's researchers apparently relied on Facebook's sweeping terms of service and data use policy -- which combine for more than 13,000 words and mention the word "research" exactly twice -- to enroll 689,003 users in the experiment. (continued...)
© 2014 Los Angeles Times (CA) under contract with NewsEdge. All rights reserved.