How are you feeling? Facebook wants to know. Wait, scratch that -- Facebook may already know. A just-published report about a 2012 study suddenly has the Internet buzzing with concern, once again, over Facebook policies.
This week's debate relates to the secretive social experiment Facebook conducted on a random selection of 689,003 of its one billion-plus users. According to an article published June 17, 2014, in the Proceedings of the National Academy of Sciences, researchers from Facebook and Cornell University were testing whether certain emotions could be manipulated and would then spread among people without face-to-face contact.
As part of the experiment, the number of positive and negative posts that Facebook users saw in their News Feeds was artificially altered without their knowledge in January 2012. In the end, the researchers found that users who were shown fewer positive words went on to write more negative posts, while those who were exposed to fewer negative terms ultimately shared more positive posts.
You Said We Could
The authors of the study were able to conduct the research because, they said, automated testing “was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.”
But many are saying that the problem with gaining user consent this way is in the cursory look users give the privacy policies of Web sites, if they look at all. Even users who take the time to read Facebook’s user agreement might not understand what they’re signing up for, according to Susan Etlinger, an industry analyst with the Silicon Valley-based Altimeter Group.
“Facebook has been making changes to its timeline algorithm since it began, and will continue to do so,” says Etlinger. “What makes this different from other areas that are covered in Facebook’s terms and conditions is that it’s not made clear that users will be part of a behavior experiment.”
A Bit of Backtracking
Facebook has shown at least hints of contrition since the results of the study spurred such a strong reaction this week. Adam Kramer, a Facebook data scientist who was among the study’s authors, wrote on his Facebook page earlier this week that the team was “very sorry for the way the paper described the research and any anxiety it caused.”
The main problem with the Facebook experiment, say observers, is that it exposes the notoriously weak form of consent that exists in many online transactions. The nearly unanimous uproar this week over the secretive experiment highlights the importance of getting truly informed consent for such transactions.
The experiment was made to look even more suspicious when it was reported earlier this week that four months after the study, Facebook made changes to its data use policy that could seem designed to cover its tracks in case of the kind of negative reaction the study has received. The new clause allowed “(f)or internal operations, including troubleshooting, data analysis, testing, research and service improvement.”
Whether the uproar leads to stronger regulation of online user agreements -- or just prompts users to read such agreements more carefully -- the industry will be paying attention.
“This is big because it’s a matter of social data ethics,” says Etlinger. “I think it’s good that there’s an outcry over this. It sets a dangerous precedent.”
Posted: 2014-07-16 @ 2:04pm PT
"Founded in 2004, Facebook’s mission is to give people the power to share and make the world more open and connected."
As a healthcare provider, I see many people with mobility issues who are unable to leave their homes or hospital rooms, so they use Facebook to connect with the world. It makes me sick to know that an additional layer of negativity may have been forced upon them by something they "liked".
It's unlikely Facebook can stop being evil, so they will inevitably fall.
Posted: 2014-07-02 @ 7:33am PT
I think Facebook is trying to stay in the news as much as it can. There are so many other social networks popping up that Facebook is just trying to hold on to power like the rest of the giants. I try to limit my usage of these sites; I use http://LookSeek.com, a non-tracking search engine, for privacy.
Posted: 2014-07-02 @ 6:46am PT
@Elizabeth: What's sad is that you could possibly believe that 9/11 and the Boston bombing were staged by the US government. Unfortunately, terrorism is a real and present danger, not just something the US government cooked up.
Posted: 2014-07-02 @ 2:28am PT
This is the worst kind of invasion of privacy and extremely unethical. It's bad enough that we live in a country that claims to be the land of the free, and yet we US citizens can't and won't see the truth: our own government has brainwashed the American people into believing that foreigners have waged terrorist attacks on us, like 9/11 and the Boston bombing, when it's our own government doing it and blaming others, and the proof is all over YouTube if we would just open our eyes to the enormous abundance of evidence. We claim to be the richest, most powerful and most free country, yet in reality we are the most hypocritical, lying, thieving and murderous country there is. We stick our noses in everyone else's business when we can't even solve our own problems here first. When social media plays god, what do we have left?