A study not to ‘like’
Facebook’s study on emotional contagion may not have broken laws, but it has exposed the unfettered power of big data companies grounded in opaque user policies.
For one week in 2012, researchers from Facebook, Cornell and the University of California skewed the emotional content of almost 700,000 news feeds to test how users would react. They found that people would write slightly more negative posts when exposed to negative feeds and vice versa. News of the study spread on the Internet on Monday, angering users who thought Facebook had treated them as “lab rats” and sparking European legal probes. Facebook executive Sheryl Sandberg eventually apologized for “poorly” communicating the study, but Facebook stood firm. “When someone signs up for Facebook, we’ve always asked permission to use their information,” the company said in a statement. “To suggest we conducted any corporate research without permission is complete fiction.”
Facebook is half right. Users agree to terms and conditions when they join the social network. In-house experiments, called “A/B testing,” are routine, too. They observe how users react to small changes in format and content, such as a bigger icon or a different shade of blue. The purpose is to improve user experience on the site.
But this crossed an important line: Unlike typical A/B testing, Facebook tried to directly influence emotions, not behaviors. Its purpose was not to improve user experience but rather to publish a study.
Almost all academic research requires informed consent from participants, which Facebook assumed from acceptance of its terms of service. Yet Facebook’s data-use policy at the time of the study did not explicitly state that data would be used for “research.” This means the company likely justified the study under one of its broad provisions. To truly consent, a user would have had to read tens of thousands of words of the agreement, then hypothesize about its possible interpretations. This practice is very different from the offline standard, under which subjects must understand the full risks and benefits of a study and have the option to decline. Federally funded research institutions are required to follow these rules; many others follow them voluntarily for ethical reasons.
Recent lawsuits against Facebook and Google — including the European Court of Justice’s ruling in favor of a “right to be forgotten” — focus on the ownership and use of companies’ existing stores of data. This study reveals a new arena, in which users are manipulated to create new data for companies beyond their narrow commercial purposes.
While Facebook has implemented internal review mechanisms since the study, the underlying problem remains: permission still rests on ineffectual terms-of-service agreements. Users do not know what to expect from services, and companies push to the limit because they know users won’t drop out.
President Barack Obama’s 2012 proposal for a “Consumer Privacy Bill of Rights” and the 2014 “Big Data” report have failed to produce much progress on transparency. This Facebook study should prompt a resumption of debate in and out of government on how to manage big-data practices.