Is Facebook Using Us As Guinea Pigs?
Well, it’s happened. Those tiny words jumbled together in the terms-of-service section that nobody reads have given none other than the largest internet company in the world (in terms of data transfer and human reach) the right to conduct experiments on us. We are, of course, talking about our friendly social networking site, Facebook.
Facebook conducted a massive psychological experiment on nearly 700,000 unwitting users in 2012, and the activity has just come to light, causing a furor across the internet.
To determine whether it could alter the emotional state of its users and prompt them to post either more positive or negative content, the site’s data scientists enabled an algorithm, for one week, to automatically omit content that contained words associated with either positive or negative emotions from the central news feeds of 689,003 users.
The research, published in the March issue of the Proceedings of the National Academy of Sciences, sparked a different emotion—outrage—among some people who say Facebook toyed with its users’ emotions and used members as guinea pigs.
“What many of us feared is already a reality: Facebook is using us as lab rats, and not just to figure out which ads we’ll respond to but to actually change our emotions,” wrote Animalnewyork.com in a blog post that drew attention to the study Friday morning.
Facebook has long run social experiments. Its Data Science Team is tasked with turning the reams of information created by the more than 800 million people who log on every day into usable scientific research.
On Sunday, the Facebook data scientist who led the study in question, Adam Kramer, said he was having second thoughts about this particular project. “In hindsight, the research benefits of the paper may not have justified all of this anxiety,” he wrote on his Facebook page.
A prominent consumer privacy group is filing a complaint with the Federal Trade Commission claiming that Facebook’s users were deceived by the mood experiments conducted by the social network in 2012.
“We think the Facebook study is a deceptive trade practice which is subject to investigation by the FTC,” said Marc Rotenberg, executive director of the Electronic Privacy Information Center.
In addition, the group is asking Facebook to make the News Feed algorithm public so users fully understand why they’re being shown certain posts. “There should be no more secret manipulation of Internet users,” Mr. Rotenberg said.
Facebook is under fire for conducting a mood experiment on 700,000 users in 2012, in which it manipulated the amount of negative or positive content in News Feeds to determine what impact, if any, the change had on subsequent posts by those users. The experiment came to light late last week when the results were published in an academic journal.
Facebook COO Sheryl Sandberg apologized Wednesday for the experiment. “This was part of ongoing research companies [advertisers] do to test different products, and that was what it was; it was poorly communicated,” she said while in New Delhi, according to the Wall Street Journal. “And for that communication, we apologize. We never meant to upset you.”
Facebook is already under a 20-year consent decree as part of a 2011 settlement with the FTC over its privacy practices.
“If these are material terms that users would not expect by signing up for Facebook, then they should be disclosed in a clear and prominent manner, using language consumers are likely to understand,” she said.
The FTC does not comment on prospective or ongoing investigations. A Facebook spokesperson was not immediately available.
Facebook’s current Data Use Policy does disclose that user data can be used for research, but as Forbes pointed out, that “research” caveat was added in May 2012, four months after the study took place.