FOX31 Denver

Internet outraged by Facebook’s ‘creepy’ experiment on users

HONG KONG — Everyone has a bad day on occasion. But what if Facebook made it worse — on purpose, and without telling you?

Internet users have reacted angrily to news that Facebook researchers manipulated the content some users were shown in an attempt to gauge their emotional response.

For one week in early 2012, Facebook changed the content mix in the News Feeds of almost 690,000 users. Some people were shown a higher number of positive posts, while others were shown more negative posts.

The results of the experiment, conducted by researchers from Cornell, the University of California, San Francisco, and Facebook, were published this month in the prestigious academic journal Proceedings of the National Academy of Sciences.

The study found that users who were shown more negative content were slightly more likely to produce negative posts of their own. Users in the positive group responded with more upbeat posts.

“The short version is, Facebook has the ability to make you feel good or bad, just by tweaking what shows up in your news feed,” Forbes reported.

So it worked! Facebook was able to change the emotional state of its users. While the mood changes were small, the researchers argued that the findings have major implications given the scale of the social network.

LINK: View full study at PNAS.org

While users may not have been aware they were part of the experiment, this sort of test is allowed under the terms and conditions all Facebook users must agree to. Those terms permit Facebook to use data for “internal operations, including troubleshooting, data analysis, testing, research and service improvement.”

“I wonder if Facebook KILLED anyone with their emotion manipulation stunt,” privacy activist Lauren Weinstein said on Twitter. “At their scale and with depressed people out there, it’s possible.”

Facebook uses an algorithm to determine which of roughly 1,500 available posts will show up in a user’s News Feed. The company frequently changes this program to modify the mix of news, personal stories and advertisements seen by users.
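To make the mechanics concrete, here is a minimal sketch in Python of the kind of sentiment-based filtering the study describes. The word lists, scoring rule, and omission rate below are invented for illustration; the actual study used the LIWC word-counting software to label posts and ran on Facebook’s production ranking system, not anything this simple.

```python
import random

# Toy word lists standing in for a real sentiment lexicon.
# (The study used the LIWC tool; these sets are illustrative only.)
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "terrible", "hate", "awful", "angry"}

def classify(post: str) -> str:
    """Label a post positive, negative, or neutral by word matching."""
    words = set(post.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def build_feed(candidate_posts, condition, omit_rate=0.3, seed=None):
    """Assemble a News Feed-style list from candidate posts.

    condition is "reduce_positive" or "reduce_negative": posts of the
    targeted sentiment are randomly withheld at omit_rate, loosely
    mirroring the experiment's design of omitting some emotional posts.
    """
    rng = random.Random(seed)
    target = "positive" if condition == "reduce_positive" else "negative"
    feed = []
    for post in candidate_posts:
        if classify(post) == target and rng.random() < omit_rate:
            continue  # withhold this post from the user's feed
        feed.append(post)
    return feed

posts = [
    "Feeling happy today, great news!",
    "This traffic is terrible and I am angry.",
    "Lunch was fine.",
]
print(build_feed(posts, "reduce_negative", seed=42))
```

In this toy version, changing the condition or the omission rate changes the emotional mix a user sees, which is the lever the researchers pulled at scale.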

The Facebook researcher who designed the experiment, Adam D. I. Kramer, said in a post Sunday that the research was part of an effort to improve the service — not upset users.

“I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused,” Kramer wrote. “In hindsight, the research benefits of the paper may not have justified all of this anxiety.”

A Facebook spokesman said the company frequently does research to “improve our services and to make the content people see on Facebook as relevant and engaging as possible.”

“We carefully consider what research we do and have a strong internal review process,” the spokesman said in a statement. “There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”

Given the company’s terms of service, it does not appear that Facebook faces any legal consequences. But the guinea-pig nature of the experiment, and the decision to carry it out without the explicit consent of participants, raises ethical questions.

Susan Fiske, the Princeton professor who edited the research, said that while the research was “inventive and useful,” the outcry suggests that maybe it shouldn’t have been carried out.

“I was concerned,” she told The Atlantic, “until I queried the authors and they said their local institutional review board had approved it — and apparently on the grounds that Facebook apparently manipulates people’s News Feeds all the time… I understand why people have concerns. I think their beef is with Facebook, really, not the research.”

CNN contributed to this report.