Facebook experiment on users' emotions sparks outrage

Last Updated: Monday, June 30, 2014, 3:36 PM EDT

Are you angry that Facebook conducted experiments on users’ emotions by tweaking their news feeds?

This poll is not scientific and reflects the opinions of only those internet users who have chosen to participate.

Furor has erupted in recent days over news that Facebook conducted a psychological experiment on its users by manipulating their emotions without their knowledge.

According to a study published in the prestigious academic journal Proceedings of the National Academy of Sciences, researchers probed the feelings of 689,003 randomly selected Facebook users by changing the contents of their news feeds.

According to CNN and other media outlets, the study has sparked outrage among Facebook users.

During one week in January 2012, researchers from Cornell University, the University of California, San Francisco, and Facebook staged two parallel experiments, reducing the number of positive or negative updates in each group's news feed.

The study found that users who were shown more negative content were slightly more likely to produce negative posts, while users in the positive group responded with more upbeat posts.

In short, Facebook was able to manipulate the emotional state of its users. While the mood changes were small, the findings could have major implications given the scale of the social network.

"When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks," the authors of the paper said.

Facebook apparently did not violate any laws. When users sign up for the network, they agree to give up their data for analysis, testing and research.

But it isn't the research itself that people are upset about. It's the manipulation of their feeds without users' consent or knowledge.

Facebook uses an algorithm to determine which of approximately 1,500 available posts will show up in a user's news feed. The company changes this program to modify the mix of news, personal stories and advertisements seen by users.

Information from CNN.com, the Los Angeles Times and Mashable.com was included in this report.