A study detailing how Facebook secretly manipulated the News Feeds of about 700,000 users to study "emotional contagion" has stirred an outcry on the world's largest social network.
An article in the journal Proceedings of the National Academy of Sciences of the United States of America details Facebook's experiment, in which the company changed the algorithm used to place posts into users' News Feeds in order to study how the change affected their mood.
The study was conducted in early 2012 by researchers affiliated with Facebook, Cornell University and the University of California, San Francisco, reports said.
The research aimed to find out whether the number of positive or negative words in the messages users read affected the content of their own status updates.
It found that when the researchers manipulated users' feeds to reduce the positive expressions displayed by others, "people produced fewer positive posts and more negative posts", and vice versa.
The experiment has drawn criticism from around the world.
On Twitter, the hashtag #Facebookexperiment saw heavy activity, with many tweets describing the move as "creepy", "super disturbing", "cruel" and even "evil".
However, Facebook data scientist Adam Kramer said in a post, "The goal of all of our research at Facebook is to learn how to provide a better service."
He clarified that the research was carried out by very minimally "deprioritizing a small percentage of content in News Feed" for a group of people (about 0.04 per cent of users).
"Nobody's posts were "hidden," they just didn't show up on some loads of Feed. Those posts were always visible on friends' timelines, and could have shown up on subsequent News Feed loads," he said.
He added that he understood "why some people have concerns about it" and apologised for the way the paper described the research and for any anxiety it caused.
"In hindsight, the research benefits of the paper may not have justified all of this anxiety," he said.