Alokananda Chakraborty: Much ado about nothing

Facebook's tinkering with user data for a social experiment has raised the hackles of its users. The bigger question: were the findings worth the effort?


Alokananda Chakraborty
Whatever Facebook does is news - such is its following. Mostly, it does stuff that becomes industry standard. On other occasions, what it does is just lame.

I am talking about the huge furore online right now over Facebook's manipulation of user feeds two years ago as part of a research project. Here's a little background. The hullabaloo started with a recent report in New Scientist, a weekly non-peer-reviewed international science magazine, which said that in January 2012, the social network altered the number of positive and negative posts that almost 700,000 randomly selected users saw in their feeds of articles and photos. According to the study, Facebook manipulated the feeds to find out whether online emotions are contagious - that is, how online messages influence readers' "experience of emotions," which may, in turn, affect their offline behaviour.

For a week, some users were shown fewer posts containing positive words, while others saw fewer posts with negative sentiments. The outcome of the study, published on June 17 in the Proceedings of the National Academy of Sciences, showed that people exposed to fewer positive words tended to write more negative posts, while the opposite happened with users who were exposed to fewer negative terms.
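
Since the paper's method is easy to misread, here is a minimal sketch - not Facebook's actual code, which has never been published - of the two moving parts the study describes: a crude word-count sentiment score and a feed filter that withholds a fraction of posts of one polarity. The word lists, function names and the omit_fraction parameter are all illustrative assumptions; the real study relied on the LIWC word lists.

```python
import random
import re

# Illustrative word lists. The actual study used the LIWC dictionary,
# which classifies thousands of English words as positive or negative.
POSITIVE = {"happy", "great", "love", "wonderful", "fun"}
NEGATIVE = {"sad", "angry", "terrible", "hate", "awful"}

def post_sentiment(text):
    """Crude bag-of-words score: +1 per positive word, -1 per negative word."""
    words = re.findall(r"[a-z']+", text.lower())
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def filter_feed(posts, suppress="positive", omit_fraction=0.3, seed=1):
    """Withhold a random fraction of posts of one polarity, mimicking the
    reduced-exposure design the paper describes."""
    rng = random.Random(seed)
    kept = []
    for post in posts:
        score = post_sentiment(post)
        is_target = score > 0 if suppress == "positive" else score < 0
        if is_target and rng.random() < omit_fraction:
            continue  # this post never reaches the user's feed
        kept.append(post)
    return kept

feed = ["What a wonderful day", "I hate queues", "Lunch happened"]
print(filter_feed(feed, suppress="positive"))  # the positive post may be dropped
```

The researchers then scored what the affected users themselves went on to write - which is where, as we will see, the method runs into trouble.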

The whole thing seems to have blown up in Facebook's face, with users turning to every other social networking platform, including Twitter, to vent their anger. Almost every post I have read treats the research as a breach of privacy. Such is the outrage that a Facebook researcher involved in the project has had to tender a public apology. Adam Kramer, a Facebook data scientist and one of the authors of the study, wrote on his Facebook page on Sunday that the team was "very sorry for the way the paper described the research and any anxiety it caused."

Facebook users' ire is understandable. A New York Times post rightly points out that "academic protocols generally call for getting people's consent before psychological research is conducted on them"; Facebook didn't ask for explicit permission from those it selected for the experiment.

That said, there are two things to note here. First, this isn't the first time the Menlo Park, California-based company has used the data available to it for a "social experiment". Nor is it the only one tinkering with such data. Google and Yahoo! routinely track how people interact with search results or news articles to modify what is shown to them. All of them say this improves the user experience, makes the site more engaging and so on.

And if you are not already aware, Facebook has done this before. In 2012, MIT Technology Review reported that Mark Zuckerberg himself used Facebook user data for some personal experiments. That report suggested Zuckerberg tapped the social influence of Facebook to boost the number of registrations for organ donation. It was a clever move - users were given the option to click a box on their Timeline pages to indicate whether they were registered donors. That click would send a notification to the people on their "Friends" list. The article goes on to indicate that this feature created a sort of social pressure to register as an organ donor, and enrolment numbers zoomed. But in my opinion, that move was even spookier than the latest research.

Second, the site has argued in its defence that users' concerns about privacy are misplaced. The company has said none of the data in the study was associated with a specific person's account. The study was undertaken to make the site's content more alluring and relevant, and, in any case, when users sign up for Facebook and agree to its terms of service, they consent to this kind of manipulation. That argument has not been accepted, but that's a different story.

Now come to the research itself and you will see why I call this study lame. Academics have trashed it - as you can read in the many blog posts on the study - on several grounds. For instance, Tal Yarkoni, a psychology research associate at the University of Texas at Austin, says that the fact that users in the experiment ended up producing content that was slightly more positive or slightly more negative doesn't mean those users actually felt differently. "It's entirely possible - and I would argue, even probable - that much of the effect was driven by changes in the expression of ideas or feelings that were already on users' minds," he writes.

Some others have questioned the methodology. Dr John Grohol, founder of the psychology site Psych Central, says the study doesn't really measure the moods it proposes to capture. If the researchers wanted to make the study rigorous, they would have had to go to Facebook users and have them fill out a proper questionnaire. "Instead the authors were making strange judgement calls based on content of status updates to predict a user mood," he says, adding that one needs a different tool or survey to accurately gauge something as complex as an emotional state.
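
Grohol's objection is easy to demonstrate with the kind of bag-of-words scorer sketched earlier - again a deliberately naive illustration, not the study's actual LIWC pipeline. Negation, sarcasm and mixed feelings all defeat a simple word count:

```python
import re

POSITIVE = {"happy", "great", "love"}
NEGATIVE = {"sad", "hate", "awful"}

def post_sentiment(text):
    """Same crude scorer as before: positive words minus negative words."""
    words = re.findall(r"[a-z']+", text.lower())
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# Each of these updates gets a score that plainly misstates the writer's mood.
for text in ("I am not happy about this",                 # negation: scores +1
             "oh great, another day of rain and queues",  # sarcasm: scores +1
             "sad film, great evening with friends"):     # mixed: nets to 0
    print(f"{post_sentiment(text):+d}  {text}")
```

A scorer like this can shift aggregate word counts across 700,000 feeds without telling us anything reliable about any individual's emotional state - which is precisely the complaint Yarkoni and Grohol are making.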

In other words, so much for nothing!

Disclaimer: These are personal views of the writer. They do not necessarily reflect the opinion of www.business-standard.com or the Business Standard newspaper

First Published: Jun 30 2014 | 9:46 PM IST
