Facebook Manipulated 700,000 News Feeds for Emotion Experiment

Social networking website Facebook has come under criticism for playing with its users' emotions after conducting a psychological experiment to see whether people reacted differently when exposed to primarily negative or positive posts from their Facebook friends.

The news came on Saturday evening, when New Scientist magazine disclosed that Mark Zuckerberg's company had manipulated the News Feeds of nearly 700,000 users to serve them primarily positive or negative content from their friends.

The site then measured whether those users themselves became more positive or negative, based on the language of their own status updates and posts to the site.
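The measurement itself rested on simple text analysis: classifying each post by counting positive and negative emotion words. As an illustration only, a minimal word-list sentiment score might look like the Python sketch below; the word lists, function name, and example posts are hypothetical stand-ins, not the lexicon or code the researchers actually used.

```python
# Illustrative sketch: word-list sentiment scoring, in the general
# spirit of the study's text analysis. The tiny word sets below are
# hypothetical examples, not the researchers' actual lexicon.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "worried"}

def emotion_score(post: str) -> float:
    """Return (positive - negative) word count, normalized by length."""
    words = post.lower().split()
    if not words:
        return 0.0
    pos = sum(w.strip(".,!?") in POSITIVE_WORDS for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE_WORDS for w in words)
    return (pos - neg) / len(words)

# Example: compare a user's average score across two sets of posts.
before = ["feeling great today", "love this weather"]
after = ["so sad and worried", "terrible news everywhere"]
avg = lambda posts: sum(map(emotion_score, posts)) / len(posts)
print(avg(before), avg(after))  # positive average vs. negative average
```

A shift in that average between posts written before and after exposure to a filtered feed is the kind of signal the study reported as evidence of emotional contagion.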

The move, however, has been slammed by privacy campaigners and even those involved in the experiment. Susan Fiske, the Princeton University psychology professor who edited the study, expressed her own doubts:

“I was concerned until I queried the authors and they said their local institutional review board had approved it–and apparently on the grounds that Facebook apparently manipulates people’s News Feeds all the time… I understand why people have concerns. I think their beef is with Facebook, really, not the research.”

Following the outrage over the incident, which many report could have caused depression or other psychological harm to the study's subjects, Facebook data scientist Adam Kramer said: “Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.”

The research paper concluded: “emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.”

Critics argue that while people are accustomed to being manipulated by advertising and the like, this kind of undisclosed experimental manipulation is relatively new, and that companies such as Facebook should be held to the same ethical standards as government agencies and universities when conducting experiments on human subjects.
