Facebook has drawn anger from members of its social network after a study tied to its users' news feeds came to light.
Earlier this month, the Proceedings of the National Academy of Sciences (PNAS) published a study on how social networks affect users' emotions, conducted by a Facebook data scientist and two other researchers from the University of California and Cornell University.
The study involved manipulating the News Feeds of roughly 689,000 users to determine whether positive or negative content would affect their emotions and subsequent Facebook updates.
The experiment, carried out over one week in 2012, explored "whether exposure to emotional content led people to post content that was consistent with the exposure—thereby testing whether exposure to verbal affective expressions leads to similar verbal expressions, a form of emotional contagion," reads an excerpt from the study.
The study found that users with less positive content in their news feeds used more negative words in their status updates, and vice versa.
Facebook's experiment has fueled criticism from users and experts who question the ethics of the research.
"The unwitting participants in the Facebook study were told (seemingly by their friends) for a week either that the world was a dark and cheerless place or that it was a saccharine paradise," says University of Maryland law professor James Grimmelmann in a blog post. "That's psychological manipulation, even when it's carried out automatically."
Susan Fiske, an editor on the study, tells The Guardian she had concerns about Facebook's research. "People are supposed to be told they are going to be participants in research and then agree to it and have the option not to agree to it without penalty."
Adam Kramer, the Facebook data scientist who worked on the study, apologized in a post on Facebook for the anxiety the study caused users. "I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety."
The post drew responses from some Facebook users questioning how the social network seeks consent for research such as the news feed study.