Facebook Enters Ethical Gray Area With News Feed Experiment

Source: Getty Images

About two years ago, Facebook (NASDAQ:FB) conducted an experiment to see whether what your News Feed showed you could affect your emotions, both online and off. It turns out that it can — and users aren’t happy about it.

New Scientist was the first to report that a team of researchers led by Facebook data scientist Adam Kramer manipulated the posts that would appear in the News Feeds of 689,003 Facebook users to find out whether emotions spread through the social network. Some users saw fewer posts with negative words, while others saw fewer posts with positive words. The results showed that emotions were “contagious”: people who had seen fewer negative posts were more likely to use positive words, and vice versa. The study briefly sums up its findings and their significance:

We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness … In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks. This work also suggests that, in contrast to prevailing assumptions, in-person interaction and nonverbal cues are not strictly necessary for emotional contagion, and that the observation of others’ positive experiences constitutes a positive experience for people.

As the world’s largest social network, with 1.23 billion monthly active users, Facebook has a huge repository of user data and an unprecedented reach into users’ daily lives. While users are technically giving the okay to participate in this type of experiment when they agree to Facebook’s terms and conditions, many aren’t happy about it. The revelation of the study and its unapologetic manipulation of users’ emotions and behavior has compounded public concern over the social network’s treatment of user privacy, sparking outcry against Facebook’s use of its data for its own experiments.

Users seem particularly angered by the fact that Facebook is legally allowed to experiment with users’ behaviors and emotions. Every user, by agreeing to Facebook’s terms and conditions to create a profile on the network, has agreed to share his or her data and has consented to the use of his or her profile for such social experiments. In this case, the experiment shows that Facebook can, in fact, influence users’ behavior and emotions by manipulating the version of the product that they’re using.

A post by Kramer, the lead data scientist on the experiment, attempts to explain the scale of the experiment and the motivation behind it. The experiment affected 1 in 2,500 users, who, for a one-week period, didn’t see certain posts on some loads of their News Feed. What displays in the News Feed is controlled by a constantly changing algorithm that determines which statuses, stories, photos, and activities will show up on a given load. What the News Feed doesn’t do is display a linear chronology of everything that a user’s friends post.

Kramer says that the goal of the research was to make Facebook “a better service” for its users. He writes that the experiment was conducted in early 2012, and that the team’s internal review practices “have come a long way since then.” However, Kramer adds, “In hindsight, the research benefits of the paper may not have justified all of this anxiety.”

The effect of the study was relatively small in statistical terms. But the paper ascribes weighty significance to it, introducing the idea that the effect of social media on users’ emotions falls under “public health.” The paper reads: “The well-documented connection between emotions and physical well-being suggests the importance of these findings for public health. Online messages influence our experience of emotions, which may affect a variety of offline behaviors. And after all, an effect size of d = 0.001 at Facebook’s scale is not negligible: In early 2013, this would have corresponded to hundreds of thousands of emotion expressions in status updates per day.”

While some users and privacy advocates maintain that the study could have had extreme consequences for the emotions and actions of unsuspecting users, much of the outcry over the experiment stems from the idea that Facebook simply went too far by manipulating users’ emotions. The intimation is that users are more comfortable — or at least less upset — with Facebook manipulating where they click or which ads they see than with that influence extending to aspects of their offline lives, like their emotions.

While the study was (almost) certainly legal, whether it was ethical is far less certain. The Atlantic reports that Susan Fiske, a Princeton professor who edited the paper for publication, said that an institutional review board was consulted only about the study’s data analysis, not its methods of data collection. Fiske told The Atlantic, “I was concerned until I queried the authors and they said their local institutional review board had approved it — and apparently on the grounds that Facebook apparently manipulates people’s News Feeds all the time.”

Fiske also added: “It’s ethically okay from the regulations perspective, but ethics are kind of social decisions. There’s not an absolute answer. And so the level of outrage that appears to be happening suggests that maybe it shouldn’t have been done … I’m still thinking about it and I’m a little creeped out, too.”

The New York Times points out that other Internet companies, like Google (NASDAQ:GOOG) (NASDAQ:GOOGL) and Yahoo (NASDAQ:YHOO), monitor how users interact with search results or content, and then make adjustments to improve the user experience. The article adds, “But Facebook’s most recent test did not appear to have such a beneficial purpose.” There’s a difference not only in the motivation but also in the methodology, as the study didn’t obtain consent for what many in the academic community and in the general public would consider psychological research. The New York Times reports that the News Feed algorithm typically chooses from about 1,500 statuses, photos, and stories to display 300, and Facebook regularly solicits feedback from users on how the News Feed could be improved. However, the study falls outside the realm of the regular tweaks to the algorithm. It also doesn’t conform to sites’ routine practice of making changes in response to users’ behavior; instead, the changes were made just to see how users would respond.

The ethics of the research certainly lie in a gray area, and while the study may cause Facebook to lose a few users in the short term, the public outcry is unlikely to change the extent to which Facebook or any other company with access to similar user data will put that data to use. Anyone who uses the Internet has either come to terms with, or will have to resign himself or herself to, the fact that the information we share with services like Facebook can and will be put to use. Sometimes that means that sites will seek to manipulate where you click or which content you see, and other times it will mean that a site will seek to influence your emotions — or at least prove that it can.

Perhaps people are also unsettled, in some measure, by the idea that emotions are “contagious,” even over social media — that they can, largely subconsciously and even automatically, mirror the mood that’s most widely represented in their Facebook News Feeds. But that has just as much to do with human emotion as with a social network’s experiments with user data. We’ll all have to come to terms with the idea that the algorithm that chooses what to show us in its summary of our friends’ posts and lives now has the same ability to influence how we’re feeling (in real life) that grabbing a coffee with one of those friends has always had.
