Are Facebook’s Unannounced Experiments Improving Ethically?


Following the publication of a Facebook data scientist’s study on whether the emotional manipulation of users’ News Feeds could affect their behavior and the emotional content of their Facebook posts — and the public outcry and PR disaster that ensued — the social networking giant has amended the policies that govern the research it conducts on its users. The company is implementing a formal review process for proposed research projects, and will also collect all of its published research on one central website.

Contrary to Sheryl Sandberg’s statement this summer — in which she apologized only for the poor communication about the purpose of the study, not for its methods or the study itself — Facebook’s latest statement offers something of an apology for the way the company experimented on users without their consent or knowledge. A news release on Facebook’s site, titled simply “Research at Facebook,” explained that the company is now aware that “We should have considered other non-experimental ways to do this research. The research would also have benefited from more extensive review by a wider and more senior group of people. Last, in releasing the study, we failed to communicate clearly why and how we did it.”

The post also includes the line: “It is clear now that there are things we should have done differently.” So, will Facebook do things differently when it studies users’ behavior in the future? To find out, let’s look at what Facebook has said it will do to ensure that research is conducted ethically, and how its intentions measure up against more traditionally ethical and accepted methods of conducting behavioral research.

What are Facebook’s new guidelines?

Mike Schroepfer, Facebook’s chief technology officer, wrote in the news release on Facebook’s website that the company is updating the way that it conducts research. He wrote that after considering its research methods for the past three months, the company is implementing “a new framework that covers both internal work and research that might be published,” complete with “clearer guidelines” for researchers and a panel that will “review projects.”

“Guidelines: we’ve given researchers clearer guidelines. If proposed work is focused on studying particular groups or populations (such as people of a certain age) or if it relates to content that may be considered deeply personal (such as emotions) it will go through an enhanced review process before research can begin. The guidelines also require further review if the work involves a collaboration with someone in the academic community.

Review: we’ve created a panel including our most senior subject-area researchers, along with people from our engineering, research, legal, privacy, and policy teams, that will review projects falling within these guidelines. This is in addition to our existing privacy cross-functional review for products and research.

Training: we’ve incorporated education on our research practices into Facebook’s six-week training program, called bootcamp, that new engineers go through, as well as training for others doing research. We’ll also include a section on research in the annual privacy and security training that is required of everyone at Facebook.

Research website: our published academic research is now available at a single location and will be updated regularly.”

For now, while published research will be available at Research.Facebook.com, Facebook has not offered any further information on how it evaluates potential research projects, or exactly what kind of research its new panel will allow employees to conduct. The “User Experience” section of the research website notes that research topics in the area include “Human-computer interaction,” “Computer-mediated communication,” “Personal influence and diffusion of innovations,” “Social networks,” “Computer-supported collaboration,” “Digital identity and self-presentation,” and, notably, “Compassion and emotion in online communication.” The site does not indicate whether the company’s research on any of these topics involves experiments with actual Facebook users.

What are the standards of academic research?

One of the most important ethical rules for research involving humans is that participants give informed consent before taking part in a study. The U.S. government, through the Office for Human Research Protections (OHRP), requires that researchers at federally funded institutions who conduct any research on human subjects provide participants with specific information about the study before the research commences, including “a statement that the study involves research, an explanation of the purposes of the research and the expected duration of the subject’s participation, a description of the procedures to be followed, and identification of any procedures which are experimental.”

Participants should also be provided with information such as a description of foreseeable risks, an explanation of whom to contact with questions pertinent to the research, and a statement that participation is voluntary and that participants may end their involvement with the research at any time.

Another important component of assuring that research on humans is conducted ethically is each research plan’s review by an Institutional Review Board. An IRB is a panel that reviews plans for research involving humans before a study can begin. According to the OHRP, the IRB determines not only that risks are minimized and reasonable in relation to the benefits expected from the study, but that subjects are fairly selected, their privacy and data are protected, and that informed consent will be sought and documented. The IRB has the authority to approve, require modification to, or disapprove research plans.

In the case of cooperative research projects — those “which involve more than one institution” — each institution is responsible for safeguarding the rights and welfare of subjects and for complying with all of the OHRP’s research policies. With the approval of the department or agency, the institutions can arrange for a joint review to avoid duplicating efforts, but each remains responsible for compliance.

In 1979 — admittedly, well before the advent of the technology that would eventually make Facebook’s research efforts possible — the ethical principles that now underlie the federal guidelines for biomedical and behavioral research on humans were summarized in a document called the Belmont Report. The basic ethical principles cited were “Respect for Persons,” “Beneficence,” and “Justice,” and those principles were to be applied through informed consent, assessment of risks and benefits, and fair selection of subjects.

Additionally, the ethical principles and code of conduct of the American Psychological Association specify that for studies where deception is necessary, researchers should “explain any deception … as early as is feasible, preferably at the conclusion of their participation, but no later than at the conclusion of the data collection, and permit participants to withdraw their data.”

Where do Facebook’s policies deviate from academic standards?

As The New York Times noted, Facebook’s announcement of its new research policies is short on details. None of the actual guidelines have been disclosed, and the company has not explained whether it intends to ask Facebook users for their consent before conducting another study like the News Feed experiment. While Schroepfer says that the social network has “taken to heart the comments and criticism” on its research methods, there’s just enough information in Facebook’s blog post to make it clear that the new policies won’t be enough to satisfy the academics and consumer advocates who have called Facebook’s ethics into question.

While Facebook is training its engineers on the ethics of research, it’s unclear how extensive that training will be or what it will cover, though The Times learned from a person involved in drafting the new policy that the training teaches researchers and engineers to ask important questions before conducting experiments, such as: “What are the benefits of this test? Who will be affected? How would you feel if you were a subject of the experiment? Is there a better way to achieve the same result?”

While the decision to create a review panel “including our most senior subject-area researchers, along with people from our engineering, research, legal, privacy, and policy teams, that will review projects” seems to be in the spirit of an institutional review board, it’s unclear how large or how diverse this team will be.

The OHRP specifies that an IRB should have at least five members who are diverse in professional background, race, and gender, and knowledgeable about the institution’s professional and legal commitments, regulations, and standards of conduct. The New York Times notes that Facebook is not inviting anyone outside the company to objectively review its research projects, a solution that some have advocated as a way to ensure the ethics of Facebook’s experiments.

To be clear, legal experts have noted that Facebook is a private company, and therefore not subject to the regulations that federally funded research or academic institutions must follow. But the ethical guidelines to which these institutions are subject are useful as an ethical yardstick of sorts to evaluate whether Facebook’s new guidelines go far enough to protect users and assure the proper use of their data. While the OHRP policies are generally intended to govern clinical research, a 2012 order by the Federal Trade Commission directly targeted Facebook — and required that it obtain consumers’ consent before sharing their data in ways beyond what is provided for in its privacy policy.

To find where things get sticky with Facebook’s past experiments, you don’t have to look far into the study that spurred mass outcry over the summer and led to Facebook’s revision of its research policy. The Washington Post reported that in the case of the News Feed study, which involved a Cornell University professor and a former Cornell doctoral student, Cornell’s independent ethics board — the Cornell University Institutional Review Board — did not pre-approve the study.

Instead, the experiment was conducted before the university’s IRB was consulted. At the time of the study’s publication, it wasn’t entirely clear what role Cornell and the affiliated professor and former doctoral student played in the research. But in the fallout after the study was published in the June 17 issue of the Proceedings of the National Academy of Sciences, the university distanced itself from the research, noting that Professor Jeffrey Hancock and Dr. Jamie Guillory, who co-authored the study along with Facebook data scientist Adam D.I. Kramer, did not participate in the collection of data, had no access to user data, and therefore were not directly engaged in human research.

If they had been, the study would have been subject to review by the Cornell Human Research Protection Program, which operates in concert with the university’s IRBs and cites the ethical principles of the Belmont Report as key to the review process that must take place for “all research that involves human participants, regardless of the source of financial support.”

Facebook’s researchers claimed that the terms and conditions that users agree to when they sign up for the site were equivalent to the informed consent that participants of a traditional academic or clinical study knowingly give — even though clauses referencing “research” were not added until after the study was conducted in January 2012, and participants weren’t notified of the research before or even after the study concluded. In fact, Facebook still hasn’t revealed which users were involved in the study, or if minors were included.

Is Facebook doing enough to protect users and win back their trust?

Or, in other words, is the social network’s News Feed experiment an ethics violation it can’t bounce back from? Not necessarily, but Facebook hasn’t done enough to assure users and academics that it will play by the ethical rules with future research. Jeffrey Hancock, the Cornell professor who was a co-author on the News Feed study, told The New York Times that he considers it important to know what standards Facebook will use to judge the ethics of proposed research projects.

While Hancock is pleased that Facebook is creating internal review panels, he thinks that the company needs to publicly share its guidelines, both to get feedback and to properly disclose the kind of research that it will conduct. It’s unclear whether Facebook will continue to conduct research similar to the emotional manipulation study and simply not publish the results, or whether the newly created review panel will reconsider that type of research. Hancock also notes that Facebook has not yet made any mention of informed consent or participant debriefing in explaining what the new research policy covers.

It’s interesting to note that while Facebook’s blog post specifies that, “The guidelines also require further review if the work involves a collaboration with someone in the academic community,” Schroepfer notes later in the post that, “It’s important to engage with the academic community and publish in peer-reviewed journals, to share technology inventions and because online services such as Facebook can help us understand more about how the world works.”

That makes it sound like Facebook wants to play by the rules of academic research, but its policies will likely need further revision for an academic institution to partner with the social network in studying its users’ behavior in the future.

To some, the key to understanding why Facebook conducts its experiments the way it does is to stop measuring Facebook’s efforts against the standards of academic research. Those who defend the principle of Facebook’s research efforts — if not the exact method of the News Feed study — argue aptly that experimentation on users is a routine part of almost every online business.

The results of Facebook’s data analysis made it past the Cornell IRB, which concluded that the research did not need to be evaluated by the Cornell Human Research Protection Program, perhaps neglecting to take into account the methods that Facebook used to obtain the data. The Washington Post points out that if Facebook were a government agency, or a federally funded institution, the approval could be considered an ethical lapse. But because Facebook is neither of those — and is, in fact, a private company that isn’t legally bound by the ethical standards for academic research — Facebook is technically just another web company running A/B tests on its users without their knowledge and without the approval of an outside regulatory body.
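To make that comparison concrete, here is a minimal sketch, in Python, of how a web company might run such an A/B test: each user is deterministically bucketed into a control or treatment group by hashing their ID, and an outcome metric is then compared across groups. The user IDs, experiment name, and engagement numbers below are invented for illustration; this is a generic pattern, not Facebook’s actual code or methodology.

```python
import hashlib
import random
from statistics import mean

def assign_variant(user_id: str, experiment: str, variants=("control", "treatment")) -> str:
    """Deterministically bucket a user into a variant by hashing their ID.

    The same user always lands in the same bucket for a given experiment,
    which is how typical web A/B tests run without asking for consent.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Hypothetical example: simulate an engagement metric for two variants
# and compare the group means. A real experiment would use far more users
# and a proper statistical test.
random.seed(0)
results = {"control": [], "treatment": []}
for i in range(10_000):
    variant = assign_variant(f"user-{i}", "newsfeed-ranking-v2")
    # Invented effect: the treatment nudges engagement slightly upward.
    engagement = random.gauss(1.0, 0.3) + (0.05 if variant == "treatment" else 0.0)
    results[variant].append(engagement)

for variant, values in results.items():
    print(f"{variant}: n={len(values)}, mean engagement={mean(values):.3f}")
```

Deterministic hashing is a common design choice because it keeps each user’s experience consistent across sessions without storing an explicit assignment, which is part of why such tests can run invisibly to the people in them.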

As Brian Fung points out in The Post, it may very well be Facebook’s presentation of its results as science that prompts us to ask whether the work upholds the standards of academic research — clearly, it doesn’t — when we could instead consider it merely a commercial A/B test of the kind most companies choose not to make public, much less present as social science.

But it’s undeniable that even Internet users who are well aware of how constantly their data is collected and used were less than happy to learn that Facebook intentionally manipulated users’ emotions and failed to tell them about it, either before or after the fact.

In a way, Facebook has two choices: be transparent about the experiments it conducts on users, informing people and getting their consent, or forgo publishing social science experiments and keep all research internal, where users won’t know whether their News Feeds are being manipulated in ways that deviate from the company’s usual algorithm experiments. Either way, the academic community and the public will need a significantly more detailed press release to trust Facebook’s ethics again — even with their expectations appropriately adjusted by the knowledge that Facebook isn’t legally obligated to comply with the same ethical principles as a university or a hospital, even when it comes to research on real, human Facebook users.
