In the fallout after Facebook’s infamous experiment to manipulate the emotional content of users’ News Feeds, another tech company, dating site OkCupid, has admitted that it also conducted psychological experiments on its users without their knowledge or explicit consent.
In a post titled “We Experiment on Human Beings!” Christian Rudder, one of OkCupid’s founders, took to the company’s OkTrends blog to divulge several experiments that the site had conducted on users without their knowledge. Rudder broke the news with a “what’s the big deal” tone toward the recent public outcry over the revelation of similar experiments, like Facebook’s News Feed experiment to find out whether emotions are “contagious” via social media. (It turns out that they are.)
Like Facebook’s, the experiments Rudder described were psychological in nature. The company variously analyzed the conversations that resulted when it turned off photos on the site for a day, temporarily changed its rating system, and suggested that users date partners who weren’t a good match to see how the relationships turned out. Rudder wrote dismissively of concerns that such experiments were unethical, positing that experiments on users are ubiquitous, even necessary, as companies fine-tune their websites.
“We noticed recently that people didn’t like it when Facebook ‘experimented’ with their news feed. Even the FTC is getting involved. But guess what, everybody: if you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site. That’s how websites work.”
Rudder noted that sites experiment on users to sort out which of their ideas are good, which are bad, and which could be improved with more refinement. He admitted glibly that OkCupid “doesn’t really know what it’s doing,” and that’s why experiments are a necessary part of improving the site:
“I’m the first to admit it: we might be popular, we might create a lot of great relationships, we might blah blah blah. But OkCupid doesn’t really know what it’s doing. Neither does any other website. It’s not like people have been building these things for very long, or you can go look up a blueprint or something. Most ideas are bad. Even good ideas could be better. Experiments are how you sort all this out.”
In the first experiment, the company removed all users’ profile photos from OkCupid on the launch day of a separate blind date app (which ultimately failed to take off). The site analyzed how users interacted without photos and found that all of its metrics, such as the number of new conversations started, went down. But compared with a “normal Tuesday,” users responded to first messages 44 percent more often and exchanged contact details more quickly. “In short,” Rudder writes, “OkCupid worked better.” When the photos were turned back on, the conversations that had started blind “melted away.” Combining that data with findings from the blind date app, Rudder concludes: “People are exactly as shallow as their technology allows them to be.”
The second experiment saw OkCupid amend its rating system, replacing the two original scales, Looks and Personality, with a single one. “We ran a direct experiment to confirm our hunch — that people just look at the picture,” Rudder writes. The site took a “small” sample of users and, half of the time it displayed their profiles, hid their profile text. The idea was to generate two separate scores: one based on the photo and text together, and one for the photo alone. OkCupid compared them and found that the profile text accounted for “less than 10 percent of what people think of you.”
The third experiment — the one that has drawn the most criticism since Rudder’s blog post — sought to answer the question “does this thing even work?” (A question you’d hope the company, at least, could confidently answer with a “yes.”) OkCupid wanted to find out whether the “match percentage” it calculates for users is good at predicting relationships because the algorithm actually works, or just because the site tells users that it does.
“To test this, we took pairs of bad matches (actual 30 percent match) and told them they were exceptionally good for each other (displaying a 90 percent match.) Not surprisingly, the users sent more first messages when we said they were compatible. After all, that’s what the site teaches you to do.
But we took the analysis one step deeper. We asked: does the displayed match percentage cause more than just that first message — does the mere suggestion cause people to actually like each other? As far as we can measure, yes, it does.
When we tell people they are a good match, they act as if they are. Even when they should be wrong for each other.”
Rudder writes that the site tested the same situation in reverse: telling people who were actually good for each other that they were bad, just to watch what happened. The results of the experiment revealed that “the mere myth of compatibility works just as well as the truth.”
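The design Rudder describes is, at bottom, an A/B test: hold the actual compatibility score fixed, vary only the number displayed, and compare downstream behavior between the two groups. A minimal sketch of how such a comparison could be measured is below; the names, data, and the four-message threshold are entirely invented for illustration and are not OkCupid’s actual methodology.

```python
# Hypothetical sketch of analyzing an experiment like OkCupid's third test.
# All field names and numbers are illustrative, not real OkCupid data.
from dataclasses import dataclass

@dataclass
class Pair:
    actual_match: int     # compatibility the algorithm really computed (%)
    displayed_match: int  # compatibility the site actually showed (%)
    conversed: bool       # did the pair go on to a real conversation?

def conversation_rate(pairs):
    """Fraction of pairs that went on to have a real conversation."""
    if not pairs:
        return 0.0
    return sum(p.conversed for p in pairs) / len(pairs)

# Synthetic example: bad matches (actual 30%) shown truthfully (control)
# versus shown as exceptionally good (displayed 90%, treatment).
control = [Pair(30, 30, c) for c in [True, False, False, False, False]]
treated = [Pair(30, 90, c) for c in [True, True, False, False, True]]

# The "lift" is the behavioral effect of the displayed number alone,
# since actual compatibility is identical across both groups.
lift = conversation_rate(treated) - conversation_rate(control)
```

Because the underlying compatibility is the same in both groups, any difference in conversation rates can only come from the suggestion itself, which is exactly the inference Rudder draws.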
OkCupid’s OkTrends blog promotes an upcoming book by Rudder, titled Dataclysm: Who We Are, set to release in September. In the book’s product listing on Amazon, an introduction begins: “Our personal data has been used to spy on us, hire and fire us, and sell us stuff we don’t need. In Dataclysm, Christian Rudder uses it to show us who we truly are.”
Rudder casts the OkCupid experiments in a similar light, implying that they’re research into human psychology, in addition to tests of how well the website works. At one point in the blog post, Rudder quips, “The only thing with more bugs than our HTML was our understanding of human nature.” Rudder’s OkCupid post and the Amazon blurb introducing his book cast data-aided experiments as essential to investigating human behavior. The Amazon text notes:
“For centuries, we’ve relied on polling or small-scale lab experiments to study human behavior. Today, a new approach is possible. As we live more of our lives online, researchers can finally observe us directly, in vast numbers, and without filters. Data scientists have become the new demographers.”
Tech companies’ social and psychological experiments — like OkCupid’s manipulation of match percentages or Facebook’s alteration of the emotional content of users’ News Feeds — are an extension of more conventional ways of manipulating consumer behavior. Routine marketing campaigns and advertisements are, by nature, designed to influence what consumers do, usually by getting them to buy a product or use a service.
But experiments like OkCupid’s are a limited, even flawed way to test and measure the influence that sites and services have over their users’ behavior. Companies consider that influence a valuable power. Even when a company learns that it isn’t influential for the reason it originally thought (in OkCupid’s case, that the match percentage accurately predicts relationships) and instead draws its influence from human psychology (the match percentage works because the site tells people that it does), it is happy to have that knowledge, and to share it.
In the case of Facebook’s News Feed experiment, the objective was to prove that emotions could spread through social media posts, and that exposing a user to either positive or negative emotional content would alter that user’s mood and behavior. Facebook didn’t seem surprised to find out that it was right: the (unaware) participants’ online behavior was influenced by the selection of content they saw in their News Feeds. What Facebook seemingly didn’t account for, however, was the public’s outrage at the possible consequences for users’ offline behavior.
The U.S. government and privacy groups have asked the U.S. Federal Trade Commission to determine whether Facebook broke the law or violated agreements with users by manipulating the algorithm that determined which posts appeared in users’ News Feeds. Whether or not the experiment was unlawful, most people agreed that it was unethical — except, it seems, companies like OkCupid, which could hardly criticize Facebook without being labeled hypocrites for engaging in similar practices themselves.
The OkCupid blog post demonstrates that some companies really don’t think it’s wrong to experiment on users to make the data they gather more useful for testing certain hypotheses — even if consumers and the FTC disagree. Lawyers and experts in consumer protection law say that OkCupid could be subject to an FTC inquiry. Those who spoke to Reuters noted that the company’s experiments could be considered a violation of an FTC provision that prohibits “unfair and deceptive” practices that mislead or harm consumers. While some experts said the experiments were deceptive, particularly the one in which OkCupid misled users about which potential partners were good matches, others said deception could be difficult to prove because the experiment could be construed as an effort to ascertain whether the service was working.
OkCupid’s status as a free website with some paid services might also affect the FTC’s decision as to whether or not it will investigate the company. Reuters learned that the agency is most likely to pursue cases where practices caused economic or health injuries to consumers, so it could be difficult to prove that the experiments caused consumers any real harm by misleading couples about the quality of a match.
Tech companies conduct these experiments because they’re interesting, and because they can. Whether the particular psychological experiments that OkCupid and Facebook have conducted are an effective way to improve websites and services is up for debate. But experiments are considered a part of these companies’ business models. Unlike groups conducting actual academic research, however, companies like OkCupid and Facebook aren’t required to get anyone else’s input on whether their experiments are ethical. And this research is fundamentally different from more familiar, less harmful forms of manipulation, because OkCupid’s and Facebook’s experiments manipulated users without their knowledge or consent.
When people watch an advertisement or engage with marketing content, they know what the company’s objective is, and they understand the traditional relationship between advertiser and consumer. Conversely, when sites like OkCupid and Facebook change the way their services work to do psychological research, even on a select group of users, they are manipulating both the users and the relationship those users have entered into — even if the research is technically within the letter of far-reaching consent forms, terms, and conditions. Consumers are okay with being psychologically manipulated if they know they’re being manipulated — but not when companies manipulate them without their knowledge just to test and measure the extent of their influence.
Rudder’s comment that OkCupid’s understanding of human nature is flawed rings absolutely true, especially given the tone-deaf flippancy with which he delivered the news of the site’s psychological experiments. It’s also true that research — even experiments — is necessary to improve a site and its services. But if OkCupid and companies like it want to avoid Facebook’s low user satisfaction ratings, they would be wise to think twice about how they manipulate the users and data they have access to — or at least to consider carefully how they word the blog posts informing consumers of the experiments in which they’ve been test subjects.