Facebook has realized that its algorithms aren’t enough to keep the News Feed relevant. Many users complain about the grab bag of stories that show up: announcements of engagements, photos of new babies, and videos from vacations, punctuated by obnoxious listicles, shrill political videos, and a never-ending series of animal clips. So Facebook is taking a new direction to figure out exactly what users want in their News Feeds, not via the behind-the-scenes machinations of an algorithm, but by asking people direct questions about what they see in their News Feeds, and what they’d want to see instead.
Steven Levy reports on Backchannel that Facebook has figured out that algorithms alone aren’t enough to determine the makeup of a user’s News Feed. The News Feed is core to the social network, so the mix matters not only to users, but to the news business, the apps industry, and even the Internet meme machine. To strike the right balance, Facebook is turning to what Levy terms the “antediluvian art” of directly asking people what they want, via a focus-group-like research project that began with a group of 30 testers in Knoxville, Tenn., and now reaches about 600 Facebook users around the country.
The process is part of Facebook’s ongoing effort to make the News Feed central to users’ social lives and the time that they spend online. Chris Cox, Facebook’s chief product officer, says that “The dream is to get to this world where people feel that Facebook is an instrumental, useful, important part of their lives.” When the News Feed was first introduced in 2006, some users were shocked that Facebook was suddenly sharing news of their activities with all of their “friends,” and about 10% of the site’s users joined a group protesting the News Feed. But it eventually became the social network’s most popular feature by far, fulfilling what Cox says was the goal of making the News Feed’s stream of stories into Facebook’s home page, a home page personalized for each user.
Levy reports that early in the feature’s existence, Facebook expended little effort on ranking the News Feed’s stories. Cox says that the team relied on its intuition and experiences to make decisions, and it wasn’t until “a couple of years ago” that Facebook made News Feed rankings a high priority. It began using feedback loops, machine learning, and analysis of how close the user was to the author of a story. Chief executive Mark Zuckerberg said in 2013 that Facebook’s goal with the News Feed is “to build the perfect personalized newspaper” for each of its users, which involves identifying the types of content, people, and topics they care about.
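Levy doesn’t detail how these signals combine, but the ingredients he lists — engagement feedback, machine-learned predictions, and closeness between reader and author — suggest a weighted relevance score. A minimal sketch, where the signal names and weights are purely illustrative and not Facebook’s actual model:

```python
# Hypothetical sketch of signal-weighted feed ranking.
# Signal names and weights are illustrative, not Facebook's real system.

def rank_stories(stories):
    """Return stories sorted by a weighted relevance score, highest first."""
    def score(story):
        return (
            0.5 * story["predicted_engagement"]  # machine-learned click/like estimate
            + 0.3 * story["author_affinity"]     # how close the reader is to the author
            + 0.2 * story["recency"]             # newer stories score higher
        )
    return sorted(stories, key=score, reverse=True)

feed = [
    {"id": "listicle", "predicted_engagement": 0.9, "author_affinity": 0.1, "recency": 0.8},
    {"id": "friend_wedding", "predicted_engagement": 0.6, "author_affinity": 0.9, "recency": 0.5},
]
ranked = rank_stories(feed)
```

In this toy version, the affinity weight is what lets a close friend’s wedding outrank a higher-engagement listicle; tip the weights toward raw engagement and the listicle wins, which is the tension the rest of the article explores.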
Levy explains that if users perceive the hours they spend with the News Feed as less than rewarding, they’ll inevitably spend less time on it. So the News Feed team improves the feature on two tracks: modest incremental changes rolled out at weekly launch meetings, and more significant algorithmic updates that arrive less frequently. An update last August targeted clickbait headlines, and one this January added tools to prevent the spread of hoaxes.
But Levy says that a major problem for Facebook is that its user metrics have become a feedback loop for “useless diversions.” Though the News Feed performs well when choosing stories like the news of a friend’s marriage, childbirth, or the vacations of close friends, many users’ News Feeds are overpopulated with listicles, animal videos, and political “red-or-blue meat.” Levy refers to this as the “Dozen Doughnuts” problem. Most people know that it’s not a good idea to eat a doughnut every day — and might even prefer to drink a green smoothie — but if a coworker comes in with a dozen Krispy Kremes, temptation might overrule discretion. Likewise, Facebook’s News Feed presents a “never-ending delivery” of info-doughnuts, and when users give in to temptation and click them, they send a strong signal to Facebook’s algorithm that this is the type of content they want to see.
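The info-doughnut dynamic Levy describes is a classic engagement feedback loop: every click nudges the ranker toward serving more of the same. A toy illustration (entirely hypothetical; any real ranking system is far more complex):

```python
# Toy engagement feedback loop: each click raises a content type's weight,
# so the feed serves more of it, whether or not the user actually values it.
# Entirely hypothetical; illustrative of the dynamic, not Facebook's code.

def simulate_feedback_loop(clicks, weights, learning_rate=0.1):
    """Bump each clicked content type's weight by a fixed learning rate."""
    for content_type in clicks:
        weights[content_type] = weights.get(content_type, 0.0) + learning_rate
    return weights

weights = {"friend_news": 0.5, "listicle": 0.5}
# The user keeps giving in to tempting listicles...
weights = simulate_feedback_loop(["listicle"] * 5 + ["friend_news"], weights)
# ...so listicles now outweigh the friend news the user says they prefer.
```

The loop never asks whether the user *wanted* the doughnut, only whether they ate it — which is exactly the gap the Knoxville project described below is meant to fill.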
Adam Mosseri, the News Feed product director, tells Levy, “We really try to not express any editorial judgment” about what’s in anyone’s News Feed. “We might think that Ferguson is more important than the Ice Bucket Challenge, but we don’t think we should be forcing people to eat their vegetables even though we may or may not think vegetables are healthy.” But Levy speculates that Facebook’s leaders privately decry the social network’s role in “sugar-bombing” the news industry, as writers and editors optimize their story selection and presentation to be irresistible to users scrolling through their News Feeds.
By algorithm alone
It was early last year that Facebook decided that an algorithmic approach alone might be too limited to craft a satisfying News Feed. Instead, it would have to ask people why they like the stories they do. Greg Marra, a News Feed product manager, tells Levy, “If you just watch people eat doughnuts, you’re like, ‘People love doughnuts, let’s bring them more doughnuts.’ But if you talk to people they’re like, ‘No actually what I want is to eat fewer doughnuts and maybe eat a kale smoothie….’ Then we can give them some kale smoothies, too.”
So Facebook hired a testing firm to set up an experiment with 30 people in Knoxville, Tenn. Professional raters, in their twenties and thirties and working from an office park on the outskirts of Knoxville, click on a button in a specialized version of Facebook and are instantly presented with what Facebook determines are the 30 top News Feed stories for each individual. Unlike the average Facebook user, each rater sees these stories presented randomly, and goes through each story one by one, eventually going through about 100 stories in the course of a day.
They interact with each story as they normally would, either ignoring it or engaging with it by commenting, sharing, or following links. Then for each story, they answer eight questions, such as how much they care about the person in the story, how welcome the story was in the News Feed, or how entertaining the story was. Finally, they write a paragraph explaining their feelings regarding the story. These 30 are the advance guard of a project that now extends to about 600 people around the country, and even in the project’s early stages, it’s yielded some helpful insights.
Raters confirmed the hypothesis that nothing tops big news from close friends. The pull of close friends also extends to how people read their posts, even ones that don’t involve personal news. Additionally, sponsored stories are less desirable than other types of content. But when asked what they’d like to see more of, few people indicated that they wanted more meaningful stories. “If anything it’s the inverse,” Mosseri tells Levy. “When we asked what are the best stories, ones people said they really want to see, the highest percentage of impact type is a strong emotional reaction. People really want to see stuff that drives a laugh or makes them feel happy, not necessarily information that’s super valuable.” Additionally, while most people say that they want to see more stories from their friends, signals from raters suggest that people want to see more content that’s public.
Mosseri says that so far, testers have only rated individual stories. The next step in the experiment will be to assess sets of posts. “If you ask people about each story individually they’re going to naturally rate emotional reactions really highly, which is what we’re seeing. Whereas once we start to ask about sets of stories, our hypothesis is that people will start to ask for more friend content and more informative content.” Mosseri says that the Knoxville project still has months, possibly years, to go as Facebook learns more about what users want in their News Feeds, having accepted that its algorithms alone can’t figure out what people really want.
Facebook’s experiments with a more human method of assessing the content of the News Feed come at a time when algorithms power an increasingly large array of the services we use each day. As Ian Bogost noted recently in The Atlantic, it’s not just Facebook. Google’s search algorithms determine how we access information, while Netflix’s and Amazon’s collaborative filtering algorithms choose products and media for us. Bogost posits that algorithms “are simplifications, or distortions. They are caricatures. They take a complex system from the world and abstract it into processes that capture some of that system’s logic and discard others. And they couple to other processes, machines, and materials that carry out the extra-computational part of their work.”
He notes that most computing systems don’t want to admit that they are burlesques, and because the algorithm has taken on a “particularly mythical” role in our technology-obsessed era, we take it as an excuse not to intervene in the social shifts wrought by companies like Google or Facebook. We also forget that they are abstractions of the real world — that Facebook’s algorithms aren’t a real structure of our social lives — and that the algorithm that seemingly powers our experience with the world’s most popular social network can, by necessity, only be a small part of how we interact with the people and the information around us.