4 Facebook Faux Pas: Is It Time to Defriend This Social Media Giant?

Source: Thinkstock

Facebook (NASDAQ:FB) has had to make not one but two apologies within the past few weeks, a new record for the social media giant. In one case, advertisers were given receipts and data for other people’s ads; in the other, some users were unknowingly made part of an emotional study whose findings were published on the Proceedings of the National Academy of Sciences (PNAS) website.

The billing accident was just that, though: an accident. A bug in the system scrambled advertisers’ orders, and Facebook worked for two hours to straighten them out. The company apologized for the confusion and has mostly moved on from the incident. The emotional study, however, was no bug, and there was no fix to be made. It was a research proposal that went through an internal review process and was scrutinized within the company back in 2012, and Facebook allowed the test to be run on its users.

In the study, Facebook curated the feeds of 689,003 users to show either more positive or more negative posts from their friends, then sat back to see how those users reacted in their own posts. The test set out to see whether positivity bred more positivity, and negativity more negativity, within a social network.

The users who unknowingly took part in the study remained anonymous. Regardless, people are still outraged that Facebook would use them as lab rats, despite the fact that advertisers and numerous other companies curate our content all the time: slogans, ads, and even the way grocery stores are arranged toy with our minds and perceptions. Critics say that a study like the one Facebook ran requires informed consent from its subjects.

However, you may already have agreed to these tests by consenting to Facebook’s Privacy Policy and Terms of Service. Currently, Facebook’s Terms say the company can use your information for “internal operations,” which include “research.” But we aren’t talking about today’s privacy policy; what about back in 2012? At the time of the study, according to Forbes reporter Kashmir Hill, “the policy did not say anything about users potentially being guinea pigs made to have a crappy day for science, nor that ‘research’ is something that might happen on the platform.” Facebook only added “research” to its policies four months after the study.

Whether “research” appeared in Facebook’s Terms of Service at the time may determine whether the social network lands in legal hot water. It’s worth pointing out that sites run A/B tests all the time based on location, weather, gender, and other factors. Most of the time, two users don’t see the same content, because sites are testing or troubleshooting what resonates best with consumers in a certain demographic. Google and other search engines do the same, placing you in your own information bubble based on what they think you’d prefer to hear or read about.

“The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook,” said Adam Kramer, the data scientist behind the study.

Facebook may want to improve its services, but it has done so in far shiftier ways than researching our emotions. Check out a few other times Facebook has played fast and loose with our privacy.

Source: Thinkstock

News Feed

When Facebook first introduced the News Feed in 2006, users were shocked to find that their information and updates were no longer confined to their individual profile pages. Suddenly, status updates and pictures were funneled into a stream shown to all their friends.

Mark Zuckerberg took to his Facebook page to apologize in a note, saying: “We really messed this one up. When we launched News Feed and Mini-Feed we were trying to provide you with a stream of information about your social world. Instead, we did a bad job of explaining what the new features were and an even worse job of giving you control of them. I’d like to try to correct those errors now.”

Facebook quelled the alarm by creating privacy controls that let users determine what could and couldn’t show up in other people’s feeds. Today, the News Feed is hardly an issue; people have forgotten about their exposed status updates and pictures.

Source: Thinkstock

Beacon

Facebook’s Beacon program let the social network keep tabs on users’ shopping habits. If a Facebook user purchased something online, a message would appear on their page showing what they bought and where they bought it. Beacon was also an opt-out program, so users had to manually go into their settings to disable the feature. Some argued that broadcasting buying habits to the world was a violation of privacy, and took that argument to court in a class-action lawsuit.

Mark Zuckerberg once again released an apology in a Facebook note, and Beacon was eventually shut down.

Source: newsroom.fb.com

New privacy changes that pushed users to share more

Facebook changed its privacy settings in 2009, simplifying a complicated set of controls into basic choices about who could see what. However, the change defaulted many settings to share posts with everyone, or presented those settings as “recommended” configurations for your profile.

Mark Zuckerberg took to The Washington Post to explain, writing: “It’s a challenge to keep that many people satisfied over time, so we move quickly to serve that community with new ways to connect with the social Web and each other. Sometimes we move too fast — and after listening to recent concerns, we’re responding.”

Recently, Facebook has decided to default posts and status settings to “Friends Only,” which may not be in line with its worldview of full openness, but sure makes its users feel better.

Source: Getty Images

Are people fleeing because of these mishaps?

Many people are probably threatening to leave Facebook after the emotional testing, but despite all these previous mishaps, Facebook has actually grown: from just a million users when it started back in 2004 to almost 1.2 billion as of 2013. These privacy faux pas have done nothing to slow its growth over the years, so why would this one be any different?

While users have complained about the week-long research study Facebook ran back in 2012, how many of them are actually going to cut the cord?
