
7 things you need to know about Facebook's mood experiment

With the uproar continuing over Facebook's manipulation of some users' News Feeds to conduct an experiment on emotions, there are several things users need to understand.

News reports recently surfaced that Facebook enabled researchers to surreptitiously control the posts, comments and photos that about 700,000 users were seeing as part of a psychological experiment.

Users, analysts and bloggers have been voicing their outrage over what many are calling emotional manipulation and a breach of users' trust.

From how the experiment was conducted to its legality, what Facebook had to say and what recourse users have, here are seven things users need to know.

1. What happened?

The Proceedings of the National Academy of Sciences (PNAS) published a study conducted by researchers from Facebook, the University of California, San Francisco and Cornell University on whether people's emotions can be influenced without face-to-face contact.

The study was conducted during the week of Jan. 11 to Jan. 18, 2012. It affected 689,003 English-speaking Facebook users.

In the experiment, Facebook temporarily altered the mix of posts and photos users saw in their News Feeds, allowing researchers to show some users mostly positive comments, posts and photos and others mostly negative ones, in order to see whether the tone of that content influenced the users' own emotions.

As a consequence, users were not shown a regular cross-section of their friends' posts, but instead were given a manipulated feed.

Facebook noted that users could still have seen their friends' content by going directly to those friends' pages, but much of that content was filtered out of, or given extra prominence in, the News Feed, which is where most users get their social information.
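The published paper describes the mechanism in a bit more detail: posts were classified as emotionally positive or negative using standard word lists (the LIWC dictionaries), and matching posts had a per-user chance of being withheld from the feed during the week. The sketch below is purely illustrative, not Facebook's code; the word lists, function names and 50% omission rate are invented for the example.

```python
import random

# Tiny illustrative word lists; the actual study classified posts with the
# LIWC dictionaries, which are far larger.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

def contains_any(text, words):
    """True if the post contains at least one word from the given list."""
    return bool(set(text.lower().split()) & words)

def filter_feed(posts, condition, omit_probability=0.5):
    """Return a feed with some emotional posts withheld.

    condition: "reduce_positive" or "reduce_negative".
    omit_probability: chance a matching post is dropped (the paper
    describes per-user omission rates between 10% and 90%).
    """
    shown = []
    for post in posts:
        if condition == "reduce_positive" and contains_any(post, POSITIVE_WORDS):
            if random.random() < omit_probability:
                continue  # withhold this positive post from the feed
        elif condition == "reduce_negative" and contains_any(post, NEGATIVE_WORDS):
            if random.random() < omit_probability:
                continue  # withhold this negative post from the feed
        shown.append(post)
    return shown
```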

2. What the experiment showed

The study found that users who saw more positive comments made more positive comments themselves, while users who saw more negative comments echoed those comments.

The research focused on what scientists call "emotional contagion," or the ability to influence people to show the same emotions without direct personal contact or even their awareness.
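The study's outcome measure was essentially the share of positive and negative words in what users themselves posted afterward. A rough, hypothetical sketch of that kind of measurement (the function name and word lists are invented for illustration, not taken from the paper):

```python
def emotion_word_rates(posts, positive_words, negative_words):
    """Percentage of a user's words that are positive or negative."""
    total = pos = neg = 0
    for post in posts:
        for token in post.lower().split():
            total += 1
            if token in positive_words:
                pos += 1
            elif token in negative_words:
                neg += 1
    if total == 0:
        return 0.0, 0.0
    return 100.0 * pos / total, 100.0 * neg / total
```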

3. Does Facebook normally manipulate your News Feed?

Yes, but it's not normally done to either cheer you up or depress you as part of a psychological study.

Facebook has been upfront in saying it uses an algorithm that determines which stories appear first in users' News Feeds.

The social network has said it uses the algorithm to spare users from "spammy" content, duplicates and like-baiting, a method to boost circulation by actively trying to get users to like, comment or share a post. Facebook has noted it's trying to provide the information users most want to see.

"Ideally, we want News Feed to show all the posts people want to see in the order they want to read them," the company wrote in a post. "This is no small technical feat: every time someone visits News Feed there are on average 1,500 potential stories from friends, people they follow and Pages for them to see, and most people don't have enough time to see them all."

4. What has Facebook said about the experiment?

Adam Kramer, a data scientist at Facebook who was involved in the study, apologized for upsetting users in a post on his Facebook page.

"Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone," Kramer wrote. "I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety."

He pointed out that the research affected only about 0.04% of Facebook's users, or 1 in 2,500. The social network today has more than 1 billion users.

5. Is this legal?

The short answer is yes.

The study's authors noted in their research paper that users accept Facebook's right to manipulate their News Feeds when they click on the site's terms and conditions of use.

Jeff Kagan, an independent analyst, told Computerworld that users do agree to information manipulation when they accept the site's terms of use. "If users didn't know that, it's their own fault," he said.

Patrick Moorhead, an analyst with Moor Insights & Strategy, noted that Facebook's user agreement states clearly that the company can do research on people who agree to the site's terms.

"Legally, even if people never read the terms, they are still bound to them," Moorhead added. "There are exceptions in the U.S. on things like bank loans and insurance documents, but this is not one of them."

6. Is it ethical or fair?

That, say industry analysts, is another matter altogether.

"The biggest complaint is Facebook's mind frame -- their lack of care about customers and protecting their privacy," Kagan said. "It's about the sneaky approach that Facebook continually seems to take and not caring about the concerns of the users."

Rob Enderle, an analyst with the Enderle Group, said the drawback to free sites and services is that users may lose control over their privacy.

"A big part of the cost of 'free' is that companies often don't value customers who don't pay them for their services," Enderle said. "There is a growing elitism in the technology market likely connected to the massive power and wealth imbalance between the people who control social media properties and those that invest in and use them."

7. What options do users have?

While analysts say they'd be surprised if there wasn't a class-action lawsuit filed over this move, they also note there's a very easy step that users can take.

Quit. Just stop using Facebook. The catch is that Facebook has weathered privacy controversies and angry users before, and there has never been a mass exodus.

So will users leave Facebook over this one? Doubtful.

"Whatever your view, it is the responsibility of each of us to think through the implications of behaviors such as Facebook's recent experiment and come to a position about what kinds of intrusions we are willing to accept," said Hadley Reynolds, an analyst with NextEra Research. "The conclusions we come to will impact our perspective on the companies we choose to patronize on the Web."

Moorhead added that Facebook has shown that it will play fast and loose with users, and now it's up to users to decide what to do about it.

"Facebook is not trustworthy as they step on users' privacy routinely," he said. "The reason why there isn't a mass exodus is mixed. Some users just don't care about privacy. Some stay because there isn't an alternative. Others don't know or understand the downside of an invasion of privacy."