Facebook emotional manipulation test turns users into 'lab rats'

Anger grows even as Facebook researcher posts apology for causing users anxiety

Users and analysts were in an uproar over news that Facebook manipulated users' News Feeds to conduct a week-long psychological study that affected about 700,000 people.

News reports said that Facebook allowed researchers to manipulate the positive and negative information they saw on the social network in order to test the emotions of users. The study, which was conducted Jan. 11 to Jan. 18, 2012, was published in the Proceedings of the National Academy of Sciences.

For the past several days, reporters, bloggers, social commentators and industry analysts have been venting their anger over Facebook's emotional manipulation.

"I think this violates the trust of Facebook users who rely on their protection," said Patrick Moorhead, an analyst with Moor Insights & Strategy. "There were two lines that were crossed that violated trust. First, Facebook's News Feed was manipulated for the sake of the experiment. Secondly, it involved a third party who published results externally. Facebook users are more than public lab rats."

In the experiment, Facebook temporarily influenced what kind of posts and photos would be seen by about 700,000 of its English-speaking users on their News Feeds. The social network enabled researchers to either show more positive or negative comments and posts to users in order to see if it influenced the users' emotions.

As a consequence, users were not shown a regular cross-section of their friends' posts, but instead were given a manipulated feed.

The study found that users who were shown more positive comments made more positive comments themselves, while users shown more negative comments became more negative.

Facebook did not respond to a request for comment, though Adam Kramer, a data scientist at Facebook who participated in the study, apologized for upsetting users in a post on his Facebook page.

"Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone," Kramer wrote. "I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety."

He pointed out that the research affected only about 0.04% of users, or 1 in 2,500. Facebook today has more than 1 billion users.

While some users commented on his post, saying they appreciated the fact that he addressed the issue, not all were so understanding.

"I appreciate the statement, but emotional manipulation is still emotional manipulation, no matter how small of a sample it affected," wrote Kate LeFranc in a comment.

Andrew Baron wrote, "This is the nail in the coffin for my concern that Facebook is the kind of company that Google talks about when they say don't be evil... There is no turning back from this."

Hadley Reynolds, an analyst with NextEra Research, said while he finds Facebook's move "reprehensible," he also doesn't find it surprising.

"Web-based businesses have been 'experimenting' on their users since the beginning of the commercial Internet, so it's hardly surprising that Facebook would be testing the effect of various content algorithms on groups of their members," Reynolds said. "Facebook is simply gauging its ability to manipulate its customers' emotions for its own private gain. This incident won't break Facebook's franchise, but it will add another straw to the growing pile that eventually will erode users' perception of value enough to put the business in long-term decline."

Jeff Kagan, an independent analyst, called the experiment an abuse of users' trust.

"This creeped me out when I heard about what Facebook is doing to their users, their customers," Kagan said. "Facebook just sees this as an interesting experiment, but this is another case where they cross way over the line. Companies often battle to protect the privacy of their users from other companies. However, Facebook seems to be abusing their own customers themselves. This is inexcusable."

This is far from Facebook's first questionable practice regarding privacy, and users have not left the social networking site. That means Facebook has little impetus to stop making these kinds of moves.

"In the old days, customers would simply leave and punish Facebook," Kagan said. "However, today customers don't leave, so Facebook continues going down this same path with no self-control. How can a company that is supposed to value its customers abuse them so badly?"

The study contends that users accept Facebook's right to manipulate their News Feed when they agree to the site's terms and conditions of use.

Rob Enderle, an analyst with the Enderle Group, said that explanation doesn't fly for him. He said he is troubled by the company's decision to purposefully make people sad to further its experiment.

"That was the goal of the study and that sadness does represent harm," Enderle said. "I'd anticipate one or more class-action lawsuits."

Kagan noted that Facebook's social experiment could easily invite various governments to launch an investigation.

"I don't think Facebook will show self-control until they are forced to one way or another," he said. "I can see how governments in every country where Facebook operates may step in. When they do, it will be an expensive lesson for the company."

This article, Facebook emotional manipulation test turns users into 'lab rats', was originally published at Computerworld.com.

Sharon Gaudin covers the Internet and Web 2.0, emerging technologies, and desktop and laptop chips for Computerworld. Follow Sharon on Twitter at @sgaudin, on Google+ or subscribe to Sharon's RSS feed. Her email address is sgaudin@computerworld.com.

