Facebook emotional manipulation test turns users into 'lab rats'

Anger grows even as Facebook researcher posts apology for causing users anxiety

Users and analysts were in an uproar over news that Facebook manipulated users' News Feeds to conduct a week-long psychological study that affected about 700,000 people.

News reports said that Facebook allowed researchers to manipulate the amount of positive and negative information users saw on the social network in order to test the users' emotional responses. The study, which was conducted Jan. 11 to Jan. 18, 2012, was published in the Proceedings of the National Academy of Sciences.

For the past several days, the media, bloggers, social commentators and industry analysts have been venting their anger over Facebook's emotional manipulation.

"I think this violates the trust of Facebook users who rely on their protection," said Patrick Moorhead, an analyst with Moor Insights & Strategy. "There were two lines that were crossed that violated trust. First, Facebook's News Feed was manipulated for the sake of the experiment. Secondly, it involved a third party who published results externally. Facebook users are more than public lab rats."

In the experiment, Facebook temporarily influenced what kinds of posts and photos about 700,000 of its English-speaking users saw in their News Feeds. The social network enabled researchers to show users either more positive or more negative comments and posts in order to see whether the change influenced the users' emotions.

As a consequence, users were not shown a regular cross-section of their friends' posts, but instead were given a manipulated feed.

The study found that users who were shown more positive comments made more positive comments themselves, while users shown more negative comments became more negative.

Facebook did not respond to a request for comment, though Adam Kramer, a data scientist at Facebook who participated in the study, apologized for upsetting users in a post on his Facebook page.

"Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone," Kramer wrote. "I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety."

He pointed out that the research only affected about 0.04% of users, or 1 in 2,500. Facebook today has more than 1 billion users.

While some users commented on his post, saying they appreciated that he addressed the issue, not all were so understanding.

"I appreciate the statement, but emotional manipulation is still emotional manipulation, no matter how small of a sample it affected," wrote Kate LeFranc in a comment.

Andrew Baron wrote, "This is the nail in the coffin for my concern that Facebook is the kind of company that Google talks about when they say don't be evil... There is no turning back from this."

Hadley Reynolds, an analyst with NextEra Research, said that while he finds Facebook's move "reprehensible," he doesn't find it surprising.

"Web-based businesses have been 'experimenting' on their users since the beginning of the commercial Internet, so it's hardly surprising that Facebook would be testing the effect of various content algorithms on groups of their members," Reynolds said. "Facebook is simply gauging its ability to manipulate its customers' emotions for its own private gain. This incident won't break Facebook's franchise, but it will add another straw to the growing pile that eventually will erode user's perception of value enough to put the business in long-term decline."

Jeff Kagan, an independent analyst, called the experiment an abuse of users' trust.

"This creeped me out when I heard about what Facebook is doing to their users, their customers," Kagan said. "Facebook just sees this as an interesting experiment, but this is another case where they cross way over the line. Companies often battle to protect the privacy of their users from other companies. However, Facebook seems to be abusing their own customers themselves. This is inexcusable."

This is far from Facebook's first questionable practice regarding privacy, yet users have not abandoned the social networking site. That gives Facebook little impetus to stop making these kinds of moves.

"In the old days, customers would simply leave and punish Facebook," Kagan said. "However today, customers don't leave so Facebook continues going down this same path with no self control. How can a company that is supposed to value its customers, abuse them so badly?"

The study contends that users accept Facebook's right to manipulate their News Feeds when they agree to the site's terms and conditions of use.

Rob Enderle, an analyst with the Enderle Group, said that explanation doesn't fly for him. He said he is troubled by the company's decision to purposefully make people sad to further its experiment.

"That was the goal of the study and that sadness does represent harm," Enderle said. "I'd anticipate one or more class-action lawsuits."

Kagan noted that Facebook's social experiment could easily prompt governments to launch investigations.

"I don't think Facebook will show self control until they are forced to one way or another," he said. "I can see how governments in every country where Facebook operates may step in. When they do, it will be an expensive lesson for the company."

This article, Facebook emotional manipulation test turns users into 'lab rats', was originally published at Computerworld.com.

Sharon Gaudin covers the Internet and Web 2.0, emerging technologies, and desktop and laptop chips for Computerworld. Follow Sharon on Twitter at @sgaudin, on Google+ or subscribe to Sharon's RSS feed. Her email address is sgaudin@computerworld.com.

