Facebook messes with our emotions

Adam D.I. Kramer designed the emotional contagion experiment for Facebook's core data science team.

The Facebook data scientist who served as lead author of a controversial study that set out to manipulate the emotional states of 689,003 Facebook users, sparking outrage and a federal complaint last week, lists his current city as San Francisco on his Facebook page. He also calls himself “Danger Muffin.”

Adam D.I. Kramer, who works for Facebook’s core data science team, conducted the research in partnership with two co-authors he publicly described as friends – Jeffrey Hancock, a communication and information science professor at Cornell University, and Jamie Guillory, a postdoctoral scholar previously at Cornell and now affiliated with the University of California San Francisco.

The trio’s research, conducted over one week in January 2012, sought to determine whether Facebook users would be emotionally affected by exposure to positive or negative content in their news feeds. They published their findings, edited by a psychology researcher from Princeton University, in the journal Proceedings of the National Academy of Sciences under the title “Experimental Evidence of Massive-Scale Emotional Contagion Through Social Networks.”

“Emotional states can be transferred to others via emotional contagion,” the study notes, “leading people to experience the same emotions without their awareness.” A press release issued by Cornell in mid-June hailed the experiment as “the first to suggest that emotions expressed via online social networks influence the moods of others.”

People exposed to more negative content “used more negative words in their status updates,” Hancock explained in the Cornell press statement, while “significantly more positive words were used” by users who saw an increase in positive content.
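The method behind those findings is simple word counting: the published paper classified a post as positive or negative if it contained at least one word from the corresponding Linguistic Inquiry and Word Count (LIWC) dictionaries. Here is a minimal sketch of that kind of classifier, using tiny stand-in word lists rather than the real LIWC lexicons:

```python
import string

# Minimal sketch of the word-count approach the study describes:
# a status update counts as positive or negative if it contains at
# least one word from the corresponding list. The actual experiment
# used the LIWC2007 dictionaries; these tiny lists are stand-ins.
POSITIVE_WORDS = {"happy", "love", "great", "wonderful", "nice"}
NEGATIVE_WORDS = {"sad", "hate", "awful", "terrible", "hurt"}

def classify_post(post):
    """Return the set of emotion labels a post triggers."""
    words = {w.strip(string.punctuation).lower() for w in post.split()}
    labels = set()
    if words & POSITIVE_WORDS:
        labels.add("positive")
    if words & NEGATIVE_WORDS:
        labels.add("negative")
    return labels

print(classify_post("Had a great day, love this beach!"))  # {'positive'}
print(classify_post("I hate Mondays."))                    # {'negative'}
```

Tallies like these, aggregated per user, are the positive- and negative-word rates the authors compared across experimental conditions.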

In a 2011 video on internal experimentation with Facebook data, Kramer gives an in-depth presentation on how users’ total word sets – everything they’ve ever posted on Facebook – can be digitally analyzed using a matrix that can ultimately show “how users differ from each other.”
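The talk stops short of code, but the analysis Kramer describes maps naturally onto a user-by-word matrix. The sketch below is an illustration under that assumption, not Facebook’s actual pipeline: it normalizes each user’s (made-up) word counts into frequencies and compares users pairwise with cosine similarity, one standard way such a matrix can show how users differ.

```python
import numpy as np

# Hypothetical user-by-word count matrix: rows are users, columns are
# vocabulary words, and each entry is how often that user has used
# that word across everything they've posted. The numbers are made up.
counts = np.array([
    [4.0, 0.0, 2.0, 1.0],  # user A
    [0.0, 5.0, 1.0, 0.0],  # user B
    [3.0, 1.0, 0.0, 2.0],  # user C
])

# Normalize rows to frequencies so prolific and quiet users compare fairly.
freqs = counts / counts.sum(axis=1, keepdims=True)

# Cosine similarity between every pair of users: values near 1 mean
# similar word usage, values near 0 mean very different usage.
unit = freqs / np.linalg.norm(freqs, axis=1, keepdims=True)
similarity = unit @ unit.T
print(np.round(similarity, 2))
```

Subtracting each similarity from 1 turns the same matrix into a distance, which is the “how users differ from each other” reading of the analysis.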

The emotional contagion study has sparked a major backlash, prompting the Electronic Privacy Information Center (EPIC) to file a formal complaint with the Federal Trade Commission accusing Facebook of engaging in deceptive trade practices. As EPIC put it, “the company purposefully messed with people’s minds.”

Julia Horwitz, EPIC’s consumer protection counsel, noted that Facebook signed a consent order at the direction of the FTC in 2012 following “a several-year investigation about other data sharing charges.” 

As a result, “Facebook is now under this consent order that requires it to comply with various data protection provisions,” meant to safeguard the information that its users provide. “Facebook’s use of the information submitted into the data feeds, that was then processed through the psychological manipulation algorithm, is a violation of the consent order,” Horwitz explained.

To comply with the federal order, the company should have solicited express consent from users granting permission to be subjected to experimentation, she noted.

“One of the things we are hoping to gain from this complaint,” Horwitz added, “is to have Facebook publicize the news feed algorithms, so that users can understand the basis by which they’re given information.”

Jaron Lanier, author of Who Owns the Future?, railed against Facebook in a New York Times op-ed for its recklessness in experimenting on people’s emotional states, saying:

“The manipulation of emotion is no small thing. An estimated 60 percent of suicides are preceded by a mood disorder. Even mild depression has been shown to increase the risk of heart failure by 5 percent; moderate to severe depression increases it by 40 percent.

Research with human subjects is generally governed by strict ethical standards, including the informed consent of the people who are studied. … The subjects in the study still, to this day, have not been informed that they were in the study. If there had been federal funding, such a complacent notion of informed consent would probably have been considered a crime. Subjects would most likely have been screened so that those at special risk would be excluded or handled with extra care.”

While Facebook seems to be bearing the brunt of public outrage over the study, the social media giant’s partnership with the academic sector has also raised questions. Guillory became affiliated with UCSF only after her involvement with the study, and in the angry aftermath of the experiment’s publication, Cornell has sought to distance its researchers from the controversy.

Jamie Guillory, formerly at Cornell and now at the University of California San Francisco, was a co-author of the study.

In an official statement, Cornell noted, “Professor Hancock and Dr. Guillory did not participate in data collection and did not have access to user data. … Because the research was conducted independently by Facebook and Professor Hancock had access only to results – and not to any individual, identifiable data at any time – Cornell University’s Institutional Review Board concluded that he was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required.”

Syl Kacapyr, a Cornell spokesperson, forwarded the canned statements to the Bay Guardian and said none of the study’s authors would be granting media interviews. Nevertheless, we reached out to Kramer and Guillory individually to request interviews. If we hear back, we’ll update this post.

Kramer, a.k.a. Danger Muffin, did publicly address the study on his Facebook page.

“The reason we did this research,” he wrote, “is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook.”

He went on to note that the research “very minimally” deprioritized News Feed content, adding that “we found the exact opposite to what was then the conventional wisdom: Seeing a certain kind of emotion (positive) encourages it rather than suppresses [it].”

"The goal of all of our research at Facebook is to learn how to provide a better service," Kramer concluded. "Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.”

Comments

This article manipulated my emotions.

Posted by Guest on Jul. 09, 2014 @ 7:22 am

Jamie Guillory is from UCSF, not USF.

http://profiles.ucsf.edu/jamie.guillory

Posted by guest on Jul. 09, 2014 @ 8:33 am

Really important conversations are happening here, but why post a personal photo of each of the publication's authors? Those are certainly not their official photos from their institutions or websites. Very, very creepy.

Posted by Guest on Jul. 10, 2014 @ 7:18 am

Facebook messes with our privacy, so it's fair to mess with theirs. I don't think it's creepy. I think it's fair. But I do think they themselves look creepy.

Posted by Greg on Jul. 10, 2014 @ 7:32 am

dumb enough to post personal information on a public website in the first place.

Posted by Guest on Jul. 10, 2014 @ 7:55 am

However, that doesn't change the fact that they shouldn't take advantage of people this way. I'm all for digging up their silly photos and other private things and posting them all over the internet, because that's what they do to other people.

Posted by Greg on Jul. 10, 2014 @ 11:30 am

likely to be used in ways you cannot predict and do not like.

Posted by Guest on Jul. 11, 2014 @ 7:48 am

Those who work for Facebook have every expectation of privacy, while the rest of us have none. You can't have it both ways.

Posted by Greg on Jul. 12, 2014 @ 7:32 am

Do Androids Dream of Electric Sheep?

Posted by lillipublicans on Jul. 11, 2014 @ 9:33 am

I like how you specifically browsed through the lead author's FB photos and went out of your way to find and republish one of him making a silly face, like any reputable journalist would have. Stay classy, SFBG.

Posted by Guest on Jul. 10, 2014 @ 12:58 pm

Journalists always look for photos that portray their subjects in a flattering or unflattering way, depending on the message they're trying to convey with the article. Standard practice. I don't see a problem here.

Posted by Greg on Jul. 10, 2014 @ 3:01 pm

It's not a problem if you are producing marketing and propaganda materials.

It is a problem if you are trying to do serious objective journalism. Then it just looks immature.

Posted by Guest on Jul. 11, 2014 @ 7:46 am

This sounds like the beginning of a class-action lawsuit for violation of privacy and a long list of other torts. Facebook will be required to provide the list of the human subjects. I suspect some of them won't be very happy after finding out about their "participation" in the study.

Also nice to read the comments from EPIC. The group seems quite informed and vigilant in monitoring these oppressive communication giants. I hope they win big on their assertion that Facebook violated a previous consent court order. If UCSF and/or Cornell were involved in a study with non-consenting human subjects, they too should be sanctioned and heavily fined, along with Danger Muffin and Jamie Guillory, who contributed to the criminal conspiracy to invade personal privacy.

Companies like Google, Facebook, Apple, Yahoo and others are a blight on Bay Area society in so many ways. They really don't belong in California. Their activities are toxic and many of the people who work with them are toxic. How many Ellis and OMI evictions have been caused by these companies locating in the region? How much more rent are people paying because of these companies? They're despicable in almost every way.

Posted by Guest on Jul. 10, 2014 @ 6:34 pm

post private shit online.

People who value their privacy don't use something like Facebook, which is predicated on publishing stuff about yourself.

Posted by Guest on Jul. 11, 2014 @ 7:44 am

have no expectation of privacy once those pictures are online.

Posted by Guest on Jul. 12, 2014 @ 7:30 am

than that the Bay Area has successful world-beating businesses.

Posted by Guest on Jul. 11, 2014 @ 7:47 am

Every beginning researcher is taught that informed consent is the cornerstone of research. There are exceptions to this: study of anonymous archival data, certain types of interviews, and observation of public behavior.

Were Guillory and Kramer arrogant enough to assume that Facebook is considered public behavior? Certainly it's not public if you need to have an internet device and create an account to view the contents.

Their behavior is reprehensible. As for Facebook, we know they are unethical--this is just one more blatant example.

Posted by Guest Marcia on Jul. 18, 2014 @ 12:04 pm

