Katie Surrence: The Problem With the Facebook Emotional Contagion Study


Tal Yarkoni argues here that ethical concerns about the Facebook emotional contagion study are misplaced, for four reasons: (1) Facebook only removed emotional content, and did not heighten or invent content; (2) the Facebook news feed environment is highly manipulated anyway, so further contrivances are hardly a violation of what we should expect; (3) in the scheme of reasons to experiment with our news feeds, social science research is better than most; and (4) manipulation and influence are themselves a constant part of life. He titles his post, “In Defense of Facebook.”  I find it unsatisfying as a general defense of waiving informed consent for this study.  Two researchers who were not Facebook employees participated in this study; it was approved by an IRB (according to Susan Fiske, quoted here, though of which institution is unclear); and it was published in PNAS. This was human subjects research, and it should be subject to human subjects standards, not only those set forth in the Facebook Terms of Service.

Later, in comments, he links to the HHS website and quotes the rules for when informed consent may be waived:

(d) An IRB may approve a consent procedure which does not include, or which alters, some or all of the elements of informed consent set forth in this section, or waive the requirements to obtain informed consent provided the IRB finds and documents that:

(1) The research involves no more than minimal risk to the subjects;

(2) The waiver or alteration will not adversely affect the rights and welfare of the subjects;

(3) The research could not practicably be carried out without the waiver or alteration; and

(4) Whenever appropriate, the subjects will be provided with additional pertinent information after participation.

I don’t believe that either requirement (3) or requirement (4) was met by this study. It would not have been impracticable to get real consent for this study; the researchers could have obtained it via Facebook message. They could have framed the study’s aims vaguely in the consent form to avoid demand characteristics (the phenomenon of research participants behaving the way they suspect they are supposed to, or reacting against what they suspect they are supposed to do). They could have offered a form of compensation—a free promoted post, or something similar.  They could have debriefed participants, thanked them for their participation, shown them the previously excluded content, and explained what the aims of the study were and why they thought it was important, all via Facebook message.  I think all of these actions would have made participants feel positive about their participation and encouraged them to participate in research in the future, via Facebook or other means, which is what human subjects researchers should be striving for.


The researchers themselves don’t appear to believe that the research qualified for a waiver of informed consent. Rather, they argue that the ToS was consent: “LIWC was adapted to run on the Hadoop Map/Reduce system (11) and in the News Feed filtering system, such that no text was seen by the researchers. As such, it was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.” But the Facebook ToS don’t contain any of the elements of informed consent as defined by HHS. This would be a terrible precedent to set.


Susan Fiske is likely right that some of the heat of the reaction to this research reflects a larger sense of unease and frustration that our interactions with our friends are mediated by a corporate platform that we don’t understand and can’t control.  When people say, “If you don’t like it, just don’t use the product,” it’s facile, and a little unfair.  Facebook attracts enough social energy that in some communities it’s a serious act of self-exclusion not to participate.  But whether or not there’s a problem with the role Facebook plays in our lives, it’s in no one’s interest—not social scientists’ and not research participants’—for academic researchers to be perceived as, or to be, the same kind of shadowy force.  Facebook exists to make us click ads.  Social science research is supposed to help us understand ourselves better, to provide some benefit to humankind.  Social scientists shouldn’t adopt corporate ethical standards as their own. (Although it is likely symptomatic of a problem that we seem not to expect businesses to have the ethical provision of real utility as a primary reason for their existence. This post from Adam Kotsko made me realize how much I’d absorbed the ideology that businesses exist to make money, as opposed to things people can use.) Even if Tal Yarkoni is right that we are in “reasonable people can disagree” territory—that an IRB wasn’t obviously wrong to approve it—the retrospective knowledge that social scientists participated in an experiment with Facebook that made people feel uncomfortable—in whatever mild sense, violated and paranoid—should be enough to tell us that something went wrong.  Corporate data sets are a rich source of information about human behavior, and it would be a shame if they were off limits to academic study. The best way to avoid a backlash against collaboration between private enterprise and academic social scientists is for researchers to conduct themselves with a high degree of transparency and respect.

Finally, a word on their claims: many others have pointed this out, but the authors’ finding that their manipulation changed the number of emotion words used by participants is very different from a finding that it changed participants’ emotional experience. The authors’ findings are interesting as they stand, but they overinterpret them. If they are actually interested in emotional contagion, they should do something to measure mood. One of the advantages of asking participants for consent is that the researchers could have asked participants directly about their mood; that is, they could have done a better study.
