Editor’s Note: This post is written by Jenny Stromer-Galley, Associate Professor here at the iSchool and Vice President of the Association of Internet Researchers; it originally appeared on her blog on July 1, 2014.
A firestorm, in the academic world anyway, erupted this week when the AV Club posted a short review of and commentary on a new research study of Facebook users.
The article, written by Adam Kramer, Jamie Guillory, and Jeffrey Hancock and published in the journal Proceedings of the National Academy of Sciences, reports on a nearly 700,000-person study. For a week, one set of participants saw news feeds with fewer positive emotional posts, and another set saw news feeds with fewer negative posts. There was also a control condition: those participants saw news feeds in which some posts were randomly removed. The study lends support to the idea that there is emotional contagion in social networks: if people see more positive emotional messages, they produce positive messages in kind; if they see negative messages, they write more negative messages.
The firestorm centers on research ethics. Researchers whose work is funded by the federal government, or who work for organizations that receive federal dollars, must submit their research proposals for review by a board of members from the community who consider the ethical implications of the research. These boards are often known as institutional review boards, or IRBs. The review board ensures research participants are treated with respect and with regard for their welfare. It also helps researchers do the necessary work to inform participants of their involvement and their rights, so that people can consent to take part in a research study if they wish, or opt out if they don't want to participate. The reasons for this go back to unethical research conducted through the early 1970s, from medical to psychological experiments, in which people were subjected, usually without their knowledge, to what are now viewed as inhumane, injurious, and physically and psychologically damaging experiences.
Facebook Business or Facebook Research?
Except that there are two problems with these lines of reasoning. First, it's true that Facebook is a business, but the experiment it conducted, with heavy involvement by faculty from the University of California, San Francisco and Cornell University, wasn't conducted for business purposes, at least not obviously. It was conducted to inform our understanding of human behavior, which is the core of basic research. And basic research is subject to core ethical principles of human research. In the words of the Belmont Report (the document all researchers must abide by if they do research with people), all generalizable research conducted with humans must ensure that "persons are treated in an ethical manner not only by respecting their decisions and protecting them from harm, but also by making efforts to secure their well-being."
I know that big data, and the chance to experiment on thousands of people who are a captive audience on a platform like Facebook, is too tempting for researchers like me to pass up. But just because researchers can experiment on people doesn't mean we should, and we certainly should not be conducting mass experiments on the unwitting without a purposeful, objective ethical review that carefully weighs the knowledge gained against the cost to human participants (as suggested by the Association of Internet Researchers ethics guide). Human subjects review boards need to extend not only to federally funded research but to any research that aims at generalizable knowledge. Without such protections, unfortunately, researchers in their quest to reveal the truth of human behavior will repeat history. They will fail to ensure that people are treated with dignity and respect and instead will treat them as lab rats.
Please share your thoughts in the comments below or via Twitter to @profjsg.