The topic is "emotional contagion" - whether emotions on Facebook are catching. In the study, the feeds of 689,003 Facebook users - about 0.053 percent of its 1.3 billion users - were altered for one week in January 2012. The subjects never knew. They just got shown different things, and differences in their behavior, if any, were measured.
Some were shown feeds that used happier language. Some were shown mood-neutral material. But about 155,000 were in the "less happy" set, their feeds filtered to contain fewer positive words.
To measure impact, the study tracked what those 689,003 people did next. Did they now use more happy/neutral/sad language? Did the tweaks affect their moods?
The study says yes: It found what it calls "massive-scale" contagion. But, frankly, the effect was extremely mild - just over the lip of what is considered statistically significant.
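That tension - "significant" yet mild - is a quirk of sample size. With hundreds of thousands of subjects, even a minuscule difference between groups clears the conventional p < 0.05 bar. The numbers below are hypothetical, chosen only to illustrate the point; they are not the study's actual data or analysis:

```python
# Illustration (hypothetical numbers, not the study's data): with very
# large samples, a tiny difference in group means is still "significant".
from math import sqrt, erf

def two_sample_z(mean_diff, sd, n_per_group):
    """z statistic and two-sided p-value for a two-sample z-test,
    assuming equal group sizes and a common standard deviation."""
    se = sd * sqrt(2.0 / n_per_group)          # standard error of the difference
    z = mean_diff / se
    # Standard normal CDF via the error function; two-sided tail probability.
    p = 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / sqrt(2.0))))
    return z, p

# Assumed for illustration: a difference of 0.008 standard deviations
# between groups, with roughly 155,000 users per group.
z, p = two_sample_z(mean_diff=0.008, sd=1.0, n_per_group=155_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p falls below 0.05, yet the effect is negligible
```

In other words, "statistically significant" here says the effect is probably real, not that it is large.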
The big problem is the old problem with Facebook: It uses our info, and in ways we don't know about. Here, it took a step some think is too far: It messed with our heads to see what happened.
Can Facebook do that? Facebook says it can.
After all, users accept a Data Use Policy agreement - that long page of legalese you scroll to the bottom of so you can click "Accept." Few people read it. The agreement says, in part, that "we may use the information we receive about you ... for internal operations, including troubleshooting, data analysis, testing, research, and service improvement."
Here's what, for many, really smells. On Tuesday, it emerged that Facebook put the word "research" into the agreement only after the study had been done. It redrew the playing field retroactively.
The study didn't have to meet the high standards required of, say, multicenter drug studies on human beings. But the ethical issue remains. As Art Caplan puts it, "No way the people in this study had anything close to informed consent."
Caplan, former director of the Center for Bioethics at the University of Pennsylvania, is now head of the division of bioethics at the New York University Langone Medical Center.
"When Facebook gives you the list of the ways they can use your info," Caplan says, "it's more like a Miranda warning - but they haven't given you any choice. You can't say no. That doesn't count as consent. Informed consent is a back-and-forth, a process. There has to be some activity from the subject indicating that she or he not only understands but is willing to go along with the stipulations. If this ever got to court, a good lawyer would ask Facebook, 'Exactly how do you know they understood?' "
Jeff Bercovici, who covers social and digital media for Forbes, says, in essence: Get real. Nearly all web pages, and almost all commercial pages we visit, he says, gather info on us and use it for marketing and moneymaking. It's naive to say we're shocked, shocked!
"Facebook's a private company that provides a free service," Bercovici says. "The trade-off is, they get to quantify our behavior and build businesses on that."
Facebook shouldn't deceive us about what it collects and how it's used, Bercovici says. But people's unease "has a lot to do with the language used to describe it. 'Facebook experimented on our moods' and 'Facebook manipulated our emotions' sounds very scary."
That after-the-fact addition of "research," though? Bercovici agrees that it's "telling." "While they said they didn't need to put the word research into the [agreement], putting it in suggests they feel they did need to. I'd be interested in a little 'Who knew what and when did they know it?' transparency."
Backlash rages. Adam Kramer, the Facebook data scientist who led the study, has apologized, followed by chief operating officer Sheryl Sandberg. And others wonder whether the Facebook brand, built on openness, will at some point lose the trust of its users.