(2014-06-28) Facebook Feed Emotional Manipulation

For one week in January 2012, data scientists skewed what almost 700,000 FaceBook users saw when they logged into its service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post either especially positive or negative words themselves... The effect the study documents is very small, as little as one-tenth of a percent of an observed change... When researchers reduced the appearance of either positive or negative sentiments in people's News Feeds—when the feeds just got generally less emotional—those people stopped writing so many words on Facebook... Susan Fiske had earlier told The Atlantic that the experiment was IRB-approved. "I was concerned," Fiske said on Saturday, "until I queried the authors and they said their local institutional review board had approved it—and apparently on the grounds that Facebook apparently manipulates people's News Feeds all the time."
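The mechanics of the manipulation are simple to sketch: the published study classified posts with LIWC word lists and silently omitted some fraction of emotional posts from the News Feed. Below is a minimal Python illustration of that filtering idea; the toy word lists and the single `omission_rate` parameter are stand-ins for the study's actual lists and per-user omission probabilities, not a reconstruction of its code.

```python
import random

# Toy stand-ins for the LIWC positive/negative word lists the study used;
# the real lists contain thousands of terms.
POSITIVE_WORDS = {"happy", "love", "great", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "awful", "lonely"}

def classify(post: str) -> str:
    """Label a post 'positive', 'negative', or 'neutral' by word-list match."""
    words = set(post.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filter_feed(posts, suppress="negative", omission_rate=0.5, rng=random):
    """Return a feed with a fraction of `suppress`-sentiment posts omitted.

    `omission_rate` is an arbitrary illustration value; the study varied
    the omission probability per user rather than using a single rate.
    """
    kept = []
    for post in posts:
        if classify(post) == suppress and rng.random() < omission_rate:
            continue  # silently drop the emotional post
        kept.append(post)
    return kept

feed = [
    "feeling so happy and excited today",
    "this week has been awful and lonely",
    "posted some vacation photos",
]
print(filter_feed(feed, suppress="negative", omission_rate=1.0))
# The negative post disappears; the user never knows it was filtered.
```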

Zeynep Tufekci places this in a broader Context: The secret truth of the power of BroadCast is that, while very effective at restricting the limits of acceptable public speech, it was never that good at motivating people individually. Political and ad campaigns suffered from having to live with "broad profiles" which never really fit anyone. What's a soccer mom but a general category that hides great variation? With the new mix of Big Data and powerful, oligopolistic platforms (like Facebook), all that is solved, to some degree.

Jonathan Zittrain notes that FaceBook can affect Voter Turnout. Digital Gerrymandering occurs when a site distributes information in a manner that serves its own ideological agenda rather than its users' interests. This is possible on any service that personalizes what users see or the order in which they see it, and it's increasingly easy to effect. There are plenty of reasons to regard digital gerrymandering as such a toxic exercise that no right-thinking company would attempt it. But none of these businesses actually promises neutrality in its proprietary algorithms, whatever that would mean in practical terms. And they have already shown themselves willing to leverage their awesome platforms to attempt to influence Public Policy.
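Zittrain's point is partly about how cheap this would be to do. No platform discloses a scorer like the one below; this is a purely hypothetical sketch (the `Item` fields, the `AGENDA_BONUS` table, and "measure A" are all invented) showing how little code it takes to tilt an ordering that users assume is neutral.

```python
from dataclasses import dataclass

@dataclass
class Item:
    text: str
    relevance: float  # the ordinary personalization score
    stance: str       # e.g. "favors_measure_A", "opposes_measure_A", "neutral"

# Hypothetical: a hidden bonus for content matching the platform's agenda.
# Nothing here reflects any real ranking system.
AGENDA_BONUS = {"favors_measure_A": 0.3}

def score(item: Item, gerrymander: bool = False) -> float:
    bonus = AGENDA_BONUS.get(item.stance, 0.0) if gerrymander else 0.0
    return item.relevance + bonus

def rank_feed(items, gerrymander=False):
    return sorted(items, key=lambda i: score(i, gerrymander), reverse=True)

feed = [
    Item("Op-ed against measure A", 0.8, "opposes_measure_A"),
    Item("Op-ed supporting measure A", 0.6, "favors_measure_A"),
]
print([i.text for i in rank_feed(feed)])                    # relevance order
print([i.text for i in rank_feed(feed, gerrymander=True)])  # agenda wins
```

Because the bias lives inside a proprietary ranking function, its output is indistinguishable from ordinary personalization, which is what makes the practice hard to detect from outside.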

Danah Boyd explores why people seem more bothered by this study than by FaceBook's other, earlier manipulations. The more I read people's reactions to this study, the more I've started to think the outrage has nothing to do with the study at all. There is a growing amount of negative sentiment towards Facebook and other companies that collect and use data about people. In short, there's anger at the practice of Big Data. This paper provided ammunition for people's anger because it's so hard to talk about harm in the abstract.

Om Malik thus notes that With Big Data Comes Big Responsibility: a lack of clarity around data intentions is to blame. And the only way I see to overcome that challenge is if companies themselves come up with a clear, coherent, and transparent approach to data. Instead of an arcane Terms Of Service, we need plain and simple Terms of Trust.

