Monday, June 30, 2014

Facebook Manipulates Our Moods For Science And Commerce: A Roundup




Facebook researchers manipulated newsfeeds of nearly 700,000 users to study "emotional contagion."



iStockPhoto



So, that happened.



Scientists published a paper revealing that in 2012, Facebook researchers conducted a study of "emotional contagion." The social media company altered the newsfeeds (the main page users land on for a stream of updates from friends) of nearly 700,000 users, filtering them to show more "positive" or more "negative" content, to determine whether seeing sadder messages makes a person sadder.
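For the curious, the mechanics were crude by design: according to the published paper, posts were classified as positive or negative by counting emotion words (using the LIWC word lists), and flagged posts were then randomly omitted from a user's feed. Here is a minimal, purely illustrative sketch of that kind of filtering, in Python. The word lists, function names, and omission probability below are made-up stand-ins, not anything from Facebook's actual system:

    import random

    # Toy stand-ins for the LIWC emotion word lists the paper describes.
    POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
    NEGATIVE_WORDS = {"sad", "terrible", "hate", "awful", "lonely"}

    def classify(post):
        """Label a post positive, negative, or neutral by counting emotion words."""
        words = [w.strip(".,!?") for w in post.lower().split()]
        pos = sum(w in POSITIVE_WORDS for w in words)
        neg = sum(w in NEGATIVE_WORDS for w in words)
        if pos > neg:
            return "positive"
        if neg > pos:
            return "negative"
        return "neutral"

    def filter_feed(posts, suppress, omit_probability, seed=None):
        """Randomly drop posts with the suppressed tone, skewing what remains."""
        rng = random.Random(seed)
        return [p for p in posts
                if classify(p) != suppress or rng.random() >= omit_probability]

    feed = ["I love this wonderful day!", "Feeling sad and lonely.",
            "Lunch was fine.", "What a great concert!"]
    # Suppress every positive post; only negative and neutral posts survive.
    print(filter_feed(feed, suppress="positive", omit_probability=1.0))

Per the paper, the real experiment omitted each flagged post with a probability between 10 and 90 percent on a given viewing, rather than deleting anything outright.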



Monkey See: Lab Rats, One And All: That Unsettling Facebook Experiment

All Tech Considered: Watch This To Put Your Facebook Feed In Perspective

The Two-Way: Facebook Scientists Alter News Feeds, Find Emotions Are Affected By It


The bottom line: newsfeeds were tweaked without warning because Facebook users agreed to the social giant's general terms of data use, and researchers tracked test subjects' emotional responses by measuring subsequent changes in the language of their posts. It's unclear whether you or I were among those tested. As users, our checkbox agreement gave permission for this kind of psychological experimentation.
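That measurement, as described, amounts to word counting: the outcome variable was the percentage of positive and negative words in people's subsequent status updates. A toy sketch of such a measure, again with made-up word lists standing in for LIWC's, might look like this:

    # Illustrative only: measure emotional tone as the percentage of
    # positive and negative words across a user's status updates.
    POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
    NEGATIVE_WORDS = {"sad", "terrible", "hate", "awful", "lonely"}

    def emotion_word_rates(updates):
        """Return (% positive words, % negative words) over a list of posts."""
        words = [w.strip(".,!?") for post in updates for w in post.lower().split()]
        if not words:
            return 0.0, 0.0
        pos = sum(w in POSITIVE_WORDS for w in words)
        neg = sum(w in NEGATIVE_WORDS for w in words)
        return 100 * pos / len(words), 100 * neg / len(words)

    before = emotion_word_rates(["What a great week", "So happy today"])
    after = emotion_word_rates(["Feeling awful", "Everything is terrible lately"])
    print(before)  # roughly (28.6, 0.0)
    print(after)   # roughly (0.0, 33.3)
    # A shift like this, aggregated over hundreds of thousands of users,
    # is the "emotional contagion" signal the researchers reported.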



If that isn't bleak enough, we've reported previously that in a separate study, University of Michigan researchers found the very existence of feeds was making some users sadder.



The Internet is overwhelmingly outraged. "Even the Editor of Facebook's Mood Study Thought It Was Creepy," Adrienne LaFrance wrote at The Atlantic. If you're just catching up, here are a few reads to consider:



New Statesman: Facebook can manipulate your mood. It can affect whether you vote. When do we start to worry?



Laurie Penny argues that the study's findings are not the point; that Facebook did this at all is the point, and the potential for more is why the research feels so wrong.

"... I am not convinced that the Facebook team knows what it's doing. It does, however, know what it can do — what a platform with access to the personal information and intimate interactions of 1.25 billion users can do.



...



"What the company does now will influence how the corporate powers of the future understand and monetise human emotion. Dr Adam Kramer, the man behind the study and a longtime member of the company's research team, commented in an excited Q & A that 'Facebook data constitutes the largest field study in the history of the world.' The ethics of this situation have yet to be unpacked."





Forbes: Facebook Doesn't Understand The Fuss About Its Emotion Study



Kashmir Hill has been covering this story aggressively for Forbes and got a response from Facebook stating that the research was conducted for a single week and that none of the data used was associated with a specific user's account. The Facebook response continues:

"We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people's data in connection with these research initiatives and all data is stored securely."





Meanwhile, over on Hacker News, there's a lively debate over whether the outrage is overblown. You can check out the thread, whose premise is a thought from the venture capitalist and early Internet pioneer Marc Andreessen.



And finally, our pop culture writer Linda Holmes weighed in this morning with her piece, "Lab Rats, One And All: That Unsettling Facebook Experiment." She closes with a practical suggestion for Facebook:

"If this kind of experimentation is really OK, if it's really something they believe is within their everyday operations and their existing consent, all they have to do is clarify it. Give people a chance to say yes or no to research that is psychological or sociological in nature that involves not the anonymized use of their data after the fact but the placing of users in control and experimental groups. Just get 'em to say yes or no. If it's really not a big deal, they'll say yes, right? It really seems like a pretty reasonable request."









