Facebook promises to manipulate you ‘differently’ in future research projects

(Image: File / Facebook)

Facebook is promising to manipulate users “differently,” in response to backlash it received for conducting a controversial experiment that intentionally toyed with users’ emotions.

Facebook announced new research guidelines Thursday, assuring users that any research “focused on studying particular groups or populations (such as people of a certain age) or if it relates to content that may be considered deeply personal (such as emotions) it will go through an enhanced review process before research can begin.”

By “enhanced review,” Facebook means a panel of senior researchers and members of legal and privacy teams, so you can rest easy knowing that the decision to research what is “deeply personal,” including what you self-censor and decide not to post, will be made by Oz himself and not a flying monkey.

Facebook also promised to include research training for employees, as well as to publish its research online.

It is actually amusing that Facebook—the world’s largest social network, an advertising Goliath, a technological revolution and a bottomless well of data on human behavior—chose to volunteer that it had manipulated the news feeds of 689,003 users only to find out that good news makes people happy and bad news makes people sad. Talk about proving the obvious.

Facebook insists it is not required to seek your consent when using your data because users have already opted in simply by logging on.

Since May 2012, Facebook’s fine print has included a section that explicitly gives Facebook permission to use your personal information for “internal operations,” including “data analysis,” “testing” and “research.” However, when the news feed research actually took place in January 2012, Facebook’s terms of service didn’t explicitly include the word “research,” though they did include “testing” and “data analysis.”

Facebook’s attempt at reassuring its users did not sit well with data privacy advocates. Jeff Chester, the executive director of the Center for Digital Democracy, told National Journal that users should be able to opt out of any experiments.

“Facebook routinely engages in research on its users to better perfect how it serves its advertisers,” he said. “All of this should be disclosed and a user should decide whether they can be part of any research effort.”

Facebook kowtowing to privacy concerns and user consent advocates is highly unlikely. Abandoning the social network might be an overreaction, but users should be aware that everything they do on Facebook is recorded and observed. Facebook owns your data, and it controls the information you consume through it.

Consumer beware. Because big data is consuming you.
