Farhad Manjoo of the New York Times reports that Facebook has disclosed that it:
... tinkered with about 700,000 users’ news feeds as part of a psychology experiment conducted in 2012 …[this] inadvertently laid bare what too few tech firms acknowledge: that they possess vast powers to closely monitor, test and even shape our behavior, often while we’re in the dark about their capabilities.
The experiment has produced the astonishing discovery that:
... showing people slightly happier messages in their feeds caused them to post happier updates, and sadder messages prompted sadder updates.
Can you believe it?
It seems some people can’t. Believe, that is, that Facebook – Facebook – would do such a thing. Aren’t we all friends here?
Apparently not. The news has:
ignited a torrent of outrage from people who found it creepy that Facebook would play with unsuspecting users’ emotions. Because the study was conducted in partnership with academic researchers, it also appeared to violate long-held rules protecting people from becoming test subjects without providing informed consent.
Facebook users should spread the word – on Facebook, perhaps – that people with power over the masses will use it, whether the masses like it or not, or know it or not. Consider the recent behavior of the IRS. And the NSA.
It is part of our flawed nature. And it is one of the arguments against giving people too much power, surrendering too much of your individual sovereignty, or trusting those who operate in secret to behave honorably. It should be noted that when people who have abused their power over unsuspecting masses are exposed, their first response is not shame at getting caught but indignation and a quick shift over to the attack. Though some, as an expedient, take the Fifth. They have rights.
Facebook’s celebrity COO, Sheryl Sandberg, as noted by Rebecca Hiscott at the Huffington Post:
...said Wednesday she was sorry a controversial study that manipulated nearly 700,000 people’s Facebook accounts was “poorly communicated.”
But she stopped short of actually apologizing for the study itself.
“This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated. And for that communication we apologize. We never meant to upset you.”
No. They count on your docility.
The Facebook herd needs to know: You are no friend. You are material.
Meanwhile, Patrick Tucker at Defense One has a story that takes this thing where it was sure to go. Its headline:
The Military Is Already Using Facebook to Track Your Mood
Franz Kafka would like to friend you on Facebook.