Monday, June 30, 2014

McLuhan and Facebook

There is an arrogance of youth that from time to time seems to get out of hand. We have all suffered from it and most likely will again. Sometimes that arrogance is well placed, as in movements against discrimination or against wars which make no sense. At other times it is less well placed. One of my favorite gripes regards peer grading systems, which are less peer grading than exposés of that very youthful arrogance gone wild. In much of the younger generation there is a feeling that any opinion, especially that of the youth, is as valid as any other. If one feels one is right, then that is all that is necessary. Then they apply that attitude in a manner which drives facts and any semblance of truth into early graves.

Now arises a terrifying experiment. As the Guardian reports:

But now Facebook, the world's biggest social networking site, is facing a storm of protest after it revealed it had discovered how to make users feel happier or sadder with a few computer key strokes.
It has published details of a vast experiment in which it manipulated information posted on 689,000 users' home pages and found it could make people feel more positive or negative through a process of "emotional contagion". In a study with academics from Cornell and the University of California, Facebook filtered users' news feeds – the flow of comments, videos, pictures and web links posted by other people in their social network. One test reduced users' exposure to their friends' "positive emotional content", resulting in fewer positive posts of their own. Another test reduced exposure to "negative emotional content" and the opposite happened. The study concluded: "Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks."

That is, the researchers managed to control what people saw and tuned it based on their known profiles. As is often the case, I am reminded of the McLuhan quote from Drucker:

"Did I hear you right," asked one of the professors in the audience, "that you think that printing influenced the courses that the university taught and the role of university all together." "No sir," said McLuhan, "it did not influence; printing determined both, indeed printing determined what henceforth was going to be considered knowledge."

That is, the medium, in this case a manipulated Facebook, becomes "knowledge" and "truth". Namely, we see that any "social media" site can become a truth-bending site. One does not have to go anywhere; one gets the distorted truth created for one by the benevolent entity one somehow believes is altruistic. In fact they are "truth benders".

The Guardian goes on further:

Researchers have roundly condemned Facebook's experiment in which it manipulated nearly 700,000 users' news feeds to see whether it would affect their emotions, saying it breaches ethical guidelines for "informed consent". James Grimmelmann, professor of law at the University of Maryland, points in an extensive blog post that "Facebook didn't give users informed consent" to allow them to decide whether to take part in the study, under US human subjects research. "The study harmed participants," because it changed their mood, Grimmelmann comments, adding "This is bad, even for Facebook." ...But the study has come in for severe criticism because unlike the advertising that Facebook shows - which arguably aims to alter peoples' behaviour by making them buy products or services from those advertisers - the changes to the news feeds were made without users' knowledge or explicit consent.

Namely, experiments like this in the social sciences need informed consent. In this case they simply manipulated nearly 700,000 people. One can just imagine the long-term consequences. How is this for a "peer review"? The interesting arrogance of youth is that these folks even published their efforts. They were joined by academics who frankly should have known better, but one guesses did not.