The media have refrained from reporting Wood’s comments. This is a good thing. The joke assumes the guilt of the person accused of April Jones’ murder, so reporting it would prejudice a trial. Media restraint also minimises any distress to April’s family, and denies the attention-seeker further opportunities to provoke.
However… The only reason Wood has received any attention in the first place is that he was hauled before a magistrate! Had he not been arrested and charged, the comment would have been lost in the obscurity of his Facebook timeline after a couple of days. The comment obviously violates Facebook's Terms & Conditions, so he might have been banned from using the site. We might describe that as a contractual matter, not a criminal one. And he might have lost a lot of friends (both in the real sense and the Facebook sense). But that is a social sanction, not a criminal one.
Reading this article about the genesis and project management of Google+, a new social network, reminded me of the Through A Web Darkly event I attended at Demos last month. They’ve uploaded a helpful video outlining the main theme of the event – the idea that the ‘personalisation’ of the web might be a problem.
It's interesting that, as we move into an era where all the HTML code on our websites has been crafted for us by the social networking companies, we are nevertheless still the creators, or maybe the curators, of our online world. As Tom Chatfield put it (paraphrasing Alexis Madrigal), "Twitter is a human recommendation engine of which I am the algorithm." The same is true of Facebook, of course, which prioritises those people whose content you most frequently 'like'. It is also true of Google, which is starting to take your location and your past browsing history into account when delivering search results. The danger with this, well documented with respect to Twitter, is that opinions that differ from your own are eventually weeded out of your personalised stream of information. Mistaken or ill-thought-out beliefs are affirmed rather than challenged, and our knowledge is weaker as a result. On a macro level, our democracies can become more polarised, with less consensus and a smaller space for compromise.
Once we are aware of this phenomenon, we can of course guard against it 'manually', by following people we disagree with, deliberately mixing up our RSS feeds, and otherwise introducing disruptions into the stream. There are two problems with this approach. The first is that by confusing or confounding the machines at Google and Facebook (to ensure that they serve you more diverse content) you are actually breaking their business model, because they can no longer target relevant adverts at you. If everyone did this, then advertisers would find other places to spend their pounds and dollars, and the social internet services we rely upon might disappear. This is not necessarily our concern, and many people argue that essential web tools should not be provided by corporate bodies at all.
The second problem is that not everyone will introduce these disruptions into their stream. So while I may be reading all manner of different people with different views, they may not be reading me (or people like me) in return!
The worry, therefore, is that the liberating and equalising effects of the internet may begin to fizzle out. So far, we have been trumpeting the fact that anyone can become a global publisher with just a few keystrokes and clicks of a mouse. Until recently, once a website had been published, its author could reasonably expect that the site would have an equal chance of appearing when a person searched for that subject matter on Google or other search engines. In the near future, this is unlikely to be so.
My final thought: I wonder what moral obligations Facebook and its peers have to me not to filter what I publish on the web… Is there a free speech issue at stake here?