It is often said that constraints can fuel creativity. Well, the COVID-19 lockdown is a pretty big constraint. Amid the sadness and death, it has been interesting to see the new art and culture that is already emerging. Creativity working up against the boundaries we have set for ourselves. Artists looking afresh at the technology we are using to communicate, and wondering what new modes of design and storytelling they might enable. The most obvious example of this is video conferencing software. The grids of images that apps like Zoom use to display the other people in the chat have become part of our visual culture. I really enjoyed the Maltesers ‘Isolation Life’ series of adverts, and I love the video for ‘Phenom’ by Thao & The Get Down Stay Down (intriguing song, too).
I’ve been busy recently—work, study, Christmas—and haven’t felt a huge urge to write anything here. So let’s round off the year with some old-fashioned web-logging: merely bookmarking a story on a subject that feels emblematic of the entire decade.
A discussion about a pre-publication research paper with some shoddy methodology leads me to a New York Times article by Kevin Roose, published in June this year, chronicling one young man’s journey into alt-right radicalisation. A key insight:
The radicalization of young men is driven by a complex stew of emotional, economic and political elements, many having nothing to do with social media. But critics and independent researchers say YouTube has inadvertently created a dangerous on-ramp to extremism by combining two things: a business model that rewards provocative videos with exposure and advertising dollars, and an algorithm that guides users down personalized paths meant to keep them glued to their screens.
The impact of algorithms on our psyche and society has become very apparent in the last few years. The power to determine who sees what and when can change moods and swing elections. But discussing the issue this week, Roose and other data journalists present some important caveats. The algorithm isn’t everything.
Same goes for journalism. The incredible reporting from @kevinroose was so powerful because the dataset he analyzed showed what *content* Caleb watched over time. It could have been recommended in the algorithm, but not necessarily.
The only thing I’d add, coming at the issue (as I do) with an eye on freedom of expression concerns, is that the way the algorithms affect our interests is not in itself a bad thing. We’ve all fallen down algorithm-induced ‘YouTube rabbit holes’ in our time, and when the subject is not political, the way that the system steers users away from mainstream content and into the back-catalogue can yield delightful results. Last night, for example, I watched a load of astonishing videos of ballet performances. I know nothing about ballet and cannot now remember how I happened upon them (perhaps I clicked on a link on someone’s blog?) but it’s possible this could be the start of a deep and consuming interest, the kind we would usually applaud.

Even political ‘radicalisation’ is not necessarily a bad thing. I imagine that ‘radicalising’ people to fight for racial or gender equality (say), or to become environmental activists, is actually desirable. The issue, as ever, is not with ‘radicalisation’ per se but with ‘violent radicalisation’ or (as the Commission for Countering Extremism recently suggested) with ‘hateful extremism.’ Algorithms that serve us relevant content are useful tools for many that can be misused by a few. Or, as Kevin Roose and Becca Lewis point out above, algorithms don’t radicalise people; people radicalise people.

That is not to say that we shouldn’t intervene to temper the algorithms. Just that the challenge for tech companies and governments is not one of banning, but of balance. This will be the task of the next decade. Let us hope that by 2030 we will have reached a fair settlement.