When I was twelve, I already lived in the world of Web 2.0 – of user-generated content. I used basic social networks, video sharing websites, and spent time on online forums meeting other people who liked hacking Club Penguin. Later, I joined Facebook. By then, the social network had been growing for some time. But it was not yet the all-purpose platform it would become: one built to ‘connect’ everyone, collecting copious quantities of personal data to process through obscure algorithms, ostensibly for users’ benefit, with the objective of securing advertising revenue by nurturing compulsive habits. Like most people, I gravitated towards these big social networks. They did and do consume most of the time and energy I spend online.
Over the past decade I have seen the Internet’s potential as a vehicle for meaningful interaction diminish, as digital networks were reworked by developers advancing commercial interests. Instead of fostering specialised and integrated communities, the most used social networks developed highly individuated user profiles and a feedback mechanism which pushed people into uncomfortably close contact with their constellation of self-representing acquaintances. These features inevitably rendered digital sociality a theatre of vanity and cynicism, and in turn nurtured an industry that produces endless, banal streams of third-party content. As these platforms grew, popular attention focussed on echo-chambers and orchestrated data scouring for targeted political campaigning. I mourn for the time wasted over the last few years scrolling through feeds of misunderstanding strangers, dubious politics, advertising, and click-bait. Dopamine on tap. How to render digital media fit for a desirable social purpose can perhaps only be discovered by the nascent field of Internet studies, should a solution arise which can viably be implemented. Until then, we depend on social media, and are stuck in a purgatory of our own making.