What is the attention economy? And why is it such a problem?
New article written for RTE Brainstorm
In March, the US House of Representatives passed a bill that could see TikTok banned in America. Its Chinese parent company, ByteDance, may be forced to sell the platform over concerns about national security and the app’s addictive design. In the EU, a recent spate of legislation (the DSA, DMA, and AI Act) grapples with the advance of social media, and research has made it clear that these platforms play a significant role in the youth mental health crisis. In 2024, we seem to be reaching a critical juncture for the future of the attention economy and of society. But what is the attention economy? And why is it such a problem?
The term “attention economy” describes the business model of companies that offer their goods or services for free and profit by selling users’ attention, and the data extracted from it, to advertisers and other third parties. Traditionally, this was the business model of television, radio, and newspapers. On the internet, however, it has become the default for large social media companies like Facebook (Meta), YouTube, LinkedIn, Twitter (X.com), and TikTok.
The critical difference between the traditional and internet models is the real-time collection of user data. Online, the user’s behaviour, and by extension their thoughts and preferences, is tracked and recorded. This data is then sold to advertisers and third parties, who pay for the increased opportunity to modify the user’s behaviour to suit their own ends. In the internet model, the user is not the actual customer but the product. As Justin Rosenstein, a former engineer at Google and Facebook, put it: “If you are not paying for the product, you are the product.”
The companies profit from advertisers and third parties, and to do so they innovate in attention-capture, or addictive design, techniques. Silicon Valley guru Nir Eyal notes in Hooked (2014) that “Companies increasingly find that their economic value is a function of the strength of the habits they create…” and that the “brass ring” of these design techniques is to create internal triggers to use the product to reduce stress, “so that the user identifies the company’s product or service as the source of relief”. The architecture of attention economy platforms is optimised to capture as much attention as possible, for as long as possible, without regard for people’s well-being.
These companies succeed by creating habits, and habits that we don’t want are called addictions. One reason companies have gotten away with this kind of manipulation is that addiction was traditionally viewed through the lens of “substance” addiction, e.g. alcohol or drugs. However, with the advent of digital technologies, particularly online gambling, we have seen the rise of “behavioural addictions” - addiction to a behaviour or to the feeling a behaviour produces. The latter is the case in social media, where addictive design techniques are directly imported from gambling technologies.
In Addiction by Design (2012), Natasha Dow Schüll shows how gambling companies use algorithms to calculate how much a player can lose without quitting; when the algorithm senses that threshold approaching, it offers rewards, such as vouchers or meal coupons, to keep the person gambling. On social media platforms, the newsfeed is populated by a content-curation algorithm that gathers data about the user’s behaviour over time and uses it to predict, and serve, whatever will keep them scrolling for as long as possible (a simplified sketch of this kind of engagement-driven ranking appears after the list below). For example, if serving more and more eating disorder content will keep a young girl on Instagram, that is what the algorithm will do. If negative mood predicts increased social media use, then downregulating the user’s mood becomes the norm for the algorithm. Other addictive design techniques include:
The infinite scroll: removes friction by endlessly loading new content instead of asking the user to click through to the next page, fostering a state of “dissociation” or “doomscrolling”.
Intermittent variable rewards: notifications, messages, and novel stimuli trigger the brain’s reward system, similar to the anticipation of gambling or eating food, and reinforce “checking habits”.
Manipulating social validation: features like “likes” and “shares” offer social rewards or punishments (social ostracisation and loss of status) according to the platform’s game rules, taking advantage of powerful human cognitive biases like fear of missing out, social comparison, and desire for status and recognition.
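To make the content-curation mechanism described above concrete, here is a minimal, purely illustrative sketch of an engagement-first ranking loop. Every name in it (predict_engagement, build_feed, the toy scoring rule) is a hypothetical stand-in rather than any platform’s actual code; real recommender systems are vastly more complex, but the incentive is the same: rank whatever the model predicts will hold the user’s attention longest.

```python
# Illustrative sketch only: a toy "engagement-first" feed ranker.
# All names and the scoring rule are hypothetical, not any platform's real code.

from dataclasses import dataclass


@dataclass
class Post:
    post_id: str
    topic: str


def predict_engagement(user_history: list[str], post: Post) -> float:
    """Toy stand-in for a learned model: estimate how long this user is
    likely to keep scrolling if shown this post, based on past behaviour."""
    # The more often a topic appears in the user's history, the higher the
    # predicted engagement with similar content.
    return user_history.count(post.topic) + 1.0


def build_feed(user_history: list[str], candidate_posts: list[Post]) -> list[Post]:
    """Rank candidates purely by predicted engagement.
    Note what is absent from this objective: well-being, accuracy, balance."""
    return sorted(
        candidate_posts,
        key=lambda post: predict_engagement(user_history, post),
        reverse=True,
    )


# A user who has lingered on dieting content is served more of it, simply
# because the model predicts it will hold their attention.
history = ["dieting", "dieting", "holidays"]
posts = [Post("1", "holidays"), Post("2", "dieting"), Post("3", "news")]
print([p.topic for p in build_feed(history, posts)])  # ['dieting', 'holidays', 'news']
```

The point of the sketch is the objective function, not the code: nothing in the ranking criterion asks whether the content is good for the user, only whether it will keep them scrolling.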
As you can see, there is a fundamental conflict between the goals of users and the goals of companies in the attention economy - “No one wakes up in the morning and asks: how much time can I possibly spend using social media today?” The attention economy might sound like an episode of Black Mirror, but it is really a much more familiar case of corporations transgressing against users and society to succeed in a market with bad incentives, albeit with fancier technology.
Innovations like generative and interactive AI will pour petrol on the existing fires and create entirely new problems that we cannot yet anticipate, so the time to act is now. Tristan Harris, co-founder of the Center for Humane Technology and the central figure in the documentary The Social Dilemma, argues that these companies cannot meaningfully change themselves to protect users because of the incentives of their business model. Therefore, governments need to protect users and democracy through regulation and education:
Break the unholy alliance between behavioural data collection and the attention economy business model.
Foster media literacy in the general public.
Ban addictive design techniques at the system and design level to protect users online.