30/05/2019 2:58 PM IST | Updated 30/05/2019 2:59 PM IST

Personalized Entertainment And Polarized Elections: Brought To You by AI

Hyper personalization through algorithms can lead to a more polarized world, if we don’t take the time to think about this effect.


Have you noticed how there’s always new content to watch on the internet that really speaks to you? How it almost feels tailor-made for your consumption? What about news articles which convince you that the political party of your choice is doing fantastic work while the others are train-wrecks? This phenomenon of hyper-personalization of entertainment and politics is driven to a very large extent by a class of Artificial Intelligence algorithms known as recommendation systems.

The algorithm

As the name suggests, recommendation systems have a single objective — to make a few specific, personalized recommendations out of a large number of choices and keep individuals glued to their devices. The algorithms simultaneously find items that are similar to each other, and individuals who are similar to each other. This information is then used to recommend new items to each individual. So, for example, you might get a recommendation because it has similar characteristics to something you already like — or because someone else, who largely enjoys the same things you do, liked this new thing.


A concrete example from everyday life: a streaming platform (like Netflix) contains viewing data of different users on different films. Consider two users on the platform, "Reshma" and "Shruti". The data shows that they tend to watch a lot of the same films, particularly films with a strong female lead. The data also shows that Reshma has watched a film in this category which Shruti hasn't: "Raazi". A data-driven recommendation for Shruti is ready: watch "Raazi".

The algorithm is essentially coded common sense: it behaves like a virtual friend who says to Shruti, "Reshma and you are so similar, she loved this movie and I'm sure you will too". What makes recommendation systems particularly effective is that instead of a single common friend, they use data from millions of users who are similar to each other. Today these algorithms are used on almost all digital platforms, from product recommendations on Amazon and content recommendations on Netflix and YouTube to social networks like Facebook and Twitter.
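For the curious, the "virtual friend" logic above can be sketched in a few lines of code. This is a toy illustration of user-based collaborative filtering, not the actual algorithm any platform uses; the ratings and film titles (other than "Raazi") are invented for the example.

```python
# Toy user-based collaborative filtering: find similar viewers,
# then recommend films they liked that you haven't seen yet.
# All ratings below are invented for illustration.
ratings = {
    "Reshma": {"Raazi": 5, "Queen": 4, "Kahaani": 5},
    "Shruti": {"Queen": 5, "Kahaani": 4},
    "Arjun":  {"Golmaal": 4, "Housefull": 5},
}

def similarity(a, b):
    """Cosine similarity between two users' rating dictionaries."""
    common = set(a) & set(b)          # films both users have rated
    if not common:
        return 0.0
    dot = sum(a[f] * b[f] for f in common)
    norm_a = sum(v * v for v in a.values()) ** 0.5
    norm_b = sum(v * v for v in b.values()) ** 0.5
    return dot / (norm_a * norm_b)

def recommend(user, ratings):
    """Rank unseen films, weighted by how similar each other viewer is."""
    seen = ratings[user]
    scores = {}
    for other, their_ratings in ratings.items():
        if other == user:
            continue
        sim = similarity(seen, their_ratings)
        if sim <= 0:                  # ignore viewers with no overlap
            continue
        for film, rating in their_ratings.items():
            if film not in seen:
                scores[film] = scores.get(film, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("Shruti", ratings))   # → ['Raazi']
```

Because Shruti's tastes overlap heavily with Reshma's and not at all with Arjun's, "Raazi" is the only recommendation — exactly the common-sense reasoning described above, just done at the scale of millions of users.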

The attention economy

It’s easy to see why these algorithms have taken over. As entertainment moves online and the choice of content becomes seemingly infinite, companies vie with each other for every millisecond of our attention. The best algorithms lead to the most personalized content and the highest user engagement. All of this translates to higher advertisement revenues for the company. This seems like a win-win situation! Better content for users and more money for the platforms. But scratch below the surface and you’ll find some dark side effects of this attention economy.


The most immediate danger of constant engagement with entertainment on our devices is psychological. What digital media companies strive for is closer to addiction than engagement. Psychologists now even have a term for this form of addiction among Facebook users: Facebook Addiction Disorder or FAD (a rather apt acronym). Like other addictions, digital media addiction is characterised by dependency, withdrawal symptoms and reclusive behaviour.  

The other danger is political. Economists classify some industries as natural monopolies. These industries typically require sophisticated network infrastructure and also have economies of scale, both of which restrict competition. While traditional examples like railways, water supply and electricity are limited by geographical constraints, the digital era brings with it a new set of global natural monopolies like Facebook, Netflix and YouTube. The steady influx of more engaged users onto these platforms makes their algorithms ever more efficient, giving these companies unprecedented power over vast populations. These companies can potentially push favourable content on behalf of political lobbies, sell user data to political organizations or become susceptible to politically motivated hacking.    

Polarized politics and fake news

Which brings us to the buzzword in global politics: polarization. Many commentators have spoken about the perils of polarization in electoral democracy, while others have eloquently defended it. However, most have overlooked the critical role of algorithms in this process. Just as in entertainment, the internet has provided a voice to an ever-growing number of political publications, websites and blogs. This sudden excess of political content is shared on social media via recommendation systems.

But there’s a catch. Recommendation algorithms are likely to engage liberals using opinion pieces that demonize conservative governments. Conservatives in turn are likely to be bombarded with articles extolling the virtues of the very same governments. Personalized recommendations thus lead to the strengthening of individual political biases, an effect known to psychologists as confirmation bias. And since more engaged users lead to higher advertising revenues for social media platforms, they have little incentive to prevent such polarization. This leads to the creation of political echo chambers where more extreme views hold sway, while the space for centrists shrinks.


News is another casualty of algorithm-driven social media. Algorithms do not care about whether a purported news article is verifiably true. All that matters is that it keeps users engaged. A drab news article on climate change may receive less traction than a sensational piece by climate change deniers. In general, more sensational pieces are more likely to be recommended frequently, creating an ecosystem where fake news thrives. And once you've watched, for example, a conspiracy theory video on YouTube, expect to see such videos take over your feed, because the algorithm thinks this is what you're interested in.

Regaining control

Recommendation systems are the new reality and efforts are already underway to contain their side effects. Some devices now come with in-built daily usage limits for users to self-regulate their dependence. Computer scientists are tweaking algorithms so that they can provide personalization without polarization. Policy makers are grappling with the nuanced question of regulating internet content to prevent the spread of fake news.

The fact that these algorithms reward engaging content means that experts and academics have to radically change how their work is communicated to the public. For too long expert writing has been dry, inaccessible and tedious. To remain relevant in the algorithmic age, experts will need to invest in creative ways to grab public attention. If this does transpire, it will be an unlikely gain from the attention economy. If not, then we can be sure that the era of experts will hurtle to a screeching halt.

Political analysts will have to account for the immense power of technology combined with big money. Along with traditional ground reporting, each victory or loss has to be contextualized within the framework of online content, political advertisements and social media reach. And political parties that keep step with this new reality will dominate electoral democracies globally.

For the average person, just being consciously aware that algorithms steer our digital consumption is an effective way to regain some control. Each time we find ourselves hooked on an app, we should know the algorithm is trying to get us addicted. Each time we find our political opinions going unchallenged, we need to check how the other side thinks. We've heard that ignorance is bliss and that knowledge is power. In the age of personalized algorithms, wisdom is knowing that we're each vastly ignorant in our very own special way. No matter how hard our devices try to convince us otherwise.