Filter bubbles: a threat to online democracy?

| Rense Kuipers

Who determines what you see online? You? Or are you merely a puppet in the hands of internet giants such as Google and Facebook? With the help of clever algorithms, these organisations know exactly what you are looking for.

Photo by: Gijs van Ouwerkerk

As a matter of fact, they are more than happy to put you in a ‘filter bubble’, your own world of information designed to reaffirm your world views. UT researchers Djoerd Hiemstra and Jan van Dijk shed light on this phenomenon.

'Did the Holocaust happen?'

For a long time, Google’s top search results cast doubt on whether the Holocaust actually happened. Likewise, the first four results of a search for connections between vaccination and autism contained only websites that support this disproven theory. Computer scientist Djoerd Hiemstra finds this a disturbing phenomenon, as he does the fact that Google automatically completes queries. ‘Perhaps this is mostly because these are businesses out to earn a profit. People pay large sums of money for ad campaigns and Google rewards them for it.’

In the past, when someone searched for ‘Did the Holocaust happen’, Google would show an article from the Neo-Nazi blog Stormfront that listed ten reasons why the Holocaust never happened. Because Google itself took no action against this, Guardian journalist Carole Cadwalladr took matters into her own hands late last year. In her own words, she used ‘the only language that Google understands: money.’ By paying a large sum for her own Google AdWords campaign, she managed to elevate the Wikipedia page on the Holocaust to the top of the search results, instead of the Stormfront blog.

Hardly any responsibility

‘Is there nothing else Google can do?’ Hiemstra wonders aloud. ‘The company hardly accepts any responsibility and does not take a moral stand.’ He does believe that clever search algorithms can be useful, though. ‘When you have hundreds of friends on Facebook, you are probably more interested in your close friends’ status updates than those from a former classmate you knew twenty years ago. What is going wrong, however, is that people are being put into a bubble because they are commercially appealing.’

Suddenly, a Facebook timeline takes on an entirely different meaning. Professor and media sociologist Jan van Dijk agrees: ‘If you are an Ajax fan, you will mostly see content about Ajax. The social medium only reinforces the things you surround yourself with and where your interests lie.’ He does not believe that to be a good thing. ‘It only serves to exacerbate social polarisation.’

These are not absolute bubbles, Van Dijk states. ‘Just look at the American presidential election; you cannot avoid also seeing content from the other party. It is true, though, that certain opinions that fit well within your world view are reinforced. That is known as ‘confirmation bias’ and it is something we have been doing since the dawn of time. You automatically look for whatever suits you most.’


Hiemstra and Van Dijk are also critical of the lack of transparency from major organisations such as Facebook. ‘They are secretive,’ Van Dijk says. ‘Their algorithms are corporate secrets and form key aspects of their earning model. While they know everything about their users, users know hardly anything about Facebook.’ Hiemstra notes that both Facebook and Google are constantly changing their algorithms. ‘This is an act of deliberate secrecy designed to keep people from manipulating the system and protect their revenue streams.’

The corporations are not entirely to blame, though, Van Dijk feels. We should also question our own actions. That is why he brings up the term ‘echo chambers’. ‘The difference with filter bubbles is that we create echo chambers ourselves by moving in the same circles and repeating the same opinions over and over again. It is rare for someone to deliberately look for information that does not confirm their views.’ Van Dijk: ‘People should know how to distinguish between facts and fiction online, but is it fair to expect that of them? There has to be a way to tell the two apart. Considering the enormous quantity of information we have to process these days, I am beginning to think that we are simply saturated.’

Federated search engines

To help users see the forest for the trees in a landscape shaped by commercial interests, Hiemstra researches the possibilities of ‘federated search engines.’ These show not only Google’s search results, but also those from other search engines. ‘You can be sure to receive relevant search results from several sources, which helps to dramatically weaken the monopoly position of an organisation such as Google.’
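The core idea of such a federated engine can be sketched in a few lines: each source returns its own ranked list, and a merger interleaves them so that no single engine’s ranking dominates what the user sees. The sketch below is purely illustrative and is not Hiemstra’s actual system; the engine names and result URLs are made up for the example.

```python
def merge_results(ranked_lists):
    """Interleave ranked result lists round-robin, skipping duplicates.

    Each input list is the ranked output of one (hypothetical) search
    engine; the merged list alternates between sources rank by rank,
    so every engine contributes to the top of the combined results.
    """
    merged, seen = [], set()
    for rank in range(max(len(lst) for lst in ranked_lists)):
        for lst in ranked_lists:
            if rank < len(lst) and lst[rank] not in seen:
                seen.add(lst[rank])
                merged.append(lst[rank])
    return merged

# Hypothetical top results from three different engines:
engine_a = ["wikipedia.org/A", "news.example/B", "blog.example/C"]
engine_b = ["news.example/B", "forum.example/D"]
engine_c = ["wikipedia.org/A", "archive.example/E"]

print(merge_results([engine_a, engine_b, engine_c]))
```

Because a page that two engines agree on appears only once, and every source gets a turn near the top, one company’s commercial ranking choices carry less weight in the final list.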

Returning to the core principles of the internet might also be a good idea, Van Dijk and Hiemstra believe. Hiemstra: ‘Think of internet fora for example – these are commonly overlooked today. People flock to communities where they feel safe and heard. At least they are free to choose their own community.’ Van Dijk: ‘Although there is no shortage of discussion on social media, there are no leaders involved in any of it. Everyone talks over each other. If you were to use moderators, these discussions might actually lead to something.’

Although both scientists view the filter bubble as a problem, Van Dijk believes there will always be people who do their best to discover the other side of a story. ‘Filter bubbles are not the end of online democracy,’ he says. ‘Although it is high time that we take a step in the right direction.’ 


You can also find this article in the latest issue of our Science & Technology magazine.