Filter Bubble

[Image: 'Echo Chamber' cartoon by Hugh MacLeod]

A filter bubble is a concept developed by Internet activist Eli Pariser in his book of the same name to describe a phenomenon in which websites use algorithms to selectively guess what information a user would like to see, based on information about the user such as location, past click behavior, and search history. As a result, websites tend to show only information which agrees with the user’s past viewpoint. Prime examples are Google’s personalized search results and Facebook’s personalized news stream. According to Pariser, users get less exposure to conflicting viewpoints and are isolated intellectually in their own informational bubble.
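
To make the mechanism concrete, here is a minimal sketch of the kind of personalizing ranker Pariser describes. Everything in it is an illustrative assumption, not any real site's algorithm: the topic labels, the click-count scoring, and the sample stories are invented, and production systems use vastly richer signals.

```python
from collections import Counter

def personalized_ranking(candidates, click_history):
    """Toy ranker: order candidate items by overlap with past clicks.

    The more an item's topics match what the user clicked before, the
    higher it ranks -- so the feed drifts toward viewpoints the user
    already holds, the 'bubble' effect in miniature.
    """
    # Crude interest profile: how often each topic appeared in past clicks.
    profile = Counter(t for item in click_history for t in item["topics"])
    return sorted(
        candidates,
        key=lambda item: sum(profile[t] for t in item["topics"]),
        reverse=True,
    )

# Two users, the same candidate stories, different histories ->
# differently ordered results for the same query.
finance_reader = [{"topics": ["markets", "energy"]}]
activist_reader = [{"topics": ["environment"]}]
stories = [
    {"title": "Company earnings beat forecasts", "topics": ["markets", "energy"]},
    {"title": "Cleanup continues after offshore spill", "topics": ["environment"]},
]
print([s["title"] for s in personalized_ranking(stories, finance_reader)])
print([s["title"] for s in personalized_ranking(stories, activist_reader)])
```

Even in this toy model, neither user is shown anything false; each is simply shown the agreeable item first, which is precisely the subtlety of the effect.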

Pariser related an example in which one user searched Google for ‘BP’ and got investment news about British Petroleum, while another searcher got information about the Deepwater Horizon oil spill; the two search results pages, he noted, were ‘strikingly different.’ The bubble effect may have negative implications for civic discourse, according to Pariser, but there are contrasting views suggesting the effect is minimal.

Pariser defined his concept of a filter bubble in more formal terms as ‘that personal ecosystem of information that’s been catered by these algorithms.’ Other terms have been used to describe the phenomenon, including ‘ideological frames’ and a ‘figurative sphere surrounding you as you search the Internet.’ The past search history is built up over time as an Internet user indicates interest in topics by ‘clicking links, viewing friends, putting movies in your queue, reading news stories’ and so forth. An Internet firm then uses this information to target advertising to the user or to rank certain kinds of content more prominently in search results.
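
The accumulation side of that loop can be sketched the same way. Again, this is a hypothetical model: the event types, signal weights, and ad categories below are assumptions made up for the example, not a description of any actual firm's system.

```python
from collections import defaultdict

# Hypothetical weights for the kinds of signals mentioned above; a real
# system would use far richer features and learned weights.
SIGNAL_WEIGHTS = {"clicked_link": 1.0, "viewed_friend": 0.5,
                  "queued_movie": 2.0, "read_story": 1.5}

def update_profile(profile, event_type, topic):
    """Fold one interaction into the user's long-term interest profile."""
    profile[topic] += SIGNAL_WEIGHTS.get(event_type, 0.0)
    return profile

def pick_ad(profile, ads_by_topic):
    """Target the ad whose topic the profile weights most heavily."""
    best_topic = max(profile, key=profile.get)
    return ads_by_topic.get(best_topic)

profile = defaultdict(float)
update_profile(profile, "read_story", "travel")
update_profile(profile, "queued_movie", "travel")
update_profile(profile, "clicked_link", "cooking")
print(pick_ad(profile, {"travel": "Cheap flights!", "cooking": "Knife sale!"}))
# -> "Cheap flights!" (travel: 3.5 vs. cooking: 1.0)
```

The point of the sketch is that the profile never resets: every interaction nudges future selections further toward what the user already engaged with.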

Pariser’s concern is somewhat similar to one raised by Tim Berners-Lee (inventor of the World Wide Web) in a 2010 report in ‘The Guardian,’ along the lines of a ‘Hotel California effect,’ which occurs when social networking sites wall off content from competing sites, in order to grab a greater share of all users, such that the ‘more you enter, the more you become locked in’ to the information within a specific site. Each site becomes a ‘closed silo of content,’ with the risk of fragmenting the World Wide Web, according to Berners-Lee.

Pariser, in ‘The Filter Bubble,’ warns that a potential downside to filtered searching is that it ‘closes us off to new ideas, subjects, and important information’ and ‘creates the impression that our narrow self-interest is all that exists.’ It is potentially harmful to both individuals and society, in his view. He criticized Google and Facebook for offering users ‘too much candy, and not enough carrots,’ and warned that ‘invisible algorithmic editing of the web’ may ‘limit our exposure to new information and narrow our outlook.’ According to Pariser, the detrimental effects of filter bubbles extend to society at large, in that they risk ‘undermining civic discourse’ and making people more vulnerable to ‘propaganda and manipulation.’

He wrote: ‘A world constructed from the familiar is a world in which there’s nothing to learn … (since there is) invisible autopropaganda, indoctrinating us with our own ideas.’

There are conflicting reports about the extent to which personalized filtering is happening and whether such activity is beneficial or harmful. Writing in ‘Slate,’ analyst Jacob Weisberg conducted a small, non-scientific experiment to test Pariser’s theory: five associates with different ideological backgrounds ran exactly the same search queries. The results of all five were nearly identical across four different searches, suggesting that a filter bubble was not in effect, and leading him to write that a situation in which all people are ‘feeding at the trough of a Daily Me’ was overblown.

A spokesperson for Google suggested that algorithms were deliberately added to Google’s search engine to ‘limit personalization and promote variety.’ Nevertheless, there are reports that Google and other sites hold vast stores of information which might enable them to personalize a user’s Internet experience further if they chose to do so. One account suggested that Google can keep track of a user’s past history even if that user does not have a personal Google account or is not logged in to one. Google has reportedly collected ’10 years worth’ of information, amassed from Gmail, Maps, and other services besides its search engine.

Analyst Doug Gross of CNN suggested that filtered searching seemed to be more helpful for consumers than for citizens: it would help a consumer looking for ‘pizza’ find local delivery options while appropriately filtering out distant pizza stores. There is agreement that sites such as the ‘Washington Post,’ ‘The New York Times,’ and others are pushing efforts towards creating personalized information engines, the principle being to tailor results to those which users are likely to like or agree with.

The filter bubble concept is similar to a phenomenon in which people and organizations seek information which is initially perceived as relevant but which turns out to be useless or in fact only partially useful, while avoiding information perceived as irrelevant but which turns out to be useful. The problem arises because the real relevance of a particular fact or concept in these cases becomes apparent only after that fact has become known. Before that, the idea of learning a particular fact may have been dismissed because of a misperception of irrelevance. Accordingly, the information seeker is trapped in a paradox, failing to learn what he or she really needs to know, and can be caught in a kind of intellectual blind spot. This phenomenon has been described as the ‘relevance paradox,’ and it has occurred in many situations throughout human intellectual development. The book ‘The IRG Solution’ predicted and analyzed this problem and suggested a generalized solution.
