Bursting the bubble

October 1, 2011

The Filter Bubble: What the Internet Is Hiding from You. Eli Pariser; The Penguin Press; 294 pp.; $30.00 cloth; ISBN 978-1-59420-300-8

“Personal reductionism has always been present in information systems,” writes Jaron Lanier in his 2010 jeremiad You Are Not a Gadget: A Manifesto. Lanier uses the example of personal income taxes to illustrate how governments reduce individuals to a prescribed set of statistics and identifiers for the purposes of evaluation. However, Lanier goes on to say that in this instance, it is generally understood that the totality of a person’s life is being “represented by a silly, phony set of database entries” that allow a particular transaction to occur.

The same is not true (at least, not necessarily) when we willingly reduce ourselves by creating a profile on a social networking site. In the latter case, Lanier writes, “digital reduction becomes a causal element, mediating contact between new friends.” Instead of remaining separate from our identity as individuals, the information we provide to social networking sites becomes a determining factor in whom we encounter online, due in large part to the digital algorithms that use our information to parse us into discrete categories based on interests, political affiliations, and other nodes of affinity and disaffinity.

It is this algorithmic parsing of human identity that worries Eli Pariser, a senior fellow at the Roosevelt Institute and board president of the progressive online group MoveOn.org. By monitoring our online behaviour, Pariser writes, and by following our “click signals” – the interests and associations suggested by the people we befriend on social media and the links we click when we use search engines – sites like Facebook and Google can begin to draw a picture of who we are (or, more precisely, who their digital algorithms think we are) and target advertising and information to us accordingly. Our Facebook news feed does not show us updates from all our friends, only those with whom we interact most frequently (or those whose updates receive the most “likes” from other users). Google tailors its search results to each individual based on variables such as geographic location and previous search history. Amazon and Netflix suggest books or movies we might enjoy based on the things we have indicated we enjoyed in the past.
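To see the principle at work, consider a toy sketch of the kind of click-signal profiling Pariser describes: every click nudges an inferred interest profile, and the profile then decides what surfaces next. The tags, items, and scoring rule below are invented for illustration; the actual ranking systems at Facebook and Google are proprietary and far more elaborate.

```python
from collections import Counter

# Toy illustration of "click signals": each click on a tagged item nudges the
# user's inferred interest profile, and that profile is then used to rank what
# the user sees next. Tags, items, and weights here are purely illustrative.

def update_profile(profile: Counter, clicked_item_tags: list) -> None:
    """Record a click: every tag on the clicked item counts as one signal."""
    profile.update(clicked_item_tags)

def rank_feed(profile: Counter, feed: list) -> list:
    """Order candidate items by how well their tags match the inferred profile."""
    return sorted(feed,
                  key=lambda item: sum(profile[tag] for tag in item["tags"]),
                  reverse=True)

profile = Counter()
update_profile(profile, ["progressive-politics", "climate"])  # two clicks on
update_profile(profile, ["progressive-politics", "media"])    # progressive links

feed = [
    {"title": "Conservative friend's update", "tags": ["conservative-politics"]},
    {"title": "Climate petition from a progressive friend",
     "tags": ["climate", "progressive-politics"]},
]

for item in rank_feed(profile, feed):
    print(item["title"])
# The petition floats to the top; the dissenting update sinks, and in a feed
# that shows only the top few items it quietly disappears.
```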

To a certain extent, as Pariser acknowledges, this kind of filtering is not new: anyone who subscribes to a special-interest magazine or specialty television channel has chosen to have his or her news tailored to a set of specific interests. What is different online is that the filtering is invisible, and done without our consent or volition.

Pariser, a liberal, noticed that his conservative friends were disappearing from his Facebook news feed: this turned out not to be a coincidence. Facebook’s algorithm had noticed that he was more prone to click or share links from his progressive friends, and began weeding out the other updates, in which it assumed Pariser had less interest. Google and Amazon act in similar fashion, Pariser argues, creating a “filter bubble,” in which individual users are exposed only to things they already like or ideas with which they already agree. What Pariser calls the “era of personalization” is upon us, but its downside is a narrowing of what we are exposed to and the virtual elimination of information that may be contrary, disturbing, or transgressive. However comforting and emboldening it may feel, this kind of invisible, uncontrolled selective customization could have deleterious consequences for learning, personal growth, and even democracy itself.

While all this may sound like the ramblings of a paranoiac, Pariser builds a solid case for his arguments. Large companies like Facebook and Google – to say nothing of the brain trusts running political campaigns – have a vested interest in learning as much as they can about us, then using that knowledge to target advertising. Because what we are offered tends to reinforce ideas we already have and images of ourselves we already entertain, the filter bubble acts as a kind of Platonic consumerist marketing tool. But it also narrows our horizons and limits opportunities for the kinds of happy accidents that lead to new discoveries or creativity.

Pariser is an advocate of Karl Popper’s philosophy of falsifiability in science – the idea that science cannot prove a hypothesis, only disprove a faulty one – and extends that principle to the digital realm. If the Netflix algorithm notices a particular user watching and rating highly a series of Hugh Grant movies, instead of offering that user more romantic comedies, the algorithm should suggest Blade Runner. If the user watches Blade Runner and also rates it highly, the algorithm will have disproved its narrow hypothesis that the user likes only romantic comedies; it will also have given a user who might never otherwise have watched a dystopian science-fiction film the opportunity for a new experience.
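In code, the Popperian step Pariser has in mind might look something like the following sketch: with some small probability the recommender serves up something outside the user’s inferred taste, and a high rating on that pick falsifies the narrow hypothesis rather than reinforcing it. The genre labels, the five-point rating scale, and the twenty-per-cent “explore rate” are illustrative assumptions, not anything Netflix has disclosed.

```python
import random

# Sketch of a falsification step in a recommender: occasionally suggest an
# out-of-profile title, and let a high rating widen the inferred taste profile
# instead of narrowing it further. All labels and thresholds are assumptions.

def recommend(liked_genres: set, catalogue: list, explore_rate: float = 0.2) -> dict:
    """Mostly confirm the hypothesis, but occasionally try to falsify it."""
    inside = [m for m in catalogue if m["genre"] in liked_genres]
    outside = [m for m in catalogue if m["genre"] not in liked_genres]
    if outside and random.random() < explore_rate:
        return random.choice(outside)          # the falsification attempt
    return random.choice(inside or catalogue)  # the usual filter-bubble pick

def record_rating(liked_genres: set, movie: dict, rating: int) -> None:
    """A high rating on an out-of-profile pick disproves 'this user likes only X'."""
    if rating >= 4:
        liked_genres.add(movie["genre"])

catalogue = [
    {"title": "Notting Hill", "genre": "romantic comedy"},
    {"title": "Blade Runner", "genre": "dystopian science fiction"},
]
tastes = {"romantic comedy"}
pick = recommend(tastes, catalogue)
record_rating(tastes, catalogue[1], rating=5)
print(tastes)  # now includes "dystopian science fiction": the narrow hypothesis fails
```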

The overarching problem with the filter bubble is its implicit endorsement of Mark Zuckerberg’s defiant assertion, “You have one identity.” This is no more true now than it has ever been, but Pariser is adept at illustrating the way the filter bubble is capable of turning Zuckerberg’s statement into a self-fulfilling prophecy. By reinforcing entrenched beliefs, our personalized online experience disallows the kind of intellectual, moral, or philosophical challenges to our preconceived notions that result in growth and the reevaluation of prejudices. Constant reassurance that our beliefs and interests are the best ones is comforting, no doubt, but it is also dangerous in the way it allows blind spots and selective ignorance to flourish. It also ignores the fact that what makes us most essentially human is often tied up in those moments when we behave in unpredictable ways, when we disrupt the identity the filter bubble wants to fit us into. (One of Pariser’s most compelling arguments involves the way the architects of the filter bubble work to refine it so that such moments of unpredictability are mitigated or erased altogether: Google’s CEO, Eric Schmidt, is quoted as saying his ideal for the company is to have it “tell [users] what they should be doing next.”)

“Information systems need to have information in order to run,” writes Lanier, “but information underrepresents reality.” The filter bubble depends upon this kind of underrepresentation. Pariser’s book is valuable for pointing out that the bubble exists, and for suggesting methods to counteract its influence over our lives, both online and off.