"There is no standard Google anymore," says Eli Pariser in a recent TED talk. And he's right. Try it. Google the same thing as the person sitting next to you and compare the results. Chances are, they're different. According to Pariser, that's because Google uses as many as 57 different signals to determine the unique search results it serves you. It cross-references (among other things) your computer model, your choice of browser, your geographic location, your search history, the cookies it's stored on your hard drive, and data from your various social networks to determine what you'll most likely want to see when you're curious about something. As best it can, Google is personalizing the world for you.
And Google isn't the only one. Pariser explains that many popular destinations on the Web—like Amazon, Facebook, and Yahoo! News—are dabbling in personalization to ensure their users see only what's "most relevant" to them. (Pariser, President of the Board of MoveOn.org, first became aware of the power of personalization when Facebook suddenly stopped showing him updates from his more conservative friends; because he was less likely to follow the links they posted, Facebook determined those folks weren't really important to him and silently removed them from his news feed.) What's more, he adds, they're doing it without any indication, conducting "invisible algorithmic editing of the Web."
Pariser worries that these algorithmic curators are having some damaging effects. An increasingly personalized Web is one that doesn't present its users with contrarian perspectives or novel viewpoints that might reorient their thinking. It also creates an environment less conducive to pleasant surprises, since it minimizes the chances of stumbling onto something outside the filters' parameters. The aggregate effect of constant personalization creates what Pariser, in his new book, calls a filter bubble—a "personal, unique universe of information that you live in online," as he explains in his TED talk.
Of course, information filters are nothing new. News media, for instance, traditionally play a gatekeeping function, filtering the news of the day into digestible packages fit for consumption in various contexts and on different platforms. So it's tempting to accuse Pariser of simply resuscitating a classic debate: Should information filters curate content to provide citizens with information they want to have, or with information they need to have in order to become well-rounded citizens, regardless of how that information jibes with their sensibilities, comfort levels, political orientations, or personal preferences?
This question, however, presupposes a key human element, and Pariser seizes on it to demonstrate why filter bubbles aren't merely an extension of traditional editorial practices. Those practices are guided by codes of ethics established through decades of debate in journalism. And arguments distinguishing between what audiences want and what they need typically assume audiences can choose from a variety of information sources, making rational decisions about the kinds of content they include in their balanced media diets. Media have an ethical responsibility, the argument goes, to provide audiences with a balanced information diet, a diverse blend of content from which citizens can draw when developing the critical opinions essential to a healthy democratic society.
Algorithms policing the borders of our filter bubbles are certainly guided by a set of ethics—but not necessarily the same set guiding civic-minded news editors. And they're not working to enhance users' abilities to make careful decisions about which information sources they find reputable, nourishing, or important. They're filtering content beneath the threshold of ordinary awareness and without direct human intervention, structuring the very field of options users have in the first place and narrowing that field before consumers can even begin to evaluate the choices they might have had.
It might seem that popping these filter bubbles altogether is the way to resist the effects of excessive personalization. But here lies another key nuance of Pariser's argument: he doesn't insist that we stop letting algorithms filter data for us. After all, the ability to quickly filter, organize, and prioritize a vast collection of heterogeneous bits and pieces is part of what makes the Internet so darned useful. Instead, he's asking companies that employ filters to be more transparent about precisely what those filters are doing—not only about what's being presented to users, but also, perhaps more importantly, about what's being edited out.
"We need you to make sure that they're transparent enough that we can see what the rules are that determine what gets through our filters," he says directly to software architects and decision makers attending his TED talk. "And we need you to give us some control so that we can decide what gets through and what doesn't."
Filter bubbles may be unavoidable. But Pariser wants to make sure their walls stay permeable, and opening up the filters that build them is one way to do just that.