To truly understand whether personalization is a threat or a blessing, we need a more holistic and dynamic account of the online landscape (…)

The New York Times’ Sunday Book Review: “Your Own Facts” by Evgeny Morozov, June 10, 2011 (subscription may be required)

This is from a review of Eli Pariser’s latest book, The Filter Bubble (which I haven’t read: this is merely a comment on Morozov’s review, along with some tips pertaining to online personalization). What’s the book about?

We’re used to thinking of the Internet like an enormous library, with services like Google providing a universal map. But that’s no longer really the case. Sites from Google and Facebook to Yahoo News and the New York Times are now increasingly personalized – based on your web history, they filter information to show you the stuff they think you want to see. That can be very different from what everyone else sees – or from what we need to see. (From a Q&A with the author available at Amazon.com)

More on Pariser’s book: Google Books (no preview available), The Filter Bubble (the book’s official website), where one can also watch a TED Talk in which Eli Pariser explains what the “filter bubble” is and how it works (Flash is required). There’s a biography of Eli Pariser on his official website.
I find Morozov’s review interesting because he neither condemns nor praises the idea of “search personalization”; instead, he presents it as a problem that every user should address and inquire about. Here are his conclusions about Pariser’s book:

Although Pariser’s conclusions and prescriptions are not wholly convincing, he is to be commended for reinvigorating the conversation about the dangers of online personalization. And “The Filter Bubble” deserves praise for drawing attention to the growing power of information intermediaries whose rules, protocols, filters and motivations are not always visible. But whether we should demand more substantial civic commitments from these intermediaries is to be debated.

As Morozov points out, users have options to limit (to a certain extent, depending on the option) the ways in which their web surfing experience is personalized. Below you’ll find some suggestions relevant to Google and Facebook. This is by no means a complete guide and, since the web is constantly evolving, these suggestions are subject to change. Nor am I advocating against personalization tools: cookies, for example, have their advantages. However, it can’t harm a user to know what they are and how they work. By understanding them and choosing deliberately whether or not to use them (at a given time), one can surely make more responsible use of the Internet. For example, I have developed the practice of using one main browser for my daily research activities (with cookies enabled) and another browser with cookies disabled. Sometimes I’ll switch from one to the other to see whether I get different search results.
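To make that comparison concrete, here is a minimal sketch (my own illustration, not something from the post’s sources) of fetching the same query once with stored cookies and once without. It assumes the third-party requests library; in practice the comparison is done by hand in two browsers, since search engines may answer scripts differently than they answer browsers.

    # Minimal sketch (not from the original post): comparing a request that
    # carries cookies with one that does not. Assumes the third-party
    # `requests` library; real comparisons are better done manually in two
    # browsers, as described above.
    import requests

    URL = "https://www.google.com/search"   # illustrative target only
    PARAMS = {"q": "filter bubble"}

    # A session keeps whatever cookies the site sets (roughly: the everyday browser).
    session = requests.Session()
    session.get("https://www.google.com/")  # first visit; any cookies are stored in the session
    with_cookies = session.get(URL, params=PARAMS)

    # A one-off request carries no stored cookies (roughly: the cookie-free browser).
    without_cookies = requests.get(URL, params=PARAMS)

    # Crude check: differing bodies hint at cookie-dependent (personalized) responses.
    print("Responses identical:", with_cookies.text == without_cookies.text)

Identical responses don’t prove the absence of personalization, of course; this only illustrates the “two sessions, one query” idea.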

• Control Google’s personalization

For Google users, it is worth learning how “Web History” works. It is interesting to know, for example, that “Google personalizes search results even for users that aren’t logged in” (see “Why Google Web History Is Enabled by Default” on Google Operating System, an unofficial blog about Google). Even so, it is possible to “disable customizations based on search activity”: see “Turning off search history personalization” (official Google support).
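As a small aside of my own (not from Morozov or Pariser): guides from that period often mention appending the URL parameter pws=0 to a Google search to request non-personalized results. Whether Google still honors it is an assumption on my part; treat the sketch below accordingly.

    # Hedged sketch: older guides describe appending "pws=0" to a Google search
    # URL to request non-personalized results. The parameter is an assumption on
    # my part; it may be deprecated or silently ignored.
    from urllib.parse import urlencode

    def google_search_url(query, personalized=True):
        """Build a Google search URL; pws=0 (reportedly) asks for non-personalized results."""
        params = {"q": query}
        if not personalized:
            params["pws"] = "0"  # assumption: may have no effect today
        return "https://www.google.com/search?" + urlencode(params)

    print(google_search_url("filter bubble", personalized=False))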

• Control Facebook’s personalization

For Facebook users, it is possible to:

  • Uncheck the box that says “Keep me logged in” on the login panel.
  • Use one of the plugins and/or options available for Firefox and Chrome to specifically block Facebook cookies (a conceptual sketch follows this list).
  • Spend some time exploring and reading about privacy settings. For example, it is possible to edit the privacy settings for sponsored content so that the user won’t be served “more relevant ads” (as they put it: see what social ads are on Facebook). Reading the Help Center FAQs about privacy is not a bad idea either.
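For the cookie-blocking item above, here is a conceptual sketch (my own illustration, not a specific plugin) of what refusing cookies from facebook.com means at the HTTP level, using Python’s standard cookie policy; browser extensions perform the analogous filtering inside the browser itself.

    # Conceptual sketch (my own illustration, not a specific plugin): refusing
    # to store cookies from facebook.com, using the standard library's cookie
    # policy attached to a requests session.
    import http.cookiejar
    import requests

    # Cookies set by these domains (and their subdomains) will not be accepted.
    policy = http.cookiejar.DefaultCookiePolicy(
        blocked_domains=["facebook.com", ".facebook.com"]
    )

    session = requests.Session()
    session.cookies.set_policy(policy)

    # Any Set-Cookie headers from the blocked domains are silently dropped,
    # while cookies from other sites are stored as usual.
    session.get("https://www.facebook.com/")
    print("Stored cookies:", list(session.cookies))  # expected: no facebook.com cookies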

Finally, I found an article listing “10 simple steps provided by Eli Pariser to de-personalize your web experience”: “10 Ways To Pop Your Filter Bubble” at Bryan Fuhr’s blog, May 31, 2011.
[UPDATE – June 24, 2011] Below I’ll add more relevant links as I find them.

  • George Brock: “The filter bubble and public reason”, June 23, 2011:

    He’s right to be concerned about what kind of public sphere the digital giants are helping to build: that should be society’s concern. But the key here is transparency: if we know exactly how search engines filter and how Facebook tunes the use of the “Like” button, we have the information we need to choose. We can make use of Google and Facebook rather than being used by them. The problem lies in current concealment of the exact ways in which information utilities manage information, not in the fact that they are doing it. If we know what they are up to, we can choose whether to mind or not. Only then should we worry about whether something is bad enough to be stopped by law.

  • Slate.com: “Bubble Trouble. Is Web personalization turning us into solipsistic twits?” by Jacob Weisberg, June 10, 2011:

    Through most of history, bubbles have been imposed involuntarily. Not so long ago, most Americans got their news primarily through three like-minded networks and local newspapers that reflected a narrow consensus. With something approaching the infinite choices on the Web, no one has to be limited in this way. Why assume that when people have more options, they will choose to live in an echo chamber?

  • Slate: “Study: Social Networks Not Total Political Echo Chambers” by Caitlin Mac Neal, March 12, 2012:

    One common critique of the Internet is that it allows people to seal themselves off in like-minded echo chambers, away from those whose views they don’t like. But a new Pew study says that those who use social networking sites often come in contact with people who have different political views.
