The Internet promises to expose us to different perspectives, points of view and opinions, yet we continue to form connections with others whose views are similar to our own. Creating a safe environment of friends and followers, whether online or off, is great for support, but if it's feedback and constructive criticism you're after, Eli Pariser -- former executive director and current board member of MoveOn.org, and author of the new book The Filter Bubble -- says to expand your circle.

Personalization Preaches to the Choir

The central thesis of his book is that filtering is starting to isolate us. Filtering information is a part of our daily lives -- from the blogs we read, the videos we watch and the sites we visit to the contacts we follow and friend -- but the push for relevance limits the information we glean and, as a result, shapes our opinions and points of view.

In a recent interview with Mashable, Pariser urges us to get out of our comfort zones. Especially if we want to find things beyond our general scope, we need to be more mindful of how we design and set up our filters.

While personalization and targeted marketing are lucrative and all but inevitable for marketers and advertisers, you don't necessarily think about how the information you search for is tailored to you -- and you might be surprised at how eerily tailored it is. For instance, the information referred to me about politics or social issues may differ from what gets referred to you, based on what we've previously searched for or which sites we've visited. Could this affect the way we think about things, how we vote, or what we tell others? Definitely.

Filtering Your Free Will

You might think that you have just as much free will on the Internet as you have when watching television, but unless you work in an Internet-related industry, it might not be apparent that you don't. As Pariser puts it:

When you turn on Fox News, you know what the editing rule is -- what kind of information is likely to get through and what kind is likely to be left out. But you don't know who Google thinks you are or on what basis it's editing your results, and therefore you don't know what you're missing.

Personalization is too good for the industry to ignore, but as with any technology that profits from the actions of targeted users, transparency is essential. In Pariser's view, those who develop the algorithms have a responsibility to build ethics into their equations. Unlike Do Not Track policies, which aim to protect a user's online privacy, Pariser says the filter bubble is more about

controlling what you’re able to see of the world -- what your filters let through and what they don’t.

The tension among personalization, privacy and personal responsibility makes the future of web engagement an interesting exercise in what Mashable calls "human-based content curation," which, if successful, can result in a transparent and meaningful experience for everyone.