No search engine comes close to competing with Google, but a Toronto project is determined to give the internet an alternative.
The Million Short search engine offers you the option to eliminate up to the first one million sites in its results — thereby stripping sponsored sites and ads, junk and content optimized for search engines. You can further drill down results by country and remove sites with ads, live chat and e-commerce.
To Sanjay Arora, founder and CEO of Million Short, search is a “forgotten issue.” But he claims the solution Million Short provides is simple.
“It’s just filters,” he said.
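The mechanics Arora describes can be sketched in a few lines. This is an illustrative toy, not Million Short's actual implementation (which is not public): it assumes we have some popularity ranking of domains, such as a top-sites list, and simply drops any result whose domain falls within the top N.

```python
# Hypothetical ranking data mapping domain -> popularity rank.
# In practice this would come from a large top-sites dataset.
POPULARITY_RANK = {
    "wikipedia.org": 5,
    "tripadvisor.com": 180,
    "smallblog.example": 2_500_000,
}

def million_short_filter(results, remove_top_n):
    """Keep only results whose domain ranks OUTSIDE the top `remove_top_n` sites.

    Domains missing from the ranking are treated as unranked (infinitely
    obscure) and therefore always kept.
    """
    return [
        r for r in results
        if POPULARITY_RANK.get(r["domain"], float("inf")) > remove_top_n
    ]

results = [
    {"domain": "wikipedia.org", "title": "Topic - Wikipedia"},
    {"domain": "smallblog.example", "title": "A niche take on the topic"},
]

# With the one-million cutoff, only the obscure site survives.
filtered = million_short_filter(results, remove_top_n=1_000_000)
```

The country, ads, live-chat and e-commerce filters the site offers would just be additional predicates of the same shape applied to each result.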
Exposing a New Side of the Internet
Arora came up with the concept late one night in 2012, when the idea for a new search engine “popped” into his head. He started working on it that night.
“What happens if you search the internet, and you no longer see Wikipedia, Expedia or TripAdvisor? I’m not anti any of these sites, but the internet has got to be bigger than these 25 sites,” Arora told CMSWire.
“We need some people to push a different way of doing things … Imagine you’re looking at a small world, seeing a different slice of the internet.”
He felt like he was on to something, so he registered domain names and shared the project on a few sites. It generated a lot of buzz, but the responses were polarizing, he said.
“Half thought it was cool. Half thought it was dumb.”
The numbers told another story. About a week after going live, Million Short counted 1 million visitors and climbed its way to the top 10,000 most trafficked sites in the US a month later. However, the investment community remains a tough sell, Arora said. With a team of 12 in Toronto, Million Short is still working on a way to monetize the search technology.
Arora says people write to him after discovering unexpected information through Million Short. One person found a scholarship in her name that she hadn't known existed, and others say they save money on travel through deals from local and small businesses, some of which may not have the advertising dollars to make it to Google's page one.
The Bubble and the Bias
But beyond limited results, search engine bias has long been a target of internet activists and search and privacy advocates. Some refer to the problem as the filter bubble, where site algorithms choose what information to display based on a person's location, past searches and behavior. The argument goes that algorithms serve us selective pieces of information and content aligned with who they think we are, which crowds out unfamiliar products and views.
Eli Pariser, chief executive of Upworthy and an activist, is credited with coining the term “filter bubble.” In his TED Talk on the topic, Pariser argues that the bubble and the personalized web harm the internet and democracy by shutting us out from diverse points of view and beliefs.
Pariser first noticed the issue on Facebook when links to conservative content disappeared from his feed. Facebook had noted his frequent clicking on links from liberal sources and removed the conservative content without consulting him.
The Danger of Filters
Others recognize the same problem as Pariser, pointing to the dangers of bias. Engin Bozdag, a senior privacy consultant at PwC in the Netherlands, studied the filter bubble in his PhD thesis at Delft University of Technology. Bozdag was interested in examining social media and tools that combat the filter bubble to promote democracy.
“Be aware that you might be in a self-created bubble in Twitter or Facebook by following only like-minded users,” Bozdag told CMSWire.
“Not only algorithms cause bias in search engines. Humans introduce bias, as well. Google, Bing and other search engines have human quality analysts who can demote, delete or suspend websites, and [reasons are] often not transparent.”
Media attention pushed some sites to fine-tune their algorithms, but Bozdag believes personalization could be making a comeback. Like Arora, he finds it concerning the role money plays in which sites consumers see first.
“Money is used as a proxy for ‘best.’ Those with the most money to spend can prevail over those with the most useful information. The creation of a salable audience takes priority over your authentic interests.”
First Impressions Matter
Searching online has become so second nature that we're often unaware of bias in the results. Eduardo Graells-Garrido, a researcher at Telefonica R&D in Santiago, Chile, argues in his study on homophily that first impressions of people, and of search results, play a big part in shaping our likes and dislikes.
“[Homophily is creating] bonds with people who are like you. Due to cognitive biases, if the first thing I see about someone is something I don’t like or believe in, I will reject further communications,” Graells-Garrido said.
“The biggest danger is that we might start … to classify as dangerous or threatening something or someone that is just different.”
However, changing a site’s algorithm is not enough to reverse the filter bubble. Another answer to the bubble may be a design one, as Graells-Garrido realized in his research. His visual data portraits experiment, which built profiles of people based on their Twitter interactions, tested whether visuals would change how people engage with one another.
Using word clouds generated from Twitter feeds, the researchers showed users their own content, layered with recommendations to connect with people holding different beliefs. “Users do not value diversity, and algorithmic changes do not change behavior,” Graells-Garrido said.
“This is because the way we perceive information is biased. We need new ways to present information, so our minds or brains process information differently [and] we can perform conscious decisions. We found that this is possible using mixed approaches with information visualization.”
Whatever the solution, as we continue to rely on search engines as primary sources of information, remember that page one does not guarantee the most relevant or highest-quality content. There's a big internet out there, and sometimes you need to dig past the first hundred or thousand sites to find the unknowns.