[dropcap]T[/dropcap]he danger involved with this can hardly be overstated. Not to know one is blind is an excruciating blindness. If Eli Pariser is right we are all suffering from a blindness of which we are not aware.
Another way of putting it is that we are all being affected by confirmation bias in a more subtle way than we might have imagined. Confirmation bias is the tendency to interpret data in a way that fits our presuppositions about religion, science, politics, sports, and life in general. It is why two people of opposing views can read the same news story and each claim it supports their viewpoint. It is seeing that with which we agree and, usually unconsciously, ignoring that which challenges our thinking.
The danger about which Pariser (also co-founder of Upworthy.com) warns revolves around how our Internet searches are filtered to match our preferences. He calls it “the filter bubble.” The algorithms behind Facebook search, the Facebook newsfeed, Google, Bing, and the like pre-filter our results based on what we like, with whom we have interacted online, where we live, where we are accessing the Internet from, the operating system and browser we use, and more. Results are then shown to us not as unfiltered data, but prioritized as what the algorithm deduces we most likely want to see. News stories are fed to us because they are the type of story we read most often, not because they are the most comprehensive, the most accurate, or from a reliable source. Most people have certain political or religious leanings, and filtering can create the erroneous impression that more people think like us than actually do.
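To make the mechanism concrete, here is a minimal sketch, in Python, of how engagement-based personalization can narrow what a reader sees. This is not any platform's actual algorithm; the function names, data, and scoring rule are all hypothetical, invented purely to illustrate the feedback loop Pariser describes.

```python
# A toy illustration (NOT any real platform's algorithm) of how
# ranking by past engagement can narrow what a reader is shown.
# All names, topics, and data below are hypothetical.

from collections import Counter

def rank_stories(stories, click_history):
    """Order stories by how much their topics overlap with what
    the reader has already clicked on."""
    clicked_topics = Counter(topic
                             for story in click_history
                             for topic in story["topics"])

    def score(story):
        # A story earns points for each topic the reader has
        # engaged with before; unfamiliar topics earn nothing.
        return sum(clicked_topics[t] for t in story["topics"])

    return sorted(stories, key=score, reverse=True)

# A reader whose history is all partisan opinion pieces...
history = [{"topics": ["politics", "opinion"]},
           {"topics": ["politics"]}]

stories = [{"title": "Budget analysis", "topics": ["economics"]},
           {"title": "Partisan op-ed", "topics": ["politics", "opinion"]}]

ranked = rank_stories(stories, history)
print([s["title"] for s in ranked])
# The op-ed rises to the top, reinforcing what the reader already reads.
```

Run repeatedly, a loop like this feeds the reader more of what they clicked before, and each new click strengthens the filter. That is the bubble: not censorship, just a quiet narrowing of the menu.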
This is not a long video, and it is worth your time.