As I remember the focus of Pariser's "The Filter Bubble," the author was concerned that as search services learn our priorities, the content appearing at the top of the results returned in response to a search would tell us what we wanted to hear or feed our biases. Two individuals with different beliefs could conduct the same search and be told different things.
I admit that I tried various ways to demonstrate this potential bias and was unable to come up with a demonstration that worked. Pariser describes having two acquaintances with different political leanings conduct the same search and observing that the results differed. I attempted to conduct anonymous and self-identified searches (logged into my Google account) for the word "apple," assuming that revealing who I was to the search service would bias my results toward technology and the anonymous searches toward the fruit. No luck.
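For anyone wanting to try a comparison like this less impressionistically, one simple way to quantify how much two result lists differ is their overlap. Here is a minimal sketch; the result URLs are hypothetical placeholders standing in for lists collected by hand, not actual search output.

```python
def jaccard(a, b):
    """Fraction of results the two lists share (0 = disjoint, 1 = identical)."""
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb)

# Hypothetical result lists for the same query, one anonymous, one signed in.
anonymous_results = [
    "apple.com",
    "en.wikipedia.org/wiki/Apple",
    "healthline.com/apple-nutrition",
]
signed_in_results = [
    "apple.com",
    "apple.com/iphone",
    "en.wikipedia.org/wiki/Apple",
]

overlap = jaccard(anonymous_results, signed_in_results)
print(f"overlap: {overlap:.2f}")
```

An overlap near 1.0 across repeated trials would suggest little personalization for that query, which matches what I observed informally.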
Researchers using Facebook data have approached the "filter bubble" issue in a different way. They have identified users along a conservative/liberal continuum and then examined the links included in posts from these groups. In the aftermath of the election, they are presenting related data graphically through what they describe as the blue feed/red feed. Assuming both sources of media bias are real, the argument would be that we receive different slants on the facts both through the history of who we are and through whom we friend. It seems possible these two forms of bias interact to compound the effect.
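The core of that research approach can be sketched in a few lines: group shared links by the sharer's self-identified leaning and tally which outlets each group links to. The data and domain names below are entirely made up for illustration.

```python
from collections import Counter

# Hypothetical records: (self-identified leaning, domain of the shared link).
shares = [
    ("liberal", "leftnews.example"),
    ("liberal", "leftnews.example"),
    ("liberal", "neutral.example"),
    ("conservative", "rightnews.example"),
    ("conservative", "rightnews.example"),
    ("conservative", "neutral.example"),
]

# Tally link domains per ideological group.
by_group = {}
for leaning, domain in shares:
    by_group.setdefault(leaning, Counter())[domain] += 1

for leaning, counts in sorted(by_group.items()):
    print(leaning, counts.most_common())
```

If each group's tally is dominated by a different set of outlets, that asymmetry is what visualizations like the blue feed/red feed make visible at a glance.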