Nearly every online service aims to deliver content that we find enjoyable and relevant. These services have given us tools to filter more effectively, and they have implemented numerous filters behind the scenes to keep the content we like in our world.
I was thinking about this one night while checking Facebook, when I noticed that Facebook had consolidated several opinions shared among my friends on a given topic and buried them behind one link: “View 9 more posts about [topic].” This reduces clutter at the expense of sharing our views.
This is a consequence of how we use social media. Over the past eight years, we have been part of an arms race to connect with all of our friends and even strangers. Despite warnings of “stranger danger,” we still ventured to connect with people we wanted as friends.
I have greatly reduced how often I add friends, follow brands and interests, or otherwise add new perspectives to my social experience. It seems we have reached a critical mass of connections in social media. Now that we each maintain hundreds of real and virtual connections, we wrestle with balancing relevant and timely updates.
Thinking back thirteen years, I recall my first experience with online community. We initially had the AOL Buddy List, which was really a short list of close friends and family that we would Instant Message. Outside of that circle, we would innocently enjoy visiting forums, chat rooms and stumbling across web pages to meet new people. The people we met through these applications often shared our interests and were not contingent on existing relationships. We were content meeting strangers, accepting their quirks, passions and other human qualities.
I’m not getting nostalgic without good reason. Yes, we encountered objectionable content without filters, but information discovery was equal and fair, restricted only by our own imagination.
Information discovery today isn’t that easy. Today’s gateways dictate how we discover and act on information. Examples of these filters include, but are not limited to:
- Facebook’s “Top Stories” News Feed Ranking
- Twitter’s “Top Tweets” in Search
- Twitter’s “No Replies” setting for Brands
- Google’s Personalized Search Results
- iTunes “Top [Apps, Books, Songs, Movies, etc]”
- Netflix’s “Popular Queue” and other personalized queues
- Retargeted Advertisements
In his TED Talk, Eli Pariser explains this problem in vivid detail. He calls it the “filter bubble” effect: algorithms silently exclude opposing interests from our feeds and search result pages. The shift from human editors to computer-based filters is core to the issue at hand. He contrasts examples of relevant information with the “junk food” content that crowds it out.
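To make the mechanism concrete, here is a toy sketch of how an engagement-based ranker can quietly wall off opposing viewpoints. This is a hypothetical illustration, not the actual algorithm of Facebook, Google or any real service; the topic names, scoring rule and cutoff are all invented for the example.

```python
# Toy "filter bubble": rank stories purely by how often the reader
# has engaged with each story's topic, then truncate the feed.
# Hypothetical data and scoring -- not any real service's algorithm.

def personalized_feed(stories, clicked_topics, limit=3):
    """Return the top `limit` stories, scored by past engagement only."""
    def score(story):
        # +1 for every past click on this story's topic
        return clicked_topics.count(story["topic"])
    # Stable sort: ties keep their original order.
    ranked = sorted(stories, key=score, reverse=True)
    return ranked[:limit]

stories = [
    {"title": "A", "topic": "tech"},
    {"title": "B", "topic": "politics-left"},
    {"title": "C", "topic": "tech"},
    {"title": "D", "topic": "politics-right"},
    {"title": "E", "topic": "tech"},
]
history = ["tech", "tech", "politics-left"]  # what the reader clicked before

for story in personalized_feed(stories, history):
    print(story["title"])  # prints A, C, E -- the opposing view never surfaces
```

Because the reader never clicked “politics-right,” that story scores zero and falls below the cutoff every time, so the feed quietly converges on what the reader already likes.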
His suggested solution is for these algorithms to rank by more than narrow relevance, so that content that is uncomfortable, challenging and timely still reaches our online experience. He does a great job explaining the problem, which affects all of us directly and indirectly. I strongly suggest you watch the talk; it’s only nine minutes long.
What can you and I do?
First, we have to get back to discovering fresh content. Not the same cyclical content that we share on Facebook or Twitter, but relevant content nested deep in search results and insights from experts found within topical communities.
Second, we can opt out of personalized content from online services. Will some irrelevant content appear? Sure. But I believe seeing no new content or information is far more dangerous to us in the long run. Besides, you know how to search Google like a pro, right?
Image credit: regionalblind