Theories about information flow are particularly fun when they arrive in your brain through the very mechanisms they explain. Recently several of my Facebook friends, a couple of people I follow on Twitter, and two of my favorite podcasts started reporting on a new concept from Eli Pariser called the Filter Bubble. The general idea goes like this: as search engines and social media sites use smarter and smarter algorithms to better serve what they determine our needs to be, we are exposed to fewer and fewer opposing viewpoints. The idea dovetails nicely with the uncomfortable fact that in politics, the smarter you are, the dumber you are: the more we research political issues, the more we are steered toward information that reaffirms our existing views, making us even more confident that the other half of the country is stupid.
The solution is a bit of moderation in how aggressively these algorithms personalize. In artificial intelligence programming, a bit of randomized noise sometimes needs to be added to a system to keep it from settling into overly deterministic behavior. Our search algorithms need something similar: some decent percentage (10%? 20%?) of our search results and news feed items should be opposing viewpoints, as in the sketch below; otherwise our society will only become more polarized.
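Here is a minimal sketch of what that injected noise might look like, assuming a hypothetical feed already ranked by the usual relevance algorithm and a separate pool of items the algorithm would normally filter out. The names `diversified_feed`, `personalized`, and `opposing` are all illustrative, not any real platform's API:

```python
import random

def diversified_feed(personalized, opposing, mix_rate=0.15, seed=None):
    """Blend a personalized feed with opposing-viewpoint items.

    personalized: items ranked by the usual relevance algorithm
    opposing: a pool of items the algorithm would normally filter out
    mix_rate: fraction of slots handed to opposing items (the 10-20%)
    """
    rng = random.Random(seed)
    opposing = list(opposing)  # copy so we can pop without side effects
    feed = []
    for item in personalized:
        # With probability mix_rate, swap in a randomly chosen
        # opposing-viewpoint item instead of the next personalized one.
        if opposing and rng.random() < mix_rate:
            feed.append(opposing.pop(rng.randrange(len(opposing))))
        else:
            feed.append(item)
    return feed

# Hypothetical usage: roughly 15% of the feed becomes counter-viewpoint noise.
agreeable = [f"agreeable story {i}" for i in range(20)]
challenging = [f"opposing story {i}" for i in range(20)]
print(diversified_feed(agreeable, challenging, mix_rate=0.15, seed=42))
```

This is the same trick as the randomized noise in AI systems mentioned above: a small, tunable dose of randomness that keeps the system from converging on a single self-reinforcing answer.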