Balanced is an attempt to build software that helps people make sense of an increasingly polarized and interconnected world, while avoiding the pitfalls of moral relativism or ideological dogma.
When you install the Balanced Chrome extension, you’ll be exposed to different opinions on various topics as you consume your daily news. If you’re reading an article from Salon about ISIS, Balanced will recommend you read one from Breitbart. If you’re reading an article from CNN about a proposed carbon tax, Balanced will recommend one from Fox or the Wall Street Journal.
Exposure to different opinions is crucial, and it’s a far better path toward cultural and political harmony than the bans on “fake news” the media establishment is discussing today. People in the West need to develop the skills necessary to discern truth on multiple dimensions while understanding that one side of a debate usually doesn’t hold a monopoly on the truth (these skills were once called wisdom).
As more and more people spend their time in tailored experiences designed to make them react in ways that serve nothing beyond the Key Performance Indicators a Product Manager spec’ed out, it’s critical we regain this ancient skill set, or we risk tearing society apart. Exposing people to different perspectives and breaking the cycle of confirmation bias created by social media and socioeconomic stratification is a good first step.
Balanced is still in its very early stages, so expect to encounter some bugs as you use it. Currently, Balanced only knows about articles from a small set of news sources and won’t be able to provide recommendations for breaking news. More information about all of that can be found below.
[Download Balanced here].
How come you don’t have a recommendation for a certain article?
Article information is sourced from the Event Registry project, and because of their query limits it’s refreshed only every six hours. So it’s not unusual for some breaking news not to show up in Balanced.
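The six-hour refresh cycle described above can be sketched as a simple time-based cache: article data is fetched in a batch, and anything published after the last fetch stays invisible until the next cycle. This is an illustrative sketch, not Balanced’s actual code; the function and variable names are hypothetical.

```javascript
// Hypothetical sketch of a six-hour article cache. `fetchArticles` stands
// in for a batch query against Event Registry; `now` is injectable so the
// behavior can be tested with a fake clock.
const SIX_HOURS_MS = 6 * 60 * 60 * 1000;

function makeArticleCache(fetchArticles, now = Date.now) {
  let articles = [];
  let lastFetch = -Infinity; // force a fetch on first access

  return {
    get() {
      // Only hit the upstream API if the cached batch is stale.
      if (now() - lastFetch >= SIX_HOURS_MS) {
        articles = fetchArticles();
        lastFetch = now();
      }
      return articles;
    },
  };
}
```

A consequence of this design is exactly the limitation the FAQ mentions: an article published five minutes after a fetch won’t be recommendable for almost six hours.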
How does your recommendation engine work?
Currently, Balanced monitors a predefined set of news sources, each categorized on a spectrum from “liberal” to “conservative”. Since not every issue splits neatly between liberals and conservatives (for example, many traditional conservatives would join many radical left-wingers in calling for the nationalization of railroads or banks), Balanced won’t always recommend articles that take the opposite position of the one you’re currently reading. In the future, I’m hoping to build a more intelligent recommendation engine, one that performs some rudimentary sentiment analysis on articles from the sources Balanced monitors.
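One minimal way to implement the spectrum idea above is to score each monitored outlet on a single liberal-to-conservative axis and recommend the source farthest from the one being read. This is a sketch of that approach only; the source list, scores, and function names are all illustrative assumptions, not Balanced’s actual data.

```javascript
// Hypothetical spectrum positions from -1 (most liberal) to 1 (most
// conservative). The outlets and numbers here are made up for illustration.
const SOURCE_SPECTRUM = {
  "salon.com": -0.8,
  "cnn.com": -0.4,
  "wsj.com": 0.4,
  "foxnews.com": 0.6,
  "breitbart.com": 0.9,
};

// Recommend the monitored source whose spectrum position is farthest
// from the source the reader is currently on.
function recommendSource(currentSource) {
  const current = SOURCE_SPECTRUM[currentSource];
  if (current === undefined) return null; // unmonitored source: no recommendation

  let best = null;
  let bestDistance = -Infinity;
  for (const [source, position] of Object.entries(SOURCE_SPECTRUM)) {
    const distance = Math.abs(position - current);
    if (distance > bestDistance) {
      bestDistance = distance;
      best = source;
    }
  }
  return best;
}
```

The one-dimensional axis is also exactly why this breaks down on issues that don’t split along liberal/conservative lines, as the paragraph above notes: maximum spectrum distance doesn’t guarantee an opposing opinion on a given topic.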
Contributing
I’d love some help working on this project. The biggest areas in need of improvement are 1) the recommendation algorithm, 2) design, and 3) documentation/bug fixes.

Submitting Bugs
Several friends and I tested Balanced, but there are bound to be gremlins lurking in the code. If you find a bug, please submit it to the issue tracker on GitHub.
Why the Yin Yang symbol?
I was watching a Jordan B. Peterson video and he was talking about ancient religions and comparing the wisdom found there with modern ideologies. One of the things he mentioned was that in many ancient traditions there’s a profound awareness that moderation is critical to human wellbeing and happiness. Too much order is bad, but so too is too much chaos.
The Taoists understood this. With the Yin Yang symbol, the white (yang) represents order while the black (yin) represents chaos. In the symbol, those two forces are in a dance with one another, and the two dots in the symbol represent where you ideally stand. It’s important to have one foot in the chaos, and another in the order so you can live your life centered on the fine line between the two.
Future Plans
Besides continuing to improve Balanced, I’m working on a few other projects that explore the same ideas.
I’ve noticed that oftentimes discussions about our disagreements oscillate between passionate debate on lofty philosophical ideals and equally exhausting nitpicking over minute details surrounding the argument. We do this not just because we don’t know how to formally argue, but because we don’t fully know the epistemology of our own ideas and beliefs (both how we personally came to believe something and a more formal epistemology addressing how that idea even came into existence).
So I think an immediate next step is to build tools that help people understand the arguments they’re actually having when they disagree with someone about certain issues. But that’s a blog post for another time.