Balanced

Balanced is an attempt to build software that helps people make sense of an increasingly polarized and interconnected world, while avoiding the pitfalls of moral relativism or ideological dogma.

When you install the Balanced Chrome extension, you’ll be exposed to different opinions on various topics as you consume your daily news. If you’re reading a Salon article about ISIS, Balanced will recommend one from Breitbart. If you’re reading CNN’s coverage of a proposed carbon tax, Balanced will recommend an article from the Wall Street Journal.

As more and more people spend their time in tailored experiences designed to provoke reactions in service of nothing more than the Key Performance Indicators a product manager spec’ed out, it’s critical that we regain the ancient skill of engaging seriously with opposing viewpoints, or we risk tearing society apart. Exposing people to different perspectives and breaking the cycle of confirmation bias created by social media and socioeconomic stratification is a good first step.

Balanced is still in its very early stages, so expect to encounter some bugs as you use it. Currently, Balanced only knows about articles from a small set of news sources and won’t be able to provide recommendations for breaking news. More information about all of that can be found below.

FAQs

How come you don’t have a recommendation for a certain article? Article information is sourced from the Event Registry project, and because of Event Registry’s query limits, new articles are only pulled in every six hours. So it’s not unusual for breaking news not to show up in Balanced right away.
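
For a sense of what that ingestion step looks like, here’s a minimal sketch of a job that pulls recent articles on a six-hour cycle. The endpoint, query parameters, source list, and store_articles helper are illustrative assumptions, not the exact Event Registry API or the code in this repo.

```python
# Illustrative sketch of the six-hour ingestion job. The endpoint, parameter
# names, and store_articles() are assumptions, not Balanced's actual code.
import time
import requests

EVENT_REGISTRY_URL = "https://eventregistry.org/api/v1/article/getArticles"  # assumed endpoint
API_KEY = "YOUR_EVENT_REGISTRY_KEY"
MONITORED_SOURCES = ["salon.com", "breitbart.com", "cnn.com", "wsj.com"]  # example sources

def fetch_recent_articles():
    """Pull recent articles from the monitored sources (placeholder query shape)."""
    resp = requests.post(EVENT_REGISTRY_URL, json={
        "apiKey": API_KEY,
        "sourceUri": MONITORED_SOURCES,
        "articlesSortBy": "date",
    })
    resp.raise_for_status()
    return resp.json().get("articles", {}).get("results", [])

def store_articles(articles):
    """Placeholder: persist articles so the recommender can query them later."""
    print(f"stored {len(articles)} articles")

if __name__ == "__main__":
    while True:
        store_articles(fetch_recent_articles())
        time.sleep(6 * 60 * 60)  # query limits mean we only refresh every six hours
```

Anything published between two runs simply isn’t in the database yet, which is why very fresh stories can come up empty.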

How does your recommendation engine work? Currently, Balanced has a predefined set of news sources it monitors which are categorized on a spectrum from “liberal” to “conservative”. Since not every issue is so neatly split between liberals and conservatives (for example, many traditional conservatives would join many radical left wingers in calling for the nationalization of railroads or banks), Balanced won’t always recommend articles which take the opposite opinion of the one you’re currently reading. In the future, I’m hoping to create a more intelligent recommendation engine, one that performs some rudimentary sentiment analysis on articles from the various sources Balanced monitors.
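
To make the current approach concrete, here’s a minimal sketch of the spectrum-based selection idea described above. The scores, source list, and article shape are illustrative assumptions, not Balanced’s actual data or code.

```python
# Illustrative sketch of the source-spectrum recommendation idea.
# The scores and helper names below are assumptions, not Balanced's actual code.

# Sources placed on a rough spectrum: -1.0 = most liberal, +1.0 = most conservative.
SOURCE_SPECTRUM = {
    "salon.com": -0.8,
    "cnn.com": -0.4,
    "wsj.com": 0.5,
    "breitbart.com": 0.9,
}

def recommend(current_source, candidate_articles):
    """Given the source being read and other articles covering the same story,
    pick the candidate whose source sits farthest across the spectrum."""
    current_score = SOURCE_SPECTRUM.get(current_source, 0.0)
    best = None
    best_distance = 0.0
    for article in candidate_articles:
        score = SOURCE_SPECTRUM.get(article["source"])
        if score is None:
            continue  # ignore sources Balanced doesn't monitor
        distance = abs(score - current_score)
        if distance > best_distance:
            best, best_distance = article, distance
    return best  # may be None if nothing sits meaningfully across the spectrum

# Example: reading a Salon piece surfaces the Breitbart coverage of the same story.
candidates = [
    {"source": "cnn.com", "url": "https://cnn.com/..."},
    {"source": "breitbart.com", "url": "https://breitbart.com/..."},
]
print(recommend("salon.com", candidates))
```

Because a picker like this only maximizes distance across the source spectrum, it doesn’t guarantee an article arguing the exact opposite position, which is the caveat noted above.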

Contributing

I’d love some help working on this project. The biggest aspects in need of improvement are 1) the recommendation algorithm, 2) design, and 3) documentation/bug fixes.

Submitting Bugs

Several friends and I tested Balanced, but there are bound to be gremlins lurking in the code. If you find a bug, please submit it to the issue tracker on GitHub.

Why the Yin Yang symbol? In the Yin Yang symbol, the white (yang) represents order while the black (yin) represents chaos. The two forces are in a dance with one another, and the two dots represent where you ideally stand: with one foot in the chaos and one in the order, living your life centered on the fine line between the two.

Future Plans

Besides continuing to improve Balanced, I’m working on a few other projects that explore the same concepts.

I’ve noticed that discussions about our disagreements often oscillate between passionate debate over lofty philosophical ideals and exhausting nitpicking over the minute details surrounding the argument. We do this not just because we don’t know how to argue formally, but because we don’t fully know the epistemology of our own ideas and beliefs: both how we personally came to believe something and, more formally, how that idea came into existence in the first place.

So I think an immediate next step is to build tools that help people understand the arguments they’re actually having when they disagree with someone about certain issues. But that’s a blog post for another time.