Stumble

Stumble is a link/news aggregator. It scans selected online communities, forums, blogs and news sources and makes them available via a simple web interface, API or RSS/Atom.
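
For illustration only, once the server is running, consuming the API or feeds might look something like this. The paths below are hypothetical placeholders, not documented routes; check the project's URL configuration for the real ones.

# hypothetical endpoints, shown only to illustrate API/feed access;
# the real routes are defined in the project's URL configuration
$ curl http://localhost:8000/api/urls/         # JSON list of collected links
$ curl http://localhost:8000/feed/technology/  # RSS/Atom feed for a tag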

You can use this to have recent news and interesting links delivered to you automatically, and randomly stumble through them when you're bored. It works as a centralizer of content, so you don't have to navigate to a bunch of different places all the time.

How is this different from subscribing to a bunch of RSS feeds?

It adds some features:

  • It imports from Reddit via the API, so it gets the actual posted URL instead of a link to Reddit's comments page (which is what you get when subscribing to Reddit via RSS); see the sketch after this list
  • You can randomly stumble through URLs (a different way to discover content)
  • You can transform NewsAPI.org queries into RSS feeds
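
To illustrate the Reddit point above: Reddit's public JSON listing exposes both the submitted link ("url") and the comments page ("permalink"), which is what lets an API-based importer store the real destination. This is only a sketch of the idea using curl and jq, not Stumble's actual import code:

# sketch of the idea only, not Stumble's importer; requires curl and jq
$ curl -s -A 'stumble-readme-example' 'https://www.reddit.com/r/programming/new.json?limit=5' \
    | jq '.data.children[].data | {url, permalink}'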

Tech

  • Python 3.7
  • Django 2.2 + Django REST Framework 3
  • Celery 4
  • PostgreSQL 10 for storage
  • Redis 5 for caching and as the Celery message broker

Running development server

You will need:

  • Python 3.7 + virtualenv if running locally (we recommend asdf or pyenv)
  • Docker + docker-compose

You can run it either in Docker or on your local machine.

To run it in Docker, simply run:

# clone the repository
$ git clone git@github.com:gabrielhora/stumble.git
$ cd stumble

# this will start all required containers, run migrations and start the development server
$ docker-compose up app
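
If something doesn't come up, the stack can be inspected with the usual docker-compose commands, for example:

# check container status and follow the app logs
$ docker-compose ps
$ docker-compose logs -f app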

To run it on your local machine instead, run:

# clone the repository
$ git clone git@github.com:gabrielhora/stumble.git
$ cd stumble

# create the virtualenv, activate it and install requirements
$ python -m venv .venv
$ source .venv/bin/activate
$ pip install -r requirements.txt
$ pip install -r requirements-dev.txt

# start containers (postgres, redis and celery worker)
$ docker-compose up -d --build worker

# run the migrations
$ python manage.py migrate

# start the development server
$ python manage.py runserver

Now go to http://localhost:8000.

There is also an example import_news.sh file that you could use to import some URLs.
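
A script like that would essentially just chain the import commands described in the next section; here is a hypothetical sketch (the subreddit, feed URL and tags are placeholders):

#!/usr/bin/env bash
# hypothetical sketch, not the repository's actual import_news.sh;
# it chains the import commands from the section below
set -e
python manage.py reddit -s programming -t technology
python manage.py hackernews
python manage.py rss -u https://example.com/feed.xml -t news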

Import links

You can import links from these online sources:

  • From Reddit:
$ python manage.py reddit -s <subreddit_name> -t <tag>
  • From Hacker News:
# urls are created with "technology" tag
$ python manage.py hackernews
  • From NewsAPI.org:
$ NEWSAPI_API=... python manage.py newsapi -q <query_to_run> -t <tag>
  • From RSS:
$ python manage.py rss -u <rss_url> -t <tag>