License: This project is placed in the public domain.
You'll need the App Engine Python SDK version 1.9.15 or later (for `vendor` support) or the Google Cloud SDK (aka `gcloud`). Add it to your `$PYTHONPATH`, e.g. `export PYTHONPATH=$PYTHONPATH:/usr/local/google_appengine`, and then run:
```shell
virtualenv local
source local/bin/activate
pip install -r requirements.freeze.txt

# We install gdata in source mode, and App Engine doesn't follow .egg-link
# files, so add a symlink to it.
ln -s ../../../src/gdata/src/gdata local/lib/python2.7/site-packages/gdata
ln -s ../../../src/gdata/src/atom local/lib/python2.7/site-packages/atom

python -m unittest discover
```
The last command runs the unit tests. If you send a pull request, please include (or update) a test for the new functionality if possible!
To run the entire app locally, run this in the repo root directory:
```shell
dev_appserver.py --log_level debug app.yaml background.yaml
```
If setup fails, you may hit errors like these:

```
bash: ./bin/easy_install: ...bad interpreter: No such file or directory
ImportError: cannot import name certs
ImportError: No module named dev_appserver
ImportError: cannot import name tweepy
File ".../site-packages/tweepy/auth.py", line 68, in _get_request_token
  raise TweepError(e)
TweepError: must be _socket.socket, not socket
error: option --home not recognized
```
There's a good chance you'll need to make changes to granary, oauth-dropins, or webmention-tools at the same time as bridgy. To do that, clone their repos elsewhere, then install them in "source" mode with:
```shell
pip uninstall -y oauth-dropins
pip install -e <path to oauth-dropins>
ln -s <path to oauth-dropins>/oauth_dropins \
  local/lib/python2.7/site-packages/oauth_dropins

pip uninstall -y granary
pip install -e <path to granary>
ln -s <path to granary>/granary \
  local/lib/python2.7/site-packages/granary

pip uninstall -y webmentiontools  # webmention-tools isn't in pypi
ln -s <path to webmention-tools>/webmentiontools \
  local/lib/python2.7/site-packages/webmentiontools
```
The symlinks are necessary because App Engine's `vendor` module evidently doesn't follow `.pth` files. :/
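For background, here's a small, self-contained sketch of what a `.pth` file does: `site.addsitedir()` reads `*.pth` files in a directory and appends each listed path to `sys.path`, while a plain `sys.path` entry does not. A loader that skips `.pth` processing (as the note above says App Engine's `vendor` module does) can't see packages installed in source mode, hence the symlinks. The `demo_pkg` module and temp directories here are invented for illustration:

```python
# Demonstrate .pth file handling: a package dir referenced only via a .pth
# file is invisible to plain sys.path entries, but visible after
# site.addsitedir() processes the .pth file.
import os
import site
import sys
import tempfile

site_dir = tempfile.mkdtemp()  # stand-in for site-packages
pkg_dir = tempfile.mkdtemp()   # stand-in for a source checkout elsewhere

# A module that lives outside site_dir, reachable only through the .pth file.
with open(os.path.join(pkg_dir, 'demo_pkg.py'), 'w') as f:
    f.write('VALUE = 42\n')

# The .pth file lists the extra directory to add to sys.path.
with open(os.path.join(site_dir, 'demo.pth'), 'w') as f:
    f.write(pkg_dir + '\n')

sys.path.append(site_dir)      # plain sys.path entry: the .pth is ignored
assert pkg_dir not in sys.path

site.addsitedir(site_dir)      # processes demo.pth, adding pkg_dir
assert pkg_dir in sys.path

import demo_pkg
print(demo_pkg.VALUE)          # -> 42
```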
To deploy to App Engine, run the deploy script.
`remote_api_shell` is a useful interactive Python shell that can interact with the production app's datastore, memcache, etc. To use it, create a service account and download its JSON credentials, put it somewhere safe, and put its path in your `GOOGLE_APPLICATION_CREDENTIALS` environment variable.
Adding a new silo
So you want to add a new silo? Maybe MySpace, or Friendster, or even Tinder? Great! Here are the steps to do it. It looks like a lot, but it's not that bad, honest.
- Find the silo's API docs and check that it can do what Bridgy needs. At minimum, it should be able to get a user's posts and their comments, likes, and reposts, depending on which of those the silo supports. If you want publish support, it should also be able to create posts, comments, likes, reposts, and/or RSVPs.
- Fork and clone this repo.
- Create an app (aka client) in the silo's developer console, grab your app's id (aka key) and secret, put them into new local files in the repo root dir, following this pattern. You'll eventually want to send them to @snarfed and @kylewm too, but no hurry.
- Add the silo to oauth-dropins if it's not already there:
- Add the silo to granary:
- Add a new `.py` file for your silo. Follow the existing examples. At minimum, you'll need to implement `get_activities_response` and convert your silo's API data to ActivityStreams.
- Add a new unit test file and write some tests!
- Add it to `index.html` and the README.
- Add a new
- Add the silo to Bridgy:
- Add a new `.py` file for your silo with a model class. Follow the existing examples.
- Add it to `handlers.py` (just import the module).
- Add a 48x48 PNG icon to
- Add a new template in `templates/` and add the silo to `index.html`. Follow the existing examples.
- Add the silo to `about.html` and this README.
- If users' profile picture URLs can change, add a cron job that updates them in `cron.yaml`. Also add the model class to the datastore backup job there.
- Add a new
- Optionally add publish support:
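To make the granary step above concrete, here's a hedged sketch of the conversion a new source implements: take the silo's native API post JSON and return an ActivityStreams activity. The silo-side field names (`body`, `author_name`, `permalink`, etc.) are invented for illustration, and granary's real `Source` base class provides much more than this:

```python
# Hypothetical sketch of converting a silo API post to ActivityStreams.
# All silo-side field names here are made up; consult the real silo API.

def post_to_activity(post):
    """Converts one hypothetical silo API post dict to an AS1 activity."""
    return {
        'objectType': 'activity',
        'verb': 'post',
        'id': 'tag:example-silo.com:%s' % post['id'],
        'actor': {
            'objectType': 'person',
            'displayName': post['author_name'],
        },
        'object': {
            'objectType': 'note',
            'content': post['body'],
            'published': post['created_at'],
            'url': post['permalink'],
        },
    }

def get_activities_response(posts):
    """Wraps converted activities in a response envelope."""
    return {'items': [post_to_activity(p) for p in posts]}
```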
Good luck, and happy hacking!
For alerting, we've set up Google Cloud Monitoring (née Stackdriver). Background in issue 377. It sends alerts by email and SMS when HTTP 4xx responses average >0.1 qps or 5xx >0.05 qps, latency averages >15s, or instance count averages >5 over the last 15m window.
Export the full datastore to Google Cloud Storage. Include all entities except `*Auth` and other internal details. Check to see if any new kinds have been added since the last time this command was run.
```shell
gcloud datastore export --async gs://brid-gy.appspot.com/stats/ --kinds Blogger,BlogPost,BlogWebmention,FacebookPage,Flickr,GitHub,GooglePlusPage,Instagram,Medium,Publish,PublishedPage,Response,SyndicatedPost,Tumblr,Twitter,WordPress
```
`--kinds` is required. From the export docs: *Data exported without specifying an entity filter cannot be loaded into BigQuery.*
Wait for it to be done with `gcloud datastore operations list | grep done`.
```shell
for kind in BlogPost BlogWebmention Publish Response SyndicatedPost; do
  bq load --replace --nosync --source_format=DATASTORE_BACKUP \
    datastore.$kind gs://brid-gy.appspot.com/stats/all_namespaces/kind_$kind/all_namespaces_kind_$kind.export_metadata
done

for kind in Blogger FacebookPage Flickr GitHub GooglePlusPage Instagram Medium Tumblr Twitter WordPress; do
  bq load --replace --nosync --source_format=DATASTORE_BACKUP \
    sources.$kind gs://brid-gy.appspot.com/stats/all_namespaces/kind_$kind/all_namespaces_kind_$kind.export_metadata
done
```
Check the jobs with `bq ls -j`, then wait for them with `bq wait`.
Run the full stats BigQuery query. Download the results as CSV.
Open the stats spreadsheet. Import the CSV, replacing the data sheet.
Check out the graphs! Save full size images with OS or browser screenshots, thumbnails with the Save Image button. Then post them!
The datastore is automatically backed up by an App Engine cron job that runs Datastore managed export (details) and stores the results in Cloud Storage, in the brid-gy.appspot.com bucket. It backs up weekly and includes all entities except `SyndicatedPost`, since those make up 92% of all entities by size and they aren't as critical to keep.
We use this command to set a Cloud Storage lifecycle policy on that bucket that prunes older backups:
```shell
gsutil lifecycle set cloud_storage_lifecycle.json gs://brid-gy.appspot.com
```
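The repo's `cloud_storage_lifecycle.json` isn't reproduced here, but a GCS lifecycle policy that prunes old objects has this general shape (the 180-day threshold is an invented example, not necessarily Bridgy's actual setting):

```json
{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"age": 180}
    }
  ]
}
```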
Run this to see how much space we're currently using:
```shell
gsutil du -hsc gs://brid-gy.appspot.com/*
```
Run this to download a single complete backup:
```shell
gsutil -m cp -r gs://brid-gy.appspot.com/weekly/datastore_backup_full_YYYY_MM_DD_* .
```