A library for developing APIs following the HAL spec
The new Mobile Oxford

This repository contains the (server-side) JSON API.


Documentation is available on Read the Docs.

The documentation source is also available in /docs in the repository; to build it, you need to install the required packages (Sphinx) from requirements_dev.txt.

You can generate HTML documentation by running make html inside the docs directory.


Requirements

  • Solr 4.4
  • Redis
  • libgeos2
  • pip (easy_install pip)

How to run


  • pip install -r requirements.txt

Running the application

  • celery worker --app moxie.worker
  • python runserver.py

Options available for runserver.py

  • run with a profiler: python runserver.py --profiler
  • specify the logging level (INFO by default): python runserver.py --log-level DEBUG

The first time you run the application, and periodically thereafter, you have to run importers to load data into the search index. You can do this via a Python shell:

>>> from moxie.places.tasks import import_all
>>> import_all.delay()

Deploying with Fabric


  • Add your public SSH key to /srv/moxie/.ssh/authorized_keys
  • Execute the fabric script on your local machine, which will then connect to the designated server and run the pre-programmed tasks:

fab deploy_api:GIT_BRANCH -g USER@GATEWAY -H moxie@PRIVATE_IP

For example:

fab deploy_api:master -g martinfilliau@mox-api-front.oucs.ox.ac.uk -H moxie@

  • Optional: Use an ssh_config file to define the gateway to machines behind the front-end server, provide aliases for them, and set the user to connect as. The -g flag and username then become unnecessary, and memorable hostnames can be used instead of IP addresses:

fab deploy_api:master -H mox-api-1.oucs.ox.ac.uk

See puppet/fabric/ubuntu-ssh/ssh_config for examples.
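As a rough sketch only (the `gateway` alias, the `PRIVATE_IP` placeholder, and the layout are illustrative assumptions; the real examples live in the puppet repository), such a configuration might look like:

```
# Hypothetical sketch - see puppet/fabric/ubuntu-ssh/ssh_config for real examples
# Gateway (front-end) server
Host gateway
    HostName mox-api-front.oucs.ox.ac.uk
    User moxie

# Machine behind the front-end, reached through the gateway
Host mox-api-1.oucs.ox.ac.uk
    HostName PRIVATE_IP
    User moxie
    ProxyCommand ssh -W %h:%p gateway
```

With an entry like this, fab deploy_api:master -H mox-api-1.oucs.ox.ac.uk resolves the gateway and user automatically.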

Running a specific task

Sometimes it's necessary to trigger a specific task on the production server. First, set up a tunnel to the solr server - see 'Examining the remote solr collections' below.

Running tasks can be done as follows:

  1. Log into the front server
ssh moxie@mox-api-front.oucs.ox.ac.uk
  2. From here, log into the tasks server
ssh ubuntu@
  3. Set the environment variable for moxie settings
export MOXIE_SETTINGS=/srv/moxie/tasks_settings.yaml
  4. Launch a python shell
python
  5. Configure basic logging. You may want to set the log level as below; if not, just leave the brackets empty
>>> import logging
>>> logging.basicConfig(level=logging.DEBUG)
  6. Import and run the task(s), e.g.
>>> from moxie.places.tasks import import_naptan
>>> import_naptan()
  7. If the data looks correct (check Solr via your tunnel), swap the Solr cores: the moxie.places.tasks import tasks import into places_staging by default, so the data needs promoting to production
>>> from moxie.places.tasks import swap_places_cores
>>> swap_places_cores()

Examining the remote solr collections

The best way to do this is via the Solr web interface, by tunnelling to the Solr server through the front server. Note the -A flag, which forwards the local identity for use in the subsequent connection.

ssh-add ~/.ssh/id_rsa
ssh -A -L 8080:localhost:8080 moxie@mox-api-front.oucs.ox.ac.uk

Then from the mox-api-front shell, as the moxie user (it will prompt for the ubuntu user's password):

ssh -A -L 8080:localhost:8080 ubuntu@

Then point your browser at localhost:8080/solr

Local Development with Docker

The repository includes a set of Dockerfiles and a docker-compose file which together allow a local development environment to be launched. This environment includes:

  • web app server
  • redis (some further work possibly required to make this work correctly)
  • postgres (some further work possibly required to make this work correctly)
  • solr

This process isn't quite as streamlined as we would wish, but hopefully these instructions are clear enough to make it manageable; it is the best way to conduct local development.

Note: this uses Solr 5.3, whereas the production API currently uses 4.x, so there may be some differences.

By default, this includes only the modules which are part of this core repository (namely places and transport); see below for how to include other modules.

This guide assumes that docker is installed, with a local host set up. (Don't forget to run eval $(docker-machine env default)!)

Start the dev environment with

docker-compose up

This will build the required images and start the containers.

Configure solr collection

At this point the solr instance does not have any collections configured. At present this must be done manually as follows:

Log into the solr container

docker exec -it moxie_solr_1 /bin/bash

Run the command to initialise the collection

/opt/solr/bin/solr create_core -c <collection_name> -d <collection config directory>

where the collection name is as specified in the docker/app_settings.yaml file, and the config is the relevant path in solr/config (which is mounted to /opt/solr/moxie_config/config).

e.g. to initialise the two collections required by the places API, invoke:

/opt/solr/bin/solr create_core -c places_staging -d /opt/solr/moxie_config/config/places/
/opt/solr/bin/solr create_core -c places_production -d /opt/solr/moxie_config/config/places/

(these both share the same config)

Run an import task

To run an import task, first attach to the web container:

docker exec -it moxie_web_1 /bin/bash

Then initiate a python session

python

It's then necessary to initialise a logger

import logging
logging.basicConfig(level=logging.DEBUG)
Then the relevant task can be imported and run, e.g.

from moxie.places.tasks import import_oxpoints
import_oxpoints()

... or for a module added externally:

from moxie_courses.tasks import import_xcri_ox
import_xcri_ox()

Note: If python code is changed, it is necessary to quit the python session and enter a new one, in order for the change to take effect.
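The restart is needed because Python caches imported modules for the lifetime of the interpreter. As an illustrative aside (the demo_task module below is a throwaway stand-in, not part of moxie), importlib.reload can refresh a single already-imported module in place:

```python
import importlib
import pathlib
import sys
import tempfile

# Avoid bytecode caching so reload always recompiles from source
sys.dont_write_bytecode = True

# Create a throwaway module on disk purely to illustrate import caching
tmpdir = tempfile.mkdtemp()
pathlib.Path(tmpdir, "demo_task.py").write_text("VALUE = 1\n")
sys.path.insert(0, tmpdir)

import demo_task
print(demo_task.VALUE)  # 1

# Simulate editing the source file while the session is still running
pathlib.Path(tmpdir, "demo_task.py").write_text("VALUE = 2\n")
print(demo_task.VALUE)  # still 1: the cached module object is unchanged

importlib.reload(demo_task)
print(demo_task.VALUE)  # 2: reload re-executes the updated source
```

Reloading only affects the named module; code that did `from module import name` elsewhere keeps the old object, so quitting and restarting the session remains the more reliable option.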

Adding additional modules

Moxie's architecture consists of a core, with various additional modules bolted on. In order to add an additional module to the dockerized environment, there are a few steps required.

  1. Copy the code for the module into a subfolder of the core moxie directory. The name must use underscores instead of hyphens

  2. Add an install command to the main Dockerfile e.g. RUN pip install -e ./moxie_events

  3. Add relevant sections to the docker/app_settings.yaml file. An example config used in production can be found in the github/ox-it/puppet repository. Locations for redis/solr should be amended to use the container names, rather than IP addresses.

  4. Add any required solr config to the solr directory. It will then be available when running the solr commands to create the collections.

  5. Conduct any other necessary setup with a script run from the Dockerfile. An example is provided for the notifications API, which needs to initialise some database tables.
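For step 3, the exact keys depend on the module, and the real schema lives in the puppet repository; the fragment below is purely hypothetical. The point it illustrates is that service locations should reference the docker-compose service names (which resolve as hostnames inside the compose network) rather than IP addresses:

```
# Hypothetical fragment - key names, ports, and structure are illustrative,
# not moxie's actual settings schema. Note "solr" and "redis" are assumed
# docker-compose service names, used in place of IP addresses.
places:
  services:
    search:
      backend_uri: http://solr:8983/solr/places_production
    kv:
      backend_uri: redis://redis:6379/0
```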