The new Mobile Oxford
This repository contains the (server-side) JSON API.
Documentation is available at Read the Docs, and also at /docs in the repo; to build it locally you need the required packages (Sphinx) installed. You can then generate HTML documentation by running
make html inside the /docs directory.
Requirements
- Solr 4.4
- pip
How to run
Install the required packages:
pip install -r requirements.txt
Running the application
Start a Celery worker:
celery worker --app moxie.worker
Then start the development server:
python runserver.py
Options available for runserver.py
- run with a profiler:
python runserver.py --profiler
- specify the logging level (INFO by default):
python runserver.py --log-level DEBUG
Periodically, and before the first run, you have to run importers to load data into the search index. You can do this via a Python shell:
>>> from moxie.places.tasks import import_all
>>> import_all.delay()
Deploying with Fabric
- Add your public SSH key to /srv/moxie/.ssh/authorized_keys
- Execute the fabric script on your local machine, which will then connect to the designated server and run the pre-programmed tasks:
fab deploy_api:GIT_BRANCH -g USER@GATEWAY -H moxie@PRIVATE_IP
For example:
fab deploy_api:master -g firstname.lastname@example.org -H email@example.com
- Optional: Use an ssh_config file to define the gateway and provide aliases for machines behind the front-end server, along with the user to connect as. The -g flag and username then become unnecessary, and memorable hostnames can be used instead of IP addresses:
fab deploy_api:master -H mox-api-1.oucs.ox.ac.uk
See puppet/fabric/ubuntu-ssh/ssh_config for examples.
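For illustration, a minimal ssh_config along those lines might look like the following. The hostnames and usernames here are invented; the file in puppet/fabric/ubuntu-ssh/ssh_config is authoritative.

```
# Hypothetical ssh_config sketch -- real values live in puppet/fabric/ubuntu-ssh/ssh_config
Host gateway
    HostName gateway.example.org
    User firstname.lastname

# A machine behind the front-end server, reached via the gateway
Host mox-api-1.oucs.ox.ac.uk
    User moxie
    ProxyCommand ssh -W %h:%p gateway
```

With this in place, fab deploy_api:master -H mox-api-1.oucs.ox.ac.uk works without the -g flag or an explicit username.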
Running a specific task
Sometimes it's necessary to trigger a specific task on the production server. First, set up a tunnel to the solr server - see 'Examining the remote solr collections' below.
Running tasks can be done as follows:
- Log into the front server
- From here, log into the tasks server
- Set environment variable for moxie settings
- Launch a python shell
- Configure basic logging. You may want to set the log level as below; if not, call basicConfig() with no arguments
>>> import logging
>>> logging.basicConfig(level=logging.DEBUG)
- Import and run the task(s) e.g.
>>> from moxie.places.tasks import import_naptan
>>> import_naptan()
- If the data looks correct (check Solr via your tunnel), swap the Solr cores: the moxie.places.tasks import tasks import into places_staging by default, so the data needs promoting into production
>>> from moxie.places.tasks import swap_places_cores
>>> swap_places_cores()
Examining the remote solr collections
The best way to do this is via the Solr web interface, by tunnelling to the Solr server through the front server. Note the -A flag, which forwards your local identity for use in the subsequent connection.
ssh-add ~/.ssh/id_rsa
ssh -A -L 8080:localhost:8080 firstname.lastname@example.org
Then from the mox-api-front shell, as the moxie user (it will prompt for the ubuntu user's password):
ssh -A -L 8080:localhost:8080 email@example.com
Then point your browser at localhost:8080.
Local Development with Docker
The repository includes a set of Dockerfiles and a docker-compose file which together allow a local development environment to be launched, including:
- the web app server
- solr
- redis (some further work possibly required to make this work correctly)
- postgres (some further work possibly required to make this work correctly)
This process isn't quite as streamlined as we would wish, but hopefully these instructions are clear enough to make it manageable; it remains the best way to conduct local development.
Note, this uses solr 5.3, whereas the production API currently uses 4.x, so there may be some differences.
By default, this includes only the modules which are part of this core repository (namely places and transport; see below for how to include other modules).
This guide assumes that Docker is installed, with a local host set up (don't forget to run eval $(docker-machine env default)!).
Start the dev environment with:
docker-compose up
This will build the required images and start the containers.
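As a rough sketch, the compose file ties those services together along the following lines. Image tags and port numbers (other than Solr 5.3, which the text mentions) are assumptions; the repository's own docker-compose file is authoritative.

```yaml
# Hypothetical docker-compose sketch; service names match the moxie_web_1 /
# moxie_solr_1 container names used in the docker exec commands
web:
  build: .
  links:
    - solr
    - redis
    - postgres
solr:
  image: solr:5.3
  ports:
    - "8983:8983"
redis:
  image: redis
postgres:
  image: postgres
```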
Configure solr collection
At this point the solr instance does not have any collections configured. At present this must be done manually as follows:
Log into the solr container
docker exec -it moxie_solr_1 /bin/bash
Run the command to initialise the collection
/opt/solr/bin/solr create_core -c <collection_name> -d <collection config directory>
where the collection name is as specified in the docker/app_settings.yaml file, and the config is the relevant path in solr/config (which is mounted to /opt/solr/moxie_config/config).
e.g. to initialise the two collections required by the places API, invoke:
/opt/solr/bin/solr create_core -c places_staging -d /opt/solr/moxie_config/config/places/
/opt/solr/bin/solr create_core -c places_production -d /opt/solr/moxie_config/config/places/
(these both share the same config)
Run an import task
To run an import task, first attach to the web container:
docker exec -it moxie_web_1 /bin/bash
Then initiate a python session:
python
It's then necessary to initialise a logger:
import logging
logging.basicConfig()
Then the relevant task can be imported and run e.g.
from moxie.places.tasks import import_oxpoints
import_oxpoints()
... or for a module added externally:
from moxie_courses.tasks import import_xcri_ox
import_xcri_ox()
Note: if Python code is changed, you must quit the Python session and start a new one for the change to take effect.
Adding additional modules
Moxie's architecture consists of a core with various additional modules bolted on. To add an additional module to the dockerized environment, a few steps are required.
Copy the code for the module into a subfolder of the core moxie directory. The name must use underscores instead of hyphens (e.g. moxie_events, not moxie-events).
Add an install command to the main Dockerfile e.g.
RUN pip install -e ./moxie_events
Add the relevant sections to the docker/app_settings.yaml file. An example config used in production can be found in the ox-it/puppet repository on GitHub. Locations for redis/solr should be amended to use the container names rather than IP addresses.
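As an illustration of that last point, a service URL in docker/app_settings.yaml might change along these lines. The key names here are hypothetical; consult the production example in the ox-it/puppet repository for the real schema.

```yaml
# Hypothetical fragment: point services at compose container names, not IPs
# before: backend_uri: 'http://192.0.2.10:8983/solr/places_production'
backend_uri: 'http://solr:8983/solr/places_production'
redis_url: 'redis://redis:6379/0'
```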
Add any required Solr config to the solr directory. It will then be available when running the Solr commands that create the collections.
Conduct any other necessary setup with a script invoked from the Dockerfile. An example is provided for the notifications API, which needs to initialise some database tables.
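Putting the steps above together, the Dockerfile additions for a hypothetical moxie_events module might look like the following; the setup script name is invented for illustration.

```dockerfile
# Copy the module source into the image and install it in editable mode
COPY moxie_events ./moxie_events
RUN pip install -e ./moxie_events

# Any further one-off setup, e.g. initialising database tables
COPY docker/setup_events.sh /tmp/setup_events.sh
RUN chmod +x /tmp/setup_events.sh && /tmp/setup_events.sh
```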