A library for developing APIs following the HAL spec

# moxie


The new Mobile Oxford

This repository contains the (server-side) JSON API.

## Documentation

Available at Read the Docs.

The documentation source is also available under `docs/` in the repository; to build it, you need to install the required packages (Sphinx) from `requirements_dev.txt`.

You can generate HTML documentation by running `make html` inside the `docs` directory.
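
That is (Sphinx's Makefile conventionally writes the HTML output to `docs/_build/html`):

```shell
pip install -r requirements_dev.txt
cd docs
make html
```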

## Requirements

- Solr 4.4
- Redis
- libgeos2
- pip (`easy_install pip`)

## How to run

### Installation

- `pip install -r requirements.txt`

### Running the application

- `celery worker --app moxie.worker`
- `python runserver.py`

#### Options available for `runserver.py`

- run with a profiler: `python runserver.py --profiler`
- specify the logging level (INFO by default): `python runserver.py --log-level DEBUG`
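
Putting the above together, a minimal local run (assuming Redis and Solr from the requirements are already running, and backgrounding the worker rather than using a separate terminal) looks like:

```shell
pip install -r requirements.txt
celery worker --app moxie.worker &
python runserver.py --log-level DEBUG
```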

Before first use, and periodically thereafter, you have to run the importers to load data into the search index. You can do this via a Python shell:

```python
>>> from moxie.places.tasks import import_all
>>> import_all.delay()
```

## Deploying with Fabric

Steps:

- Add your public SSH key to `/srv/moxie/.ssh/authorized_keys`
- Execute the fabric script on your local machine, which will then connect to the designated server and run the pre-programmed tasks:

```shell
fab deploy_api:GIT_BRANCH -g USER@GATEWAY -H moxie@PRIVATE_IP
```

For example:

```shell
fab deploy_api:master -g martinfilliau@mox-api-front.oucs.ox.ac.uk -H moxie@192.168.2.102
```

- Optional: use an `ssh_config` file to define the gateway to machines behind the front-end server, aliases for those machines, and the user to connect as. The `-g` flag and username then become unnecessary, and memorable hostnames can be used instead of IP addresses:

```shell
fab deploy_api:master -H mox-api-1.oucs.ox.ac.uk
```

See `puppet/fabric/ubuntu-ssh/ssh_config` for examples.
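
For illustration, an `ssh_config` entry along these lines would make the short form work (hostnames are taken from the example above; the gateway user and exact directives are assumptions):

```
Host mox-api-1.oucs.ox.ac.uk
    User moxie
    ProxyCommand ssh USER@mox-api-front.oucs.ox.ac.uk -W %h:%p
```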

## Running a specific task

Sometimes it's necessary to trigger a specific task on the production server. This can be done as follows:

1. Log into the tasks server as the `moxie` user.
2. Set the environment variable for the moxie settings:

   ```shell
   export MOXIE_SETTINGS=/srv/moxie/tasks_settings.yaml
   ```

3. Launch a Python shell:

   ```shell
   /srv/moxie/python-env/bin/python
   ```

4. Configure basic logging:

   ```python
   >>> import logging
   >>> logging.basicConfig()
   ```

5. Import and run the task, e.g.:

   ```python
   >>> from moxie.places.tasks import import_naptan
   >>> import_naptan()
   ```
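
The steps above can also be sketched as a single non-interactive invocation on the tasks server (same paths and task as above):

```shell
export MOXIE_SETTINGS=/srv/moxie/tasks_settings.yaml
/srv/moxie/python-env/bin/python - <<'EOF'
import logging
logging.basicConfig()
from moxie.places.tasks import import_naptan
import_naptan()
EOF
```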

## Examining the remote solr collections

The best way to do this is via the solr web interface, by tunnelling to the solr server through the front server.

```shell
sudo ssh -L 8080:localhost:8080 moxie@mox-api-front.oucs.ox.ac.uk
```

Then from the `mox-api-front` shell:

```shell
sudo ssh -L 8080:localhost:8080 ubuntu@192.168.2.104
```

Then point your browser at `localhost:8080`.

## Local Development with Docker

The repository includes a set of Dockerfiles and a `docker-compose` file which together allow a local development environment to be launched, comprising:

- web app server
- redis (some further work possibly required to make this work correctly)
- postgres (some further work possibly required to make this work correctly)
- solr

This process isn't quite as streamlined as we would like, but hopefully these instructions are clear enough to make it manageable; it is the best way to conduct local development.

Note: this uses Solr 5.3, whereas the production API currently uses 4.x, so there may be some differences.

By default, this includes only the modules which are part of this core repository (namely places and transport; see below for how to include other modules).

This guide assumes that Docker is installed, with a local host set up. (Don't forget to run `eval $(docker-machine env default)`!)

Start the dev environment with:

```shell
docker-compose up
```

This will build the required images and start the containers.

### Configure solr collection

At this point the solr instance does not have any collections configured. At present this must be done manually as follows:

Log into the solr container:

```shell
docker exec -it moxie_solr_1 /bin/bash
```

Run the command to initialise the collection:

```shell
/opt/solr/bin/solr create_core -c <collection_name> -d <collection_config_directory>
```

where the collection name is as specified in the `docker/app_settings.yaml` file, and the config is the relevant path in `solr/config` (which is mounted to `/opt/solr/moxie_config/config`).

e.g. to initialise the two collections required by the places API, invoke:

```shell
/opt/solr/bin/solr create_core -c places_staging -d /opt/solr/moxie_config/config/places/
/opt/solr/bin/solr create_core -c places_production -d /opt/solr/moxie_config/config/places/
```

(these both share the same config)

### Run an import task

To run an import task, first attach to the web container:

```shell
docker exec -it moxie_web_1 /bin/bash
```

Then initiate a Python session:

```shell
python
```

It's then necessary to initialise a logger:

```python
import logging
logging.basicConfig()
```

Then the relevant task can be imported and run, e.g.:

```python
from moxie.places.tasks import import_oxpoints
import_oxpoints()
```

... or for a module added externally:

```python
from moxie_courses.tasks import import_xcri_ox
import_xcri_ox()
```

Note: if Python code is changed, it is necessary to quit the Python session and start a new one in order for the change to take effect.

### Adding additional modules

Moxie's architecture consists of a core, with various additional modules bolted on. Adding a module to the dockerized environment requires a few steps:

1. Copy the code for the module into a subfolder of the core moxie directory. The name must use underscores instead of hyphens.

2. Add an install command to the main `Dockerfile`, e.g. `RUN pip install -e ./moxie_events`.

3. Add the relevant sections to the `docker/app_settings.yaml` file. An example config used in production can be found in the github/ox-it/puppet repository. Locations for redis/solr should be amended to use the container names rather than IP addresses.

4. Add any required solr config to the `solr` directory. It will then be available when running the solr commands to create the collections.

5. Conduct any other necessary setup with a script in the `Dockerfile`. An example is provided for the notifications API, which needs to initialise some database tables.
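
As a sketch of steps 1 and 2 for a hypothetical `moxie_events` module (the `COPY` line and its destination are assumptions; the install line matches the example above):

```dockerfile
COPY moxie_events ./moxie_events
RUN pip install -e ./moxie_events
```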