Mission Control is a monitoring service for Firefox release health. It allows you to view, in near real time, the rate of crashes and other quantitative measures of quality. It uses the dataset generated by the telemetry-streaming library.
## Getting in touch
We welcome contributions to Mission Control! Working on the UI component (see instructions immediately below) does not require any special access to Mozilla's internal systems.
If you’re looking for a way to jump in and contribute, our list of good first issues is a great place to start.
## Instructions for development (UI only)
If you only want to hack on the UI, you can set up a local-only instance of missioncontrol which pulls data from the current production server. You only need to have yarn installed:
```
yarn install
yarn start
```
This should start up a webserver at http://localhost:5000 which you can connect to.
## Instructions for development (full stack)
```
yarn install
cp .env-dist .env
make build
make up
make fixtures
```
After you have brought the environment up, you can run a development version of the server: enter the container with `make shell` and start the server from there. You should then be able to connect to it with your web browser.
By default the environment uses a rather impoverished set of test data, so the environment will not be that interesting. If you have Mozilla credentials, you can set up the `SECRET_KEY` and related variables in a `.env` file to have it pull data from a production dataset. Once you have that set up, you should be able to download a set of recent data from a shell environment (`make shell`) via the `load_measure_data` subcommand. E.g.:

```
./manage.py load_measure_data linux release main_crashes
```
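For illustration, a minimal `.env` might look like the sketch below. Only `SECRET_KEY` is named above; the connection-string variables are the ones listed under deployment, and every value here is a placeholder, not a real credential or host:

```
# .env — placeholder values only; substitute your own settings
SECRET_KEY=change-me-to-a-long-random-string
DATABASE_URL=postgres://postgres:postgres@db:5432/missioncontrol
PRESTO_URL=presto://example-presto-host:8080/hive
```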
## Instructions for running tests

The recommended way of running the tests locally is via the shell environment. After entering it with `make shell`, execute:

```
tox
```
By default all tests and linters are run. Often you just want to run a subset of the Python tests; you can do this by adding arguments to your tox invocation:

```
tox -e tests -- -k tests/test_api.py  # run only tests in test_api.py
```
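A couple of other handy invocations, assuming the runner behind tox is pytest (the `-x` flag and positional file paths are standard pytest features; the paths here are examples):

```
tox -e tests -- tests/test_api.py   # run a single test file by path
tox -e tests -- -x                  # stop after the first failing test
```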
## Instructions for deployment
The target environment for this project follows the dockerflow conventions. In order to run it correctly, a number of environment variables need to be set; the full list can be found in the `web` section of the `docker-compose.yml` file. From a services standpoint, this project requires:
- a Postgres DB to store the application data, defined by `DATABASE_URL`
- a Presto/Athena service, defined by `PRESTO_URL`
- an optional Redis cache service, defined by `CACHE_URL`
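To make the wiring concrete, here is a hedged sketch of what the `web` service's environment block in `docker-compose.yml` might look like. The service hostnames (`db`, `presto`, `redis`) and all values are illustrative assumptions, not the project's actual configuration:

```yaml
web:
  environment:
    - SECRET_KEY=change-me
    - DATABASE_URL=postgres://postgres:postgres@db:5432/missioncontrol
    - PRESTO_URL=presto://presto:8080/hive
    - CACHE_URL=redis://redis:6379/0
```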