Create a virtual environment, run `pip install -e .`, and then run `pytest`. If the test cases pass, you should be good to start developing.
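The steps above can be sketched as a short shell session, assuming a POSIX shell and a Python 3 interpreter, run from the repository root (the `venv` directory name is an arbitrary choice):

```shell
python3 -m venv venv       # create the virtual environment
source venv/bin/activate   # activate it
pip install -e .           # install the project in editable mode
pytest                     # run the test suite
```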
The project contains the following core folders:

- `credentials/`: The other secrets folder
- `blueno/`: Shared code for our ML platform
- `dashboard/`: Code for the ELVO App Engine dashboard
- `data/`: Contains all downloaded data. Data used in the code should be stored in the cloud.
- `docs/`: Documentation for the project.
- `etl/`: All data pipeline scripts; code which should be scheduled on Airflow
- `logs/`: For storing application logs
- `ml/`: All ML-specific scripts; contains the blueno ML toolkit as well
- `models/`: For storing trained ML models (HDF5, etc.)
- `notebooks/`: Contains all notebooks.
- `secrets/`: Store your secrets in this folder, so they don't get uploaded to GitHub.
`dashboard`, `etl`, and `ml` should be seen as top-level Python projects. This means each folder should contain top-level scripts, with packages as sub-folders. As our codebase is small, we are keeping them in a single repo, but there are plans to separate these in the future.
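As an illustration of this convention, a hypothetical layout for `etl/` might look like the following (all file and package names below are made up, not actual project files):

```
etl/
├── run_pipeline.py    # top-level script, invoked directly
└── pipeline/          # package holding shared code, imported by the scripts
    ├── __init__.py
    └── transform.py
```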
To contribute, create a pull request. Every PR should be reviewed by at least one other person. See this gist for a guide on how to review a PR.
For developing on GPUs, see the `docs/` folder for more info.