ifbcat is the database hosting and serving the IFB Catalogue through a REST API.
Code is formatted using black (https://github.com/psf/black, version 20.8b1). Please use pre-commit along with black so that only well-formatted code is committed:

```shell
# install dependencies
pip install -r requirements-dev.txt
# enable black as a pre-commit hook, which will prevent committing unformatted source
pre-commit install
# run black as it will be run before each commit
pre-commit run black
```
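For reference, a typical pre-commit configuration pinning this black version looks like the following. This is only a sketch; the repository's actual `.pre-commit-config.yaml` may differ:

```yaml
repos:
  - repo: https://github.com/psf/black
    rev: 20.8b1
    hooks:
      - id: black
```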
- Install requirements:
  - docker,
  - docker-compose,
  - postgresql-devel (libpq-dev on Debian/Ubuntu, libpq-devel on CentOS/Cygwin/Babun),
  - a virtual env with the dependencies from requirements.txt:

```shell
virtualenv .venv -p python3
. .venv/bin/activate
pip install -r requirements.txt
```
- Run the DB locally:

```shell
# Copy (and optionally tweak) the ini files
cp resources/default.ini local.ini
cp ifbcat/settings.example.ini ifbcat/settings.ini
docker compose -f docker-compose.yaml -f docker-compose.dev.yaml run db
```
Note that a volume is created. To remove it, run:

```shell
docker compose -f docker-compose.yaml -f docker-compose.dev.yaml down --volumes
```
- Retrieve import data (ask for access to the private repository if needed):

```shell
git clone git@github.com:IFB-ElixirFr/ifbcat-importdata.git import_data
```
- Run tests:

```shell
python manage.py test
```

Currently you should expect to see some "ERROR" lines, but the tests should report "OK" at the end of the log.
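The "OK" verdict comes from the standard unittest runner that Django's test command builds on. A minimal self-contained illustration of that pattern (a hypothetical test, not one from `ifbcat_api`):

```python
import unittest

# Hypothetical test, for illustration only: real tests live in ifbcat_api and
# subclass django.test.TestCase, but they report through the same unittest runner.
class ExampleTest(unittest.TestCase):
    def test_strip_whitespace(self):
        raw = "  IFB Catalogue  "
        self.assertEqual(raw.strip(), "IFB Catalogue")

# Run the case programmatically and check the overall verdict.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ExampleTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("OK" if result.wasSuccessful() else "FAILED")
```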
- Run migrations, create a superuser, do some imports and start the test server:

```shell
python manage.py migrate
python manage.py createsuperuser
python manage.py load_catalog
python manage.py runserver
```
- You can do more imports using the commands available in `ifbcat_api/management/commands`. Some are not currently working properly, but at least the ones below should:

```shell
python manage.py load_catalog
python manage.py load_biotools
```
You can run the webserver within docker-compose, which allows you to work without a fully functional virtualenv (for example, one missing the system library needed by psycopg2-binary). The drawback is that you will not be able to use your IDE's debugger.
- Retrieve data:

```shell
git clone git@github.com:IFB-ElixirFr/ifbcat-importdata.git ./import_data
```
- Build and start the whole compose with dev settings:

```shell
# Copy (and optionally tweak) the ini file
cp resources/default.ini local.ini
docker compose build
docker compose -f docker-compose.yaml -f docker-compose.dev.yaml up -d
```
The webserver is running at http://0.0.0.0:8080 (the instance on port 8000 does not have its CSS served).
- Create a superuser and do some imports:

```shell
docker compose exec web python manage.py migrate
docker compose exec web python manage.py createsuperuser
docker compose exec web python manage.py load_catalog
```
- Do some cleanup. To remove the DB volumes, run:

```shell
docker compose -f docker-compose.yaml -f docker-compose.dev.yaml down --volumes
```

To remove built and pulled images:

```shell
docker compose -f docker-compose.yaml -f docker-compose.dev.yaml down --rmi all
```
```shell
docker exec -e PGPASSWORD=the_super_password $(docker ps -q) pg_dump --clean -h localhost -U postgres --format plain | sed "s/pbkdf2_sha256[^\t]*/redacted/g" > my_dump.sql
```
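The `sed` filter in the command above strips Django's stored password hashes from the dump before it is written to disk. The same transformation, sketched in Python (the example row is made up):

```python
import re

def redact_password_hashes(dump_text: str) -> str:
    # Django stores passwords as "pbkdf2_sha256$<iterations>$<salt>$<hash>".
    # Like the sed expression above, replace everything from "pbkdf2_sha256"
    # up to the next tab (pg_dump's plain-format column separator) with "redacted".
    return re.sub(r"pbkdf2_sha256[^\t]*", "redacted", dump_text)

# Example row from a plain-format dump (tab-separated columns):
row = "1\tpbkdf2_sha256$260000$salt$hash\tadmin@example.org"
print(redact_password_hashes(row))
```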
We assume here that no containers are started. Get the dump, uncompress it in the root directory of the project, and name it `data`.

```shell
docker stop $(docker ps -q)
docker compose -f docker-compose.yaml -f docker-compose.dev.yaml run -d db
docker exec -e PGPASSWORD=the_super_password $(docker ps -q) psql -h localhost -U postgres -f /code/data
```
All of the following assumes that you are already on the server and have sudo rights.

```shell
sudo service ifbcat restart
```
- Restart stops the service, flushes old images from disk, rebuilds the images and starts the service. It cleans up the server and helps in case of a "server full" error.

```shell
cd /var/ifbcat-src
sudo git pull
sudo service ifbcat restart
```
- Reload rebuilds the service before restarting it; it does not remove old images, so it is faster.

```shell
cd /var/ifbcat-src/ && sudo git pull && sudo service ifbcat reload
```
```shell
cd /var/ifbcat-src
sudo docker compose exec web python manage.py createsuperuser
```
First, copy the data:

```shell
# go into your local git clone of ifbcat-importdata, and then:
rsync -avz . catalogue-ifb:/var/ifbcat-importdata/ --exclude=".git"
```
```shell
cd /var/ifbcat-src
sudo docker compose -f docker-compose.yaml -f docker-compose.import.yaml run web python manage.py load_users
sudo docker compose -f docker-compose.yaml -f docker-compose.import.yaml run web python manage.py load_biotools
```
Or run all imports:

```shell
sudo docker compose -f docker-compose.yaml -f docker-compose.import.yaml run web python manage.py load_catalog
```
To export the models to an image, you have to:
- uncomment django_extensions in settings.py
- and then run:

```shell
pip install pygraphviz django-extensions
python manage.py graph_models -a -g -o ifb_cat.png
python manage.py graph_models -a -g -o ifb_cat.svg
```