This repository has been archived by the owner on Jun 14, 2023. It is now read-only.

Merge pull request #25 from deephdc/release-2
Release 2
alvarolopez committed Feb 10, 2020
2 parents c37b052 + d91cea0 commit 87c2ec2
Showing 41 changed files with 1,014 additions and 1,275 deletions.
119 changes: 79 additions & 40 deletions .gitignore
@@ -1,43 +1,82 @@
# sphinx build folder
_build
build

# Compiled source #
###################
*.com
*.class
*.dll
*.exe
*.o
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]

# C extensions
*.so

# Packages #
############
# it's better to unpack these files and commit the raw source
# git has its own built in compression methods
*.7z
*.dmg
*.gz
*.iso
*.jar
*.rar
*.tar
*.zip

# Logs and databases #
######################
# Distribution / packaging
.Python
env/
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
*.egg-info/
.installed.cfg
*.egg

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*,cover

# Translations
*.mo
*.pot

# Django stuff:
*.log
*.sql
*.sqlite

# OS generated files #
######################
.DS_Store?
ehthumbs.db
Icon?
Thumbs.db

# Editor backup files #
#######################
*~
.*.swp

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# DotEnv configuration
.env

# Database
*.db
*.rdb

# Pycharm
.idea

# VS Code
.vscode/

# Spyder
.spyproject/

# Jupyter NB Checkpoints
.ipynb_checkpoints/

# exclude data from source control by default
#data/

# Mac OS-specific storage files
.DS_Store
1 change: 1 addition & 0 deletions requirements.txt
@@ -1,3 +1,4 @@
sphinx>=1.6.2 # BSD
sphinx-markdown-tables
recommonmark
sphinx_rtd_theme
Binary file modified source/_static/DEEP_WP2-User_Viewpoint.png
Binary file added source/_static/dashboard-configure.png
Binary file added source/_static/dashboard-deployments.png
Binary file added source/_static/dashboard-history-full.png
Binary file added source/_static/dashboard-history.png
Binary file added source/_static/dashboard-home.png
Binary file removed source/_static/deepaas-endpoint.png
Binary file modified source/_static/deepaas.png
Binary file added source/_static/logo-deep-solid-white.png
Binary file removed source/_static/mods_20181015_lstm_6m_1h_1h.png
Binary file removed source/_static/mods_20181018-lstm-3days.png
Binary file added source/_static/paas-dashboard.png
Binary file removed source/_static/seeds1.png
Binary file removed source/_static/seeds2.png
6 changes: 3 additions & 3 deletions source/conf.py
@@ -24,9 +24,9 @@
author = 'DEEP-Hybrid-DataCloud consortium'

# The short X.Y version
version = ''
version = 'DEEP-2 (XXX)'
# The full version, including alpha/beta/rc tags
release = 'DEEP-1 (Genesis)'
release = 'DEEP-2 (XXX)'


# -- General configuration ---------------------------------------------------
@@ -93,7 +93,7 @@
'collapse_navigation': False,
}

html_logo = "_static/logo.png"
html_logo = "_static/logo-deep-solid-white.png"

# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
25 changes: 22 additions & 3 deletions source/index.rst
@@ -7,20 +7,39 @@ User documentation
If you are a user (current or potential) you should start here.

.. toctree::
:maxdepth: 2
:maxdepth: 3

user/index

Component documentation
-----------------------

Individual components' documentation can be found here:

* `DEEPaaS documentation <https://docs.deep-hybrid-datacloud.eu/projects/deepaas/>`_


Technical documentation
-----------------------

If you are searching for technical notes on various notes.
If you are searching for technical notes on various areas, please check the
following section.

.. toctree::
:maxdepth: 3
:maxdepth: 1

technical/index

.. admonition:: Useful project links
:class: important

* `DEEP IAM <https://iam.deep-hybrid-datacloud.eu/>`_
* `DEEP Open Catalog - Marketplace <https://marketplace.deep-hybrid-datacloud.eu/>`_
* DEEP Dashboard (`Training <https://train.deep-hybrid-datacloud.eu/>`_, `Advanced <https://paas.cloud.cnaf.infn.it/>`_)
* `DEEP Nextcloud <https://nc.deep-hybrid-datacloud.eu>`_
* `Official GitHub <https://github.com/deephdc>`_
* `Official DockerHub <https://hub.docker.com/u/deephdc/>`_

Indices and tables
==================

74 changes: 51 additions & 23 deletions source/user/howto/add-to-DEEP-marketplace.rst
@@ -6,63 +6,91 @@ This document describes how to add your trained service or model to the DEEP marketplace
Creating the Github repositories
--------------------------------

You model must have a repository to host the code and a repository to host the Dockerfiles. Both these repositories can he hosted under your personal Github account. Naming conventions are that the Docker repo name is the same as the code repo name with the prefix ``DEEP-OC-``.
Your model must have a repository to host the code and a repository to host the Dockerfiles.
Both these repositories can be hosted under your personal Github account.
Naming conventions require that the Docker repo name is the same as the code repo name with the prefix ``DEEP-OC-``.

A typical example of this can be:

* `deephdc/image-classification-tf <https://github.com/deephdc/image-classification-tf>`_ - The Github repo hosting the code of an image classfication model.
* `deephdc/DEEP-OC-image-classification-tf <https://github.com/deephdc/DEEP-OC-image-classification-tf>`_ - The Github repo hosting the Dockerfiles of the image classification model.
* `deephdc/image-classification-tf <https://github.com/deephdc/image-classification-tf>`_ -
The Github repo hosting the code of an image classification model.
* `deephdc/DEEP-OC-image-classification-tf <https://github.com/deephdc/DEEP-OC-image-classification-tf>`_ -
The Github repo hosting the Dockerfiles of the image classification model.

In case you are only developing a service based on an already existing model (like for example developing an animal classifier based on the image-classification-tf module) you only need to create the Docker repo.
In case you are only developing a service based on an already existing module (like for example developing an animal
classifier based on the image-classification-tf module) you only need to create the Docker repo.

The code repo
^^^^^^^^^^^^^

This is the repo containing the code of your model. If you are adding a service (ie. a trained model) the weights of the trained model must be stored in a location accessible over a network connection, so that your container can download them upon creation. A few MUSTs your code has to comply with in order to ensure compatibility and ease of use:
This is the repo containing the code of your model. If you are adding a service (i.e. a trained model), the weights of
the trained model must be stored in a location accessible over a network connection, so that your container can download
them upon creation.

* your code must be packaged in order to be ``pip`` installable. This should be the default behaviour if you used the :doc:`DEEP cookiecutter template <../overview/cookiecutter-template>` to develop your code.
* your code must be integrated with the DEEPaaS API. Check :ref:`this guide <user/overview/api:Integrate your model with the API>` on how to do this.
A few MUSTs your code has to comply with in order to ensure compatibility and ease of use:

* your code must be packaged in order to be ``pip`` installable. This should be the default behaviour
if you used the :doc:`DEEP Data Science template <../overview/cookiecutter-template>` to develop your code.
* your code must be integrated with the DEEPaaS API.
Check :ref:`this guide <user/overview/api:Integrate your model with the API>` on how to do this.
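
  As a rough sketch of what the API integration amounts to, your package exposes a small set of plain Python
  functions that DEEPaaS discovers and serves over HTTP. The function names and return values below are
  illustrative only — the exact interface is defined in the guide linked above:

  ```python
  # api.py -- hypothetical sketch of a DEEPaaS-style entry point module.
  # Function names and return values are illustrative; check the DEEPaaS
  # documentation for the exact interface your version expects.

  def get_metadata():
      """Basic information the API exposes about the model."""
      return {
          "name": "image-classification-tf",
          "version": "0.1.0",
      }


  def predict(**kwargs):
      """Run inference; a real implementation would load the trained
      weights and return actual predictions."""
      return {"status": "not implemented", "inputs": kwargs}
  ```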

The Docker repo
^^^^^^^^^^^^^^^

This repo has to contain at least the following files (see the `Generic container <https://github.com/deephdc/DEEP-OC-generic-container>`_ for a template):
If you used the :doc:`DEEP Data Science template <../overview/cookiecutter-template>` to develop your code, a
template of this repo should have been created alongside the template of your code.

This repo has to contain at least the following files (see the `Generic container <https://github.com/deephdc/DEEP-OC-generic-container>`_
for a template):

* ``Dockerfile``

This is the file to build a container from your application. If you developed your application from the :doc:`DEEP cookiecutter template <../overview/cookiecutter-template>` you should have a draft of this file under the ``./docker`` folder (although you might need to add additional code depending on the requirements of your model).
This is the file used to build a container from your application. If you developed your application from the
:doc:`DEEP Data Science template <../overview/cookiecutter-template>` you should have a draft of this file
(although you might need to add additional code depending on the requirements of your model).

If you are adding a service instead of a model, it is good practice to draw inspiration from the Dockerfiles of other services derived from the same model (for example the `plant classification Dockerfile <https://github.com/deephdc/DEEP-OC-plants-classification-tf/blob/master/Dockerfile>`_ derived from the `image classification model <https://github.com/deephdc/DEEP-OC-image-classification-tf>`_).
If you are adding a service derived from an existing module, it is good practice to draw inspiration from the
Dockerfiles of the module or the services derived from that module (see for example the
`plant classification Dockerfile <https://github.com/deephdc/DEEP-OC-plants-classification-tf/blob/master/Dockerfile>`_
derived from the `image classification model <https://github.com/deephdc/DEEP-OC-image-classification-tf>`_).

Some steps common to all Dockerfiles include cloning the model code repo, pip installing the DEEPaaS API, installing rclone and downloading the trained weights if you are adding a service. For the details of all these steps please refer to this `Dockerfile example <https://github.com/deephdc/DEEP-OC-plants-classification-tf/blob/master/Dockerfile>`_.
Some steps common to all Dockerfiles include cloning the model code repo, pip installing the DEEPaaS API,
installing rclone and downloading the trained weights if you are adding a service.
For the details of all these steps please refer to this `Dockerfile example <https://github.com/deephdc/DEEP-OC-image-classification-tf/blob/master/Dockerfile>`_.
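
  Put together, a heavily simplified Dockerfile following those common steps might look like the sketch below.
  The base image, repository names and weights URL are placeholders, not the project's actual values — take
  those from the example Dockerfile linked above:

  ```dockerfile
  # Hypothetical sketch -- base image, repo names and URLs are placeholders
  FROM tensorflow/tensorflow:1.14.0-py3

  # Clone and pip-install the model code
  RUN git clone https://github.com/[my-account-name]/[my-app-name] && \
      pip install -e [my-app-name]

  # Install the DEEPaaS API
  RUN pip install deepaas

  # If you are adding a service: download the trained weights (placeholder URL)
  # RUN curl -L -o /srv/weights.tar.gz https://example.org/weights.tar.gz

  # DEEPaaS serves on port 5000 by default
  EXPOSE 5000
  CMD ["deepaas-run", "--listen-ip", "0.0.0.0"]
  ```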

* ``Jenkinsfile``

This is the file that runs the Jenkins pipeline. You can copy this `Jenkinsfile example <https://github.com/deephdc/DEEP-OC-plants-classification-tf/blob/master/Jenkinsfile>`_ replacing the repo names with your own Docker repo name.
This is the file that runs the Jenkins pipeline. You can copy this `Jenkinsfile example <https://github.com/deephdc/DEEP-OC-image-classification-tf/blob/master/Jenkinsfile>`_
replacing the repo names with your own Docker repo name.

* ``metadata.json``

This file contains the information that is going to be displayed in the Marketplace. You can build your own starting from this `metadata.json example <https://github.com/deephdc/DEEP-OC-plants-classification-tf/blob/master/metadata.json>`_
This metadata will be validated during integration tests when the PR is accepted but you can manually `validate the metadata <https://github.com/deephdc/schema4deep>`_ beforehand by running:
This file contains the information that is going to be displayed in the Marketplace. You can build your own starting
from this `metadata.json example <https://github.com/deephdc/DEEP-OC-image-classification-tf/blob/master/metadata.json>`_.
This metadata will be validated during integration tests when the PR is accepted but you can manually
`validate the metadata <https://github.com/deephdc/schema4deep>`_ beforehand by running:

.. code-block:: console

   pip install git+https://github.com/deephdc/schema4apps
   deep-app-schema-validator metadata.json

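
For orientation, a heavily trimmed and purely illustrative ``metadata.json`` could look like the following —
the field names here are assumptions, and the authoritative list of required fields is given by the schema
used in the validation step above:

```json
{
    "title": "My application",
    "summary": "One-line description shown in the Marketplace",
    "license": "Apache-2.0",
    "sources": {
        "code": "https://github.com/[my-account-name]/[my-app-name]",
        "docker_registry_repo": "[my-dockerhub-account]/deep-oc-[my-app-name]"
    }
}
```
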
Making the Pull Request
-----------------------
Making the Pull Request (PR)
----------------------------

Once your repos are set it's time to make a PR to add your model to the marketplace!
For this you have to fork the code of the DEEP marketplace (`deephdc/deephdc.github.io <https://github.com/deephdc/deephdc.github.io>`_) and add your Docker repo name at the end of the ``project_apps.yml`` file in the **pelican** branch.
For this you have to fork the code of the DEEP catalog repo (`deephdc/deep-oc <https://github.com/deephdc/deep-oc>`_)
and add your Docker repo name at the end of the ``MODULES.yml``.

.. code-block:: console

   git clone https://github.com/[my-github-fork]
   cd [my-github-fork]
   echo '- module: https://github.com/[my-account-name]/DEEP-OC-[my-app-name]' >> MODULES.yml
   git commit -a -m "adding new module to the catalogue"
   git push

You can also make the change `online on GitHub <https://github.com/deephdc/deep-oc/edit/master/MODULES.yml>`_.

Once the changes are done, make a PR of your fork to the original repo and wait for approval.
Check the `GitHub Standard Fork & Pull Request Workflow <https://gist.github.com/Chaser324/ce0505fbed06b947d962>`_ in case of doubt.
90 changes: 90 additions & 0 deletions source/user/howto/deploy-orchent.rst
@@ -0,0 +1,90 @@
.. include:: <isonum.txt>

.. highlight:: console

*****************************
Deployment with CLI (orchent)
*****************************

This is a step by step guide on how to make a deployment using the command line interface (instead of the Training
Dashboard).

.. admonition:: Requirements

* `oidc-agent <https://github.com/indigo-dc/oidc-agent/releases>`_ installed and configured for `DEEP-IAM <https://iam.deep-hybrid-datacloud.eu/>`_ (see :ref:`Configure oidc-agent <user/howto/oidc-agent:Configure oidc-agent>`).
* `orchent <https://github.com/indigo-dc/orchent/releases>`_ tool


Prepare your TOSCA file (optional)
----------------------------------

The orchent tool needs a TOSCA YAML file to configure and establish the deployment. One can generate an application-specific TOSCA template or use a general one, `deep-oc-marathon-webdav.yml <https://github.com/indigo-dc/tosca-templates/blob/master/deep-oc/deep-oc-marathon-webdav.yml>`__, while providing the necessary inputs in the bash script (see the next subsection).

If you create your own TOSCA YAML file, the following sections should be modified (TOSCA experts may modify the rest of the template to their will):

* Docker image to deploy. In this case we will be using deephdc/deep-oc-image-classification-tf::

docker_img:
type: string
description: docker image from Docker Hub to deploy
required: yes
default: deephdc/deep-oc-image-classification-tf

* Location of the ``rclone.conf`` (this file can be empty, but should be at the indicated location)::

rclone_conf:
type: string
description: rclone.conf location
required: yes
default: "/srv/image-classification-tf/rclone.conf"

For further TOSCA templates examples you can go `here <https://github.com/indigo-dc/tosca-templates/tree/master/deep-oc>`__.

.. important::
**DO NOT** save the rclone credentials in the **CONTAINER** nor in the **TOSCA** file


Orchent submission script
-------------------------

You can use the general template, `deep-oc-mesos-webdav.yml <https://github.com/indigo-dc/tosca-templates/blob/master/deep-oc/deep-oc-mesos-webdav.yml>`__, and provide the necessary parameters in a bash script. Here is an example of such a script, e.g. *submit_orchent.sh*:

.. code-block:: bash

   #!/bin/bash
   orchent depcreate ./deep-oc-marathon-webdav.yml '{ "docker_image": "deephdc/deep-oc-image-classification-tf",
     "rclone_url": "https://nc.deep-hybrid-datacloud.eu/remote.php/webdav/",
     "rclone_vendor": "nextcloud",
     "rclone_conf": "/srv/image-classification-tf/rclone.conf",
     "rclone_user": <your_nextcloud_username>,
     "rclone_pass": <your_nextcloud_password> }'

This script will be the **only place** where you will have to indicate <your_nextcloud_username> and <your_nextcloud_password>. This file should be stored locally and secured.

.. important::
**DO NOT** save the rclone credentials in the **CONTAINER** nor in the **TOSCA** file

.. tip::
When developing an application with the :ref:`Data Science template <user/overview/cookiecutter-template:DEEP Data Science template>`,
    the DEEP-OC-<your_project> repository will contain an example script named *submit_orchent_tmpl.sh*.

Submit your deployment
----------------------

The submission is then done by running the orchent submission script you generated in the previous step::

./submit_orchent.sh

This will give you a bunch of information including your deployment ID. To check the status of your job::

$ orchent depshow <Deployment ID>

Once your deployment is in status **CREATED**, you will be given various endpoints::

http://deepaas_endpoint
http://monitor_endpoint

N.B.: to check all your deployments::

$ orchent depls -c me
