This repository has been archived by the owner on Jun 14, 2023. It is now read-only.

add quickstart guide, different way of writing sections in users index
valentin.kozlov committed Jan 10, 2019
1 parent 24d5d33 commit 151f6cd
Showing 3 changed files with 120 additions and 23 deletions.
18 changes: 12 additions & 6 deletions source/user/index.rst
@@ -3,8 +3,11 @@ User documentation

.. todo:: Documentation is being written at this moment.

Quickstart
----------

.. raw:: html

<h2>Quickstart guide</h2>


If you want a quickstart guide, please check the following link.

@@ -15,8 +18,10 @@ If you want a quickstart guide, please check the following link.
try-model-locally
develop-model

Overview
---------
.. raw:: html

<h2>Overview</h2>


More in-depth documentation, with a detailed description of the architecture and
components, is provided in the following sections.
@@ -26,8 +31,9 @@ components is provided in the following sections.

overview/index

Examples
--------
.. raw:: html

   <h2>Examples</h2>


The following sections provide information on how several deep learning models
have been developed and integrated with our platform.
125 changes: 108 additions & 17 deletions source/user/quickstart.rst
@@ -1,5 +1,6 @@
Quickstart guide
----------------
=================
Quickstart Guide
=================

.. todo:: Provide information on (at least):

@@ -12,26 +13,116 @@ Quickstart guide
4. Create a container from a model


Integrate a model with the API
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Download model from the marketplace
-----------------------------------

The `DEEPaaS API <https://github.com/indigo-dc/DEEPaaS>`_ enables a user friendly interaction with the underlying Deep
Learning models and can be used both for training and inference with the models.
#. Go to the `DEEP Open Catalog <https://deephdc.github.io/>`_
#. `Browse <https://deephdc.github.io/#model-list>`_ available models
#. Find the model and get it either from `Docker hub <https://hub.docker.com/u/deephdc>`_ (easy) or `github <https://github.com/topics/deep-hybrid-datacloud>`_ (pro)


Run downloaded model locally
----------------------------

.. _docker-hub-way:

Docker Hub way (easy)
^^^^^^^^^^^^^^^^^^^^^

**Prerequisites:** either `docker <https://docs.docker.com/install/#supported-platforms>`_
(+ `nvidia-docker <https://github.com/nvidia/nvidia-docker/wiki/Installation-(version-2.0)>`_ for GPU support) or
`udocker <https://github.com/indigo-dc/udocker/releases>`_ (GPU support built in)

1. To run the Docker container directly from Docker Hub and start using the `API <https://github.com/indigo-dc/DEEPaaS>`_ simply run the following:

Via docker command::

$ docker run -ti -p 5000:5000 deephdc/deep-oc-model_of_interest

With GPU support::

    $ nvidia-docker run -ti -p 5000:5000 deephdc/deep-oc-model_of_interest

Via udocker::

    $ udocker run -p 5000:5000 deephdc/deep-oc-model_of_interest

Via udocker with GPU support::

    $ udocker pull deephdc/deep-oc-model_of_interest
    $ udocker create --name=model_of_interest deephdc/deep-oc-model_of_interest
    $ udocker setup --nvidia model_of_interest
    $ udocker run -p 5000:5000 model_of_interest

2. To access the downloaded model via the `API <https://github.com/indigo-dc/DEEPaaS>`_, direct your web browser to http://127.0.0.1:5000

For more details on a particular model, please read the :doc:`model <models/index>` documentation.

.. note:: udocker is entirely a user tool, i.e. it can be installed and used without any root privileges, e.g. in a user environment on an HPC cluster.
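Once the container from step 1 is running, step 2 can also be checked programmatically. A minimal Python sketch (it assumes the container is listening on ``127.0.0.1:5000``; that the root URL answers is an assumption about the DEEPaaS version in use):

```python
# Minimal sketch: probe the API endpoint exposed by the running container.
# Assumes the container from step 1 publishes port 5000 on 127.0.0.1.
import urllib.request
import urllib.error


def check_api(url: str = "http://127.0.0.1:5000/", timeout: float = 5.0):
    """Return the HTTP status code, or None when the API is not reachable."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except (urllib.error.URLError, OSError):
        return None  # container not running or port not published


if __name__ == "__main__":
    status = check_api()
    print("API reachable" if status else "API not reachable")
```
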

Github way (pro)
^^^^^^^^^^^^^^^^

**Prerequisites:** `docker <https://docs.docker.com/install/#supported-platforms>`_

The GitHub way allows you, for example, to modify the Dockerfile to include additional packages.

1. Clone the DEEP-OC-model_of_interest github repository::

$ git clone https://github.com/indigo-dc/DEEP-OC-model_of_interest

2. Build the container::

$ cd DEEP-OC-model_of_interest
$ docker build -t deephdc/deep-oc-model_of_interest .

3. Run the container using one of the methods described above in :ref:`docker-hub-way`

.. note:: One can also clone the source code of the model, usually located in the ``model_of_interest`` repository.


Integrate your model with the API
---------------------------------

.. image:: ../_static/deepaas.png

The `DEEPaaS API <https://github.com/indigo-dc/DEEPaaS>`_ enables a user friendly interaction with the underlying Deep
Learning models and can be used both for training and inference with the models. Check the full :doc:`API guide <overview/api>` for the detailed info.

An easy way to integrate your model with the API and create Dockerfiles for building the Docker image with the integrated
`DEEPaaS API <https://github.com/indigo-dc/DEEPaaS>`_ is to use our :doc:`cookiecutter-data-science <overview/cookiecutter-template>` template.
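A sketch of what such an integration looks like in code: the method names below (``get_metadata``, ``predict_data``) are among those the API can call on a model, but the bodies here are placeholders, not real integration code; the authoritative signatures come from the API guide and the cookiecutter template.

```python
# Hypothetical sketch of the entry points a model exposes to the DEEPaaS API.
# The method names mirror those listed in the API guide; the bodies are
# placeholders only.


def get_metadata():
    """Return descriptive metadata about the model."""
    return {
        "name": "model_of_interest",  # placeholder model name
        "version": "0.1.0",
        "description": "Example model integrated with the DEEPaaS API",
    }


def predict_data(data):
    """Run inference on raw input data and return predictions."""
    # A real model would deserialize `data` and call the network here.
    return {"predictions": [], "input_size": len(data)}
```
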


Run model on DEEP Infrastructures
---------------------------------

**Prerequisites:**

* `DEEP-IAM <https://iam.deep-hybrid-datacloud.eu/>`_ registration
* `oidc-agent <https://github.com/indigo-dc/oidc-agent/releases>`_ installed and configured for `DEEP-IAM <https://iam.deep-hybrid-datacloud.eu/>`_
* `orchent <https://github.com/indigo-dc/orchent/releases>`_ tool

If you are going to use `DEEP-Nextcloud <https://nc.deep-hybrid-datacloud.eu>`_ you also have to:

* Register at `DEEP-Nextcloud <https://nc.deep-hybrid-datacloud.eu>`_
* Include `rclone <https://rclone.org/install/>`_ installation in your Dockerfile (see :doc:`rclone howto <howto/rclone>`)
* Include call to rclone in your code (see :doc:`rclone howto <howto/rclone>`)
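The "call to rclone in your code" from the last bullet can be as simple as shelling out to the binary installed in the Dockerfile. A minimal sketch (the remote name ``deep-nextcloud`` and the paths are hypothetical examples; the real names come from your rclone configuration, see the rclone howto):

```python
# Minimal sketch: invoke rclone from model code via subprocess.
# The remote name "deep-nextcloud" and all paths are hypothetical examples.
import subprocess


def rclone_copy_cmd(src: str, dst: str) -> list:
    """Build the rclone argument list without executing it."""
    return ["rclone", "copy", src, dst]


def run_rclone(src: str, dst: str) -> int:
    """Execute rclone and return its exit code."""
    return subprocess.run(rclone_copy_cmd(src, dst)).returncode


# Example (not executed here): fetch training data from Nextcloud
# run_rclone("deep-nextcloud:/data", "/srv/data")
```
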

In order to submit your job to DEEP Infrastructures, one has to create a TOSCA YAML file; for some examples,
please see `here <https://github.com/indigo-dc/tosca-templates/tree/master/deep-oc>`_.

The submission is then done via::

    $ orchent depcreate ./topology-orchent.yml '{}'

If you also want to access `DEEP-Nextcloud <https://nc.deep-hybrid-datacloud.eu>`_ from your container via rclone,
you can create the following bash script for job submission::

    #!/bin/bash
    orchent depcreate ./topology-orchent.yml '{ "rclone_url": "https://nc.deep-hybrid-datacloud.eu/remote.php/webdav/",
        "rclone_vendor": "nextcloud",
        "rclone_user": <your_nextcloud_username>,
        "rclone_pass": <your_nextcloud_password> }'

To :ref:`integrate your model with the API <user/overview/api:Integrate your model with the API>` you need
to define the :ref:`API methods <user/overview/api:Methods>` on your model.
Those methods can (optionally) include any of the following:

* :ref:`get_metadata <api-methods_get-metadata>`
* :ref:`get_train_args <api-methods_get-train-args>`
* :ref:`train <api-methods_train>`
* :ref:`predict_file <api-methods_predict-file>`
* :ref:`predict_data <api-methods_predict-data>`
* :ref:`predict_url <api-methods_predict-url>`

To test the API locally, install the API with ``pip install deepaas`` and run it with ``deepaas-run --listen-ip 0.0.0.0``.
Check the full :doc:`API guide <overview/api>` for more info.

To check the status of your job::

    $ orchent depshow <Deployment ID>
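The parameter string passed to ``orchent depcreate`` is JSON, which is easy to get wrong when quoting by hand in a shell script. A small sketch (the key names mirror the submission script above; the credential values are placeholders) that builds it programmatically:

```python
# Build the JSON parameter string for `orchent depcreate`, mirroring the
# rclone keys used in the submission script above. Credential values are
# placeholders to be supplied by the user.
import json


def orchent_params(user: str, password: str) -> str:
    """Return a valid JSON string of rclone deployment parameters."""
    params = {
        "rclone_url": "https://nc.deep-hybrid-datacloud.eu/remote.php/webdav/",
        "rclone_vendor": "nextcloud",
        "rclone_user": user,
        "rclone_pass": password,
    }
    return json.dumps(params)


if __name__ == "__main__":
    print(orchent_params("<your_nextcloud_username>", "<your_nextcloud_password>"))
```
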
