Commit
Merge branch 'user-docs'
alvarolopez committed Jan 14, 2019
2 parents 3a9417a + 865496d commit bf1f474
Showing 29 changed files with 1,406 additions and 8 deletions.
Binary file added source/_static/OpenID_Connect.png
Binary file added source/_static/deepaas-endpoint.png
Binary file added source/_static/deepaas.png
Binary file added source/_static/nc-access.png
Binary file added source/_static/nc-folders.png
Binary file added source/_static/rocky-release-logo.png
Binary file added source/_static/seeds1.png
Binary file added source/_static/seeds2.png
13 changes: 8 additions & 5 deletions source/conf.py
@@ -40,6 +40,8 @@
# ones.
extensions = [
'sphinx.ext.todo',
'sphinx_markdown_tables',
'sphinx.ext.autosectionlabel'
]

# Add any paths that contain templates here, relative to this directory.
@@ -50,10 +52,6 @@
}


extensions = [
'sphinx_markdown_tables',
]

# The suffix(es) of source filenames.
# You can specify multiple suffix as a list of string:
#
@@ -176,4 +174,9 @@
# -- Options for todo extension ----------------------------------------------

# If true, `todo` and `todoList` produce output, else they produce nothing.
todo_include_todos = True
todo_include_todos = False
todo_emit_warnings = True

# -- Options for autosectionlabel extension ----------------------------------------------

autosectionlabel_prefix_document = True
2 changes: 1 addition & 1 deletion source/index.rst
@@ -7,7 +7,7 @@ User documentation
If you are a user (current or potential) you should start here.

.. toctree::
:maxdepth: 3
:maxdepth: 2

user/index

149 changes: 149 additions & 0 deletions source/user/howto/develop-model.rst
@@ -0,0 +1,149 @@
.. highlight:: console

*******************************************
Develop a model using the DEEP UC template
*******************************************


1. Prepare DEEP UC environment
------------------------------


Install cookiecutter (if not yet done)
::

$ pip install cookiecutter

Run the DEEP UC cookiecutter template
::

$ cookiecutter https://github.com/indigo-dc/cookiecutter-data-science

Answer all questions from the DEEP UC cookiecutter template, paying attention to
``repo_name``, i.e. the name of your GitHub repositories, etc.
This creates two project directories:
::

~/DEEP-OC-your_project
~/your_project

Go to ``github.com/your_account`` and create the corresponding repositories:
``DEEP-OC-your_project`` and ``your_project``.
Run ``git push origin master`` in both created directories. This pushes your initial code to GitHub.


2. Improve the initial code of the model
----------------------------------------

The structure of ``your_project``, created using the
`DEEP UC template <https://github.com/indigo-dc/cookiecutter-data-science>`_, contains
the following core items needed to develop a DEEP UC model:
::

requirements.txt
data/
models/
{{repo_name}}/dataset/make_dataset.py
{{repo_name}}/features/build_features.py
{{repo_name}}/models/model.py


2.1 Install development requirements
======================================

Modify ``requirements.txt`` according to your needs (e.g. add more libraries), then run
::

$ pip install -r requirements.txt

You can modify existing source files and add new ones, placing them
in the corresponding directories of the project structure.


2.2 Make datasets
=================

The source files in this directory are meant to manipulate the raw datasets.
The output of this step is still raw data, but cleaned and/or pre-formatted.
::

{{repo_name}}/dataset/make_dataset.py
{{repo_name}}/dataset/
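
As an illustration only (not part of the generated template), a cleaning step in
``make_dataset.py`` might look like the following sketch. The file names, the columns
and the use of ``pandas`` are assumptions to be adapted to your own data:

.. code-block:: python

    from pathlib import Path

    import pandas as pd


    def clean_raw_data(raw_path: Path, output_path: Path) -> None:
        """Read a raw CSV file, drop incomplete and duplicated rows, save the result."""
        df = pd.read_csv(raw_path)
        df = df.dropna()            # remove rows with missing values
        df = df.drop_duplicates()   # remove exact duplicates
        output_path.parent.mkdir(parents=True, exist_ok=True)
        df.to_csv(output_path, index=False)


    if __name__ == "__main__":
        # hypothetical paths following the data/ layout of the template
        clean_raw_data(Path("data/raw/dataset.csv"), Path("data/processed/dataset.csv"))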


2.3 Build features
===================

This step takes the output of the previous step (`Make datasets`) and
creates training, test and validation ML data from the cleaned and pre-formatted raw data.
How this step is realised depends on the concrete use case, the aim of the application
and the available technology (e.g. high-performance support for data processing).
::

{{repo_name}}/features/build_features.py
{{repo_name}}/features/
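
As a purely illustrative sketch (scikit-learn and the file names below are assumptions,
not part of the template), ``build_features.py`` could split the cleaned data into
training, validation and test sets like this:

.. code-block:: python

    import pandas as pd
    from sklearn.model_selection import train_test_split


    def split_dataset(input_csv: str, output_dir: str, seed: int = 42) -> None:
        """Split the cleaned data into train (80%), validation (10%) and test (10%) sets."""
        df = pd.read_csv(input_csv)
        train, rest = train_test_split(df, test_size=0.2, random_state=seed)
        val, test = train_test_split(rest, test_size=0.5, random_state=seed)
        train.to_csv(f"{output_dir}/train.csv", index=False)
        val.to_csv(f"{output_dir}/val.csv", index=False)
        test.to_csv(f"{output_dir}/test.csv", index=False)


    if __name__ == "__main__":
        split_dataset("data/processed/dataset.csv", "data/processed")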


2.4 Develop models
==================

This step deals with the most interesting phase in ML, i.e. modelling.
The core of a DEEP UC model is located in ``model.py``,
which contains the DEEP entry point implementations.
DEEP entry points are defined using :ref:`API methods <user/overview/api:Methods>`.
You do not need to implement all of them, only the ones you need.
::

{{repo_name}}/models/model.py
{{repo_name}}/models/
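
The following minimal sketch shows the general idea. The function names below only
mimic the pattern of the API methods; the exact names and signatures your DEEPaaS
version expects are the ones listed in the :ref:`API methods <user/overview/api:Methods>`
reference, and the bodies here are placeholders:

.. code-block:: python

    def get_metadata():
        """Return basic information about the model."""
        return {
            "name": "your_project",
            "version": "0.1.0",
            "summary": "Short description of what the model does",
        }


    def predict_data(data):
        """Run inference on data sent by the user (placeholder logic)."""
        # load the trained model from models/ and return its prediction here
        return {"status": "not implemented"}


    def train(args):
        """Train the model with the given arguments (placeholder logic)."""
        # read the data produced by build_features.py and fit your model here
        return {"status": "not implemented"}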


3. Create a Docker container for your model
--------------------------------------------

Once your model is in place, you can encapsulate it by creating a Docker container. For this you need to create a Dockerfile. This file contains the instructions for building the Docker image, including the base operating system you want to run on and the packages that need to be installed for your model to run.

The simplest Dockerfile could look like this::

FROM ubuntu:18.04

WORKDIR /srv

# Install the system packages needed to fetch and install the model
RUN apt-get update && \
apt-get install -y git python3-pip wget

# Download and install your model package
RUN git clone https://github.com/your_git/your_model_package && \
cd your_model_package && \
python3 -m pip install -e . && \
cd ..

# Download and install the DEEPaaS API
RUN git clone https://github.com/indigo-dc/DEEPaaS.git && \
cd DEEPaaS && \
python3 -m pip install -U . && \
cd ..

# Install rclone
RUN wget https://downloads.rclone.org/rclone-current-linux-amd64.deb && \
dpkg -i rclone-current-linux-amd64.deb && \
apt install -f && \
rm rclone-current-linux-amd64.deb && \
apt-get clean && \
rm -rf /var/lib/apt/lists/* && \
rm -rf /root/.cache/pip/* && \
rm -rf /tmp/*

# Expose API on port 5000 and tensorboard on port 6006
EXPOSE 5000 6006

CMD deepaas-run --listen-ip 0.0.0.0


For more details on rclone or on the DEEPaaS API you can check :doc:`here <rclone>` and `here <https://github.com/indigo-dc/DEEPaaS>`_, respectively.

If you want to see an example of a more complex Dockerfile, you can find one `here <https://github.com/indigo-dc/DEEP-OC-image-classification-tf/blob/master/Dockerfile>`_.

To build the image from the Dockerfile, choose a name for the container and run the ``docker build`` command::

$ docker build -t your_container_name -f Dockerfile .
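
Once the image is built, a quick way to check the container is to run it locally
(e.g. with ``docker run -ti -p 5000:5000 your_container_name``) and query the DEEPaaS
API from a small script. The endpoint path below is an assumption and may differ
between DEEPaaS versions; the Swagger page served by the API on port 5000 lists the
actual endpoints:

.. code-block:: python

    import requests

    base_url = "http://localhost:5000"

    # assumed endpoint for listing the models loaded by DEEPaaS;
    # check the Swagger UI at base_url for the definitive paths
    response = requests.get(f"{base_url}/models")
    print(response.status_code)
    print(response.json())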


You can then upload it to Docker Hub so that the pre-built image can be pulled and run directly. To do so, follow the instructions `here <https://docs.docker.com/docker-hub/repos/>`_.

17 changes: 17 additions & 0 deletions source/user/howto/index.rst
@@ -0,0 +1,17 @@
========
HowTos
========

.. toctree::
:maxdepth: 1
:glob:

Use rclone <rclone>
Develop a model <develop-model>
Install and configure oidc-agent <oidc-agent>
Train a model locally <train-model-locally>
Train a model remotely <train-model-remotely>
Test a service locally <try-service-locally>
Use Openstack API with OIDC tokens <oidc-auth>
Video demos <video-demos>

77 changes: 77 additions & 0 deletions source/user/howto/oidc-agent.rst
@@ -0,0 +1,77 @@
.. highlight:: console

********************************
Install and configure oidc-agent
********************************

1. Installing oidc-agent
------------------------
oidc-agent is a tool to manage OpenID Connect tokens and make them easily usable from the command line. Installation instructions and full documentation can be found `here <https://indigo-dc.gitbooks.io/oidc-agent/>`_.

2. Configuring oidc-agent with DEEP-IAM
---------------------------------------------------

.. admonition:: Prerequisites

* `DEEP-IAM <https://iam.deep-hybrid-datacloud.eu/>`_ registration


* Start oidc-agent::

$ eval $(oidc-agent)

* Run::

$ oidc-gen

You will be asked for the name of the account to configure. Let's call it **deep-iam**.
After that, you will be asked for the additional client-name-identifier; you should choose the option::

[2] https://iam.deep-hybrid-datacloud.eu/

Then just press Enter to accept the default values for the space-delimited list of scopes (``openid profile offline_access``).

* After that, if everything has worked properly, you should see the following messages::

Registering Client ...
Generating account configuration ...
accepted
* At this point you will be given a URL. You should visit it in the browser of your choice in order to continue and approve the registered client.
* To do so, you will have to log in to your DEEP-IAM account and accept the permissions you are asked for.

* Once you have done this you will see the following message::

The generated account config was successfully added to oidc-agent. You don't have to run oidc-add

The next time you want to start oidc-agent from scratch, you will only have to run::

$ eval $(oidc-agent)
oidc-add deep-iam
Enter encryption password for account config deep-iam: ********
success

* You can print the token::

$ oidc-token deep-iam
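
As an illustration, the token printed by ``oidc-token`` can be used as a standard
OAuth2 bearer token. The sketch below calls a hypothetical DEEP-IAM protected service
(the URL is a placeholder) using the token obtained from oidc-agent:

.. code-block:: python

    import subprocess

    import requests

    # ask oidc-agent for a valid access token for the deep-iam account
    token = subprocess.run(
        ["oidc-token", "deep-iam"], capture_output=True, text=True, check=True
    ).stdout.strip()

    # placeholder URL of a service protected with DEEP-IAM
    response = requests.get(
        "https://some-protected-service.example.org/resource",
        headers={"Authorization": f"Bearer {token}"},
    )
    print(response.status_code)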


2.1 Usage with orchent
======================

* You should set ``OIDC_SOCK`` (this is not needed if you have already done it in the current session)::

$ eval $(oidc-agent)
oidc-add deep-iam

* Set the agent account to be used with orchent::

$ export ORCHENT_AGENT_ACCOUNT=deep-iam

* You also need to set ``ORCHENT_URL``, e.g.::

$ export ORCHENT_URL="https://deep-paas.cloud.cnaf.infn.it/orchestrator"





6 changes: 4 additions & 2 deletions source/user/oidc-auth.rst → source/user/howto/oidc-auth.rst
@@ -85,5 +85,7 @@ following page:

.. _Openstack CLI: https://docs.openstack.org/python-openstackclient/rocky/cli/command-list.html

.. |Using Openstack API| image:: rocky-release-logo.png
.. |with OIDC tokens| image:: OpenID_Connect.png
.. |Using Openstack API| image:: ../../_static/rocky-release-logo.png
:scale: 50%
.. |with OIDC tokens| image:: ../../_static/OpenID_Connect.png
:scale: 20 %
