This repository has been archived by the owner on Jun 14, 2023. It is now read-only.

Merge pull request #32 from deephdc/ignacio-br0
cleaning up
laramaktub committed Jun 23, 2022
2 parents 02d567a + b67d152 commit 670c586
Showing 5 changed files with 69 additions and 51 deletions.
7 changes: 6 additions & 1 deletion README.md
@@ -2,4 +2,9 @@
This repository contains software documentation, guides, tutorials, logbooks
and similar documents produced within the DEEP Hybrid DataCloud project.

Please refer to http://docs.deep-hybrid-datacloud.eu/en/user-docs/ for more information
This documentation is deployed at: http://docs.deep-hybrid-datacloud.eu/

If you want to build the documentation locally for development, run:
```console
make html
```
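If Sphinx is not installed yet, a minimal sketch of setting up a local build environment first (the requirements file name is an assumption about this repository's layout):
```console
# create an isolated environment and install the documentation dependencies
python3 -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt  # assumed name of the Sphinx requirements file
make html
```
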
17 changes: 11 additions & 6 deletions source/user/howto/develop-model.rst
@@ -10,8 +10,8 @@ Run :doc:`the DEEP Modules Template <../overview/cookiecutter-template>`.
This creates two project directories:
::

~/DEEP-OC-your_project
~/your_project
~/DEEP-OC-your_project
~/your_project

Go to ``github.com/your_account`` and create the corresponding repositories: ``DEEP-OC-your_project`` and ``your_project``.
Do ``git push origin --all`` in both created directories. This puts your initial code on GitHub.
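A minimal sketch of that last step, using the two directories created by the template:

.. code-block:: console

   cd ~/your_project
   git push origin --all
   cd ~/DEEP-OC-your_project
   git push origin --all
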
@@ -47,7 +47,8 @@ Once you are fine with the state of your module, push the changes to Github.
Editing ``DEEP-OC-your_project`` code
-------------------------------------

This is the repo in charge of creating a single docker image to use your application.
This is the repo in charge of creating a single docker image that integrates
your application, along with deepaas and any other dependency.
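
If you want to sanity-check this image while working on it, you can build it locally; a minimal sketch, with an illustrative image name:

.. code-block:: console

   cd ~/DEEP-OC-your_project
   docker build -t your_dockerhub_user/deep-oc-your_project .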

You need to modify the following files according to your needs:

@@ -74,9 +75,11 @@ Once you are fine with the state of your module, push the changes to Github.
Integrating the module in the Marketplace
-----------------------------------------

Once your repos are set it's time to make a PR to add your model to the marketplace!
Once your repos are set, it's time to make a PR to add your model to the marketplace!

For this you have to fork the code of the DEEP catalog repo (`deephdc/deep-oc <https://github.com/deephdc/deep-oc>`_)
and add your Docker repo name at the end of the ``MODULES.yml``.
You can do this directly `online on GitHub <https://github.com/deephdc/deep-oc/edit/master/MODULES.yml>`_ or via the command line:

.. code-block:: console
@@ -86,7 +89,9 @@ and add your Docker repo name at the end of the ``MODULES.yml``.
git commit -a -m "adding new module to the catalogue"
git push
You can also make it `online on GitHub <https://github.com/deephdc/deep-oc/edit/master/MODULES.yml>`_.

Once the changes are done, make a PR of your fork to the original repo and wait for approval.
Check the `GitHub Standard Fork & Pull Request Workflow <https://gist.github.com/Chaser324/ce0505fbed06b947d962>`_ in case of doubt.

When your module gets approved, you may need to commit and push a change to ``metadata.json``
in ``DEEP-OC-your_project`` (`ref <https://github.com/deephdc/DEEP-OC-demo_app/blob/726e068d54a05839abe8aef741b3ace8a078ae6f/Jenkinsfile#L104>`__)
so that the Pipeline is run for the first time, and your module gets rendered in the marketplace.
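A minimal sketch of such a trigger commit (which field of ``metadata.json`` you touch is up to you; bumping a version or date field is typical):

.. code-block:: console

   cd ~/DEEP-OC-your_project
   # edit metadata.json (e.g. bump a version/date field), then:
   git commit -a -m "Update metadata.json to trigger the pipeline"
   git push
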
8 changes: 4 additions & 4 deletions source/user/index.rst
@@ -22,7 +22,7 @@ User documentation
New to the project? How about a quick dive?

.. toctree::
:maxdepth: 1
:maxdepth: 2

Quickstart <quickstart>

@@ -33,7 +33,7 @@ A more in depth documentation, with detailed description on the architecture and
components is provided in the following sections.

.. toctree::
:maxdepth: 1
:maxdepth: 2

DEEP architecture <overview/architecture>
User roles and workflows <overview/user-roles>
@@ -48,7 +48,7 @@ Use a model (basic user)
^^^^^^^^^^^^^^^^^^^^^^^^

.. toctree::
:maxdepth: 1
:maxdepth: 2

Perform inference locally <howto/inference-locally>

@@ -61,7 +61,7 @@ Train a model (intermediate user)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

.. toctree::
:maxdepth: 1
:maxdepth: 2

Train a model locally <howto/train-model-locally>
Train a model remotely <howto/train-model-remotely>
9 changes: 5 additions & 4 deletions source/user/overview/api.rst
@@ -32,14 +32,14 @@ For example:
* **Enable model weights preloading**: implement ``warm``.
* **Enable model info**: implement ``get_metadata``.

If you don't feel like reading the DEEPaaS docs (you should), here are some
examples of files you can drawn inspiration from:
If you don't feel like reading the DEEPaaS docs (which you should), here are some
examples of files you can quickly draw inspiration from:

* `returning a JSON response <https://github.com/deephdc/demo_app/blob/master/demo_app/api.py>`__
for ``predict()``.
* `returning a file (eg. image, zip, etc) <https://github.com/deephdc/demo_app/blob/return-files/demo_app/api.py>`__
for ``predict()``.
* a `more complex example <https://github.com/deephdc/image-classification-tf/blob/master/imgclas/api.py>`__ which also includes ``train``.
* a `more complex example <https://github.com/deephdc/image-classification-tf/blob/master/imgclas/api.py>`__ which also includes ``train()`` with monitoring.

.. tip::
Try to keep your module's code as decoupled as possible from DEEPaaS code, so that
@@ -49,7 +49,8 @@ examples of files you can drawn inspiration from:

.. code-block:: python
import utils # this is where your true predict function is
#api.py
import utils # eg. this is where your true predict function is
def predict(**kwargs):
args = preprocess(kwargs) # transform deepaas input to your standard input
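Once ``predict()`` is wired up, a quick way to check the integration is to launch the API locally and query it; a minimal sketch, assuming the DEEPaaS defaults for the command-line options and the V2 endpoints:

.. code-block:: console

   deepaas-run --listen-ip 0.0.0.0 --listen-port 5000
   # in another terminal, list the models exposed by the API
   curl http://localhost:5000/v2/models/
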
79 changes: 43 additions & 36 deletions source/user/overview/cookiecutter-template.rst
@@ -1,3 +1,4 @@
.. include:: <isonum.txt>
.. highlight:: console

DEEP Modules Template
@@ -29,19 +30,19 @@ and then run the `cookicutter <https://cookiecutter.readthedocs.io>`_ tool as fo
You are first provided with an ``[Info]`` line about each parameter, and on the next line you configure that parameter.
You will be asked to configure:

* Remote URL to host your new repositories (git), e.g. https://github.com/deephdc, ``git_base_url``
* Project name, ``project_name``
* Name of your new repository, to be added after \"git_base_url\" (see above)", ``repo_name`` (aka <your_project> in the following)
* Author name(s) (and/or your organization/company/team). If many, separate by comma, ``author_name``
* E-Mail(s) of main author(s) (or contact person). If many, separate by comma, ``author_email``
* Short description of the project, ``description``
* Application version (expects X.Y.Z (Major.Minor.Patch)), ``app_version``
* Choose open source license, default is MIT. For more info: https://opensource.org/licenses, ``open_source_license``
* User account at hub.docker.com, e.g. 'deephdc' in https://hub.docker.com/u/deephdc, ``dockerhub_user``
* Docker image your Dockerfile starts from (FROM <docker_baseimage>) (don't provide the tag here), e.g. tensorflow/tensorflow, ``docker_baseimage``
* CPU tag for the baseimage, e.g. 2.9.1. Has to match python3!, ``baseimage_cpu_tag``
* GPU tag for the baseimage, e.g. 2.9.1-gpu. Has to match python3!, ``baseimage_gpu_tag``
* whether you want to receive updates if your model fails to build, ``failure_notify``
* ``git_base_url``: Remote URL to host your new git repositories (e.g. https://github.com/deephdc ).
* ``project_name``: Project name.
* ``repo_name``: Name of your new repository, to be added after ``git_base_url`` (see above) (aka <your_project> in the following).
* ``author_name``: Author name(s) (and/or your organization/company/team). If many, separate by comma.
* ``author_email``: E-Mail(s) of main author(s) (or contact person). If many, separate by comma.
* ``description``: Short description of the project.
* ``app_version``: Application version (expects X.Y.Z (Major.Minor.Patch)).
* ``open_source_license``: Choose open source license, default is MIT. `More info <https://opensource.org/licenses>`__.
* ``dockerhub_user``: User account at hub.docker.com, e.g. 'deephdc' in https://hub.docker.com/u/deephdc .
* ``docker_baseimage``: Docker image your Dockerfile starts from (``FROM <docker_baseimage>``); don't provide the tag here (e.g. tensorflow/tensorflow).
* ``baseimage_cpu_tag``: CPU tag for the baseimage, e.g. 2.9.1. Has to match python3!
* ``baseimage_gpu_tag``: GPU tag for the baseimage, e.g. 2.9.1-gpu. Has to match python3!
* ``failure_notify``: whether you want to receive updates if your model fails to build.
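
For reference, a minimal sketch of installing and invoking the cookiecutter tool that asks these questions (the template URL is left as a placeholder here):

.. code-block:: console

   pip install cookiecutter
   cookiecutter <template_repository_url>  # placeholder for the DEEP template URL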

When these questions are answered, the following two repositories will be created locally and immediately linked to your ``git_base_url``:

@@ -57,19 +58,18 @@ Project structure
Based on the branch you choose, the template will create different files, with master being the most minimal option (see above).
The content of these files is populated based on your answers to the questions.

Master branch
^^^^^^^^^^^^^
**Master branch**

.. code-block::
.. code-block:: console
<your_project>
##############
├── LICENSE <- License file
├── README.md <- The top-level README for developers using this project.
├── requirements.txt <- The requirements file for reproducing the analysis environment, e.g.
generated with `pip freeze > requirements.txt`
├── requirements.txt <- The requirements file for reproducing the analysis
environment (`pip freeze > requirements.txt`)
├── setup.py, setup.cfg <- makes project pip installable (pip install -e .) so
│ {{cookiecutter.repo_name}} can be imported
@@ -97,10 +97,9 @@ Master branch
└─ metadata.json <- Defines information propagated to the DEEP Marketplace
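As noted in the tree above, ``setup.py``/``setup.cfg`` make the project pip installable; a minimal sketch of verifying that, assuming your ``repo_name`` answer was ``your_project``:

.. code-block:: console

   cd ~/your_project
   pip install -e .
   python -c "import your_project"
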
Advanced branch
^^^^^^^^^^^^^^^
**Advanced branch**

.. code-block::
.. code-block:: console
<your_project>
##############
@@ -111,24 +110,31 @@ Advanced branch
├── docs <- A default Sphinx project; see sphinx-doc.org for details
├── models <- Trained and serialized models, model predictions, or model summaries
├── models <- Trained and serialized models, model predictions, or model
│ summaries
├── notebooks <- Jupyter notebooks. Naming convention is a number (for ordering),
│ the creator's initials (if many user development),
│ and a short `_` delimited description, e.g.
│ `1.0-jqp-initial_data_exploration.ipynb`.
├── notebooks <- Jupyter notebooks. Naming convention is a number
│ (for ordering), the creator's initials (if many
│ user development), and a short `_` delimited
│ description.
│ e.g.`1.0-jqp-initial_data_exploration.ipynb`.
├── references <- Data dictionaries, manuals, and all other explanatory materials.
├── references <- Data dictionaries, manuals, and all other explanatory
│ materials.
├── reports <- Generated analysis as HTML, PDF, LaTeX, etc.
│ └── figures <- Generated graphics and figures to be used in reporting
├── requirements.txt <- The requirements file for reproducing the analysis environment, e.g.
│ generated with `pip freeze > requirements.txt`
├── requirements.txt <- The requirements file for reproducing the analysis
│ environment, (`pip freeze > requirements.txt`)
├── test-requirements.txt <- The requirements file for the test environment
├── setup.py <- makes project pip installable (pip install -e .) so {{cookiecutter.repo_name}} can be imported
├── setup.py <- makes project pip installable (pip install -e .) so
│ {{cookiecutter.repo_name}} can be imported
├── {{cookiecutter.repo_name}} <- Source code for use in this project.
│ │
│ ├── __init__.py <- Makes {{cookiecutter.repo_name}} a Python module
│ │
│ ├── dataset <- Scripts to download or generate data
@@ -140,16 +146,16 @@ Advanced branch
│ ├── models <- Scripts to train models and make predictions
│ │ └── deep_api.py <- Main script for the integration with DEEP API
│ │
│ ├── tests <- Scripts to perfrom code testing
│ ├── tests <- Scripts to perform code testing
│ │
│ └── visualization <- Scripts to create exploratory and results oriented visualizations
│ └── visualize.py
│ └── visualization <- Scripts to create exploratory and results oriented
│ └── visualize.py visualizations
└── tox.ini <- tox file with settings for running tox; see tox.testrun.org
DEEP-OC-<your_project>
######################
├─ Dockerfile <- Describes main steps on integrationg DEEPaaS API and
├─ Dockerfile <- Describes main steps on integrating DEEPaaS API and
│ <your_project> application in one Docker image
├─ Jenkinsfile <- Describes basic Jenkins CI/CD pipeline
@@ -158,6 +164,7 @@ Advanced branch
├─ README.md <- README for developers and users.
├─ docker-compose.yml <- Allows running the application with various configurations via docker-compose
├─ docker-compose.yml <- Allows running the application with various configurations
│ via docker-compose
└─ metadata.json <- Defines information propagated to the [DEEP Open Catalog](https://marketplace.deep-hybrid-datacloud.eu)
└─ metadata.json <- Defines information propagated to the DEEP Marketplace
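
The generated ``docker-compose.yml`` noted above can be used to run this image; a minimal sketch (the actual service definitions come from the generated file):

.. code-block:: console

   cd ~/DEEP-OC-your_project
   docker-compose up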
