cookiecutter template doc updated
(commit 5d75cf5 by valentin.kozlov, Jan 23, 2020; 2 changed files)

source/user/overview/cookiecutter-template.rst
==============================================
In order to create your project based on the template, one has to install cookiecutter and run::

   $ cookiecutter https://github.com/indigo-dc/cookiecutter-data-science

For each parameter you are first shown an [Info] line describing it, and on the next line you configure it. You will be asked to configure:

* Remote URL to host your new repositories (git), e.g. https://github.com/deephdc, ``git_base_url``
* Project name, ``project_name``
* Name of your new repository, to be appended to ``git_base_url`` (see above), ``repo_name``
* Author name(s) (and/or your organization/company/team); if several, separate them with commas, ``author_name``
* E-mail(s) of the main author(s) (or contact person); if several, separate them with commas, ``author_email``
* Short description of the project, ``description``
* Application version, in X.Y.Z (Major.Minor.Patch) format, ``app_version``
* Open source license; the default is MIT (for more info see https://opensource.org/licenses), ``open_source_license``
* User account at hub.docker.com, e.g. 'deephdc' in https://hub.docker.com/u/deephdc, ``dockerhub_user``
* Docker image your Dockerfile starts from (``FROM <docker_baseimage>``; do not include the tag here), e.g. tensorflow/tensorflow, ``docker_baseimage``
* CPU tag for the base image, e.g. 1.14.0-py3 (has to match python3!), ``baseimage_cpu_tag``
* GPU tag for the base image, e.g. 1.14.0-gpu-py3 (has to match python3!), ``baseimage_gpu_tag``

.. note:: These parameters are defined in ``cookiecutter.json`` in the `cookiecutter-data-science <https://github.com/indigo-dc/cookiecutter-data-science>`_ source.
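For orientation, the prompts above map to keys in ``cookiecutter.json`` roughly as follows. This is an abridged, hypothetical sketch with made-up default values, not the actual file contents; see the linked source for the real file:

```json
{
  "git_base_url": "https://github.com/deephdc",
  "project_name": "DEEP project",
  "repo_name": "deep_project",
  "author_name": "Author Name",
  "author_email": "author@example.com",
  "description": "Short description of the project",
  "app_version": "0.1.0",
  "open_source_license": ["MIT", "BSD-3-Clause", "GNU GPLv3", "Not open source"],
  "dockerhub_user": "deephdc",
  "docker_baseimage": "tensorflow/tensorflow",
  "baseimage_cpu_tag": "1.14.0-py3",
  "baseimage_gpu_tag": "1.14.0-gpu-py3"
}
```

Keys with a list value would be presented by cookiecutter as a multiple-choice prompt, with the first entry as the default.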

Once these questions are answered, the following two repositories will be created locally and immediately linked to your github.com account::

   ~/DEEP-OC-user_project
   ~/user_project

Each repository has two branches: 'master' and 'test'.

<user_project> repo
-------------------

Main repository for the integration of your model, with the following structure::

   ├── data                   Placeholder for the data
   │   └── raw                The original, immutable data dump.
   ├── docker                 Directory for development Dockerfile(s)
   ├── docs                   Documentation on the project; see sphinx-doc.org for details
   ├── models                 Trained and serialized models, model predictions, or model summaries
   ├── notebooks              Jupyter notebooks. Naming convention is a number (for ordering),
   │                          the creator's initials (if many users develop),
   │                          and a short `_` delimited description,
   │                          e.g. `1.0-jqp-initial_data_exploration.ipynb`.
   ├── references             Data dictionaries, manuals, and all other explanatory materials.
   ├── reports                Generated analysis as HTML, PDF, LaTeX, etc.
   ├── your_project           Main source code of the project
   │   ├── __init__.py        Makes your_project a Python module
   │   ├── dataset            Scripts to download and manipulate raw data
   │   │   └── make_dataset.py
   │   ├── features           Scripts to prepare raw data into features for modeling
   │   │   └── build_features.py
   │   ├── models             Scripts to train models and then use trained models to make predictions
   │   │   └── deep_api.py    Main script for the integration with DEEP API
   │   ├── tests              Scripts to perform code testing
   │   └── visualization      Scripts to create exploratory and results oriented visualizations
   │       └── visualize.py
   ├── .dockerignore          Describes what files and directories to exclude when building a Docker image
   ├── .gitignore             Specifies intentionally untracked files that Git should ignore
   ├── Jenkinsfile            Describes basic Jenkins CI/CD pipeline
   ├── LICENSE                License file
   ├── README.md              The top-level README for developers using this project.
   ├── requirements.txt       The requirements file for reproducing the analysis environment,
   │                          e.g. generated with `pip freeze > requirements.txt`
   ├── setup.cfg              Makes project pip installable (pip install -e .)
   ├── setup.py               Makes project pip installable (pip install -e .)
   ├── test-requirements.txt  The requirements file for the test environment
   └── tox.ini                Tox file with settings for running tox; see tox.testrun.org
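The ``deep_api.py`` file listed above is the entry point that wires your model to the DEEP API. As a purely schematic sketch of what such a module contains — the real entry points and their signatures are dictated by the ``deepaas`` package, and the function names and return fields below are illustrative assumptions only:

```python
# Hypothetical, simplified sketch of a deep_api.py entry point.
# The actual interface is defined by the deepaas package; names and
# fields here are illustrative assumptions, not the template's code.

def get_metadata():
    """Return basic information about the model."""
    return {
        "name": "your_project",
        "version": "0.1.0",
        "author": "Author Name",
        "description": "Short description of the project",
    }

def predict(**kwargs):
    """Run inference on the supplied arguments (placeholder logic)."""
    # A real implementation would load a serialized model from the
    # models/ directory and return its predictions here.
    return {"status": "ok", "predictions": [], "arguments": kwargs}

def train(**kwargs):
    """Train the model (placeholder logic)."""
    return {"status": "not implemented"}
```

In a real module these functions would be registered with deepaas (e.g. through the project's ``setup.cfg`` entry points) so that the DEEPaaS server can expose them over HTTP.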


Certain files, e.g. ``README.md``, ``Jenkinsfile``, ``setup.cfg``, ``tox.ini``, etc. are pre-populated
based on the answers you provided during the cookiecutter call (see above).


<DEEP-OC-user_project>
----------------------

Repository for the integration of the :doc:`DEEPaaS API <api>` and <user_project> in one Docker image.
::
   ├─ LICENSE             License file
   ├─ README.md           README for developers and users.
   ├─ docker-compose.yml  Allows running the application with various configurations via docker-compose
   ├─ metadata.json       Defines information propagated to the DEEP Open Catalog (https://marketplace.deep-hybrid-datacloud.eu)


All files get filled with the info provided during cookiecutter execution (see above).
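By way of illustration, a ``metadata.json`` of this kind might look as follows. This is a hypothetical, abridged sketch with placeholder values; the actual schema is defined by the DEEP Open Catalog, not by this guide:

```json
{
  "title": "user_project",
  "summary": "Short description of the project",
  "keywords": ["docker", "deep learning"],
  "license": "MIT",
  "sources": {
    "dockerfile_repo": "https://github.com/deephdc/DEEP-OC-user_project",
    "docker_registry_repo": "deephdc/deep-oc-user_project",
    "code": "https://github.com/deephdc/user_project"
  }
}
```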
Step-by-step guide
------------------
#. (if not yet done) install cookiecutter, e.g. ``pip install cookiecutter``
#. run ``cookiecutter https://github.com/indigo-dc/cookiecutter-data-science``
#. answer all the questions; pay attention to the docker tags!
#. two directories will be created: <user_project> and <DEEP-OC-user_project> (each with two git branches: master and test)
#. go to github.com/user_account and create the corresponding repositories <user_project> and <DEEP-OC-user_project>
#. in your terminal, go to <user_project> and run ``git push origin --all``
#. in your terminal, go to <DEEP-OC-user_project> and run ``git push origin --all``
#. your github repositories are now updated with the initial commits
#. you can build the <DEEP-OC-user_project> Docker image locally: go to the <DEEP-OC-user_project> directory and run ``docker build -t dockerhubuser/deep-oc-user_project .``
#. you can now run deepaas as ``docker run -p 5000:5000 dockerhubuser/deep-oc-user_project``
source/user/quickstart.rst
==========================

Run a module on DEEP Pilot Infrastructure
-----------------------------------------

* `DEEP-IAM <https://iam.deep-hybrid-datacloud.eu/>`_ registration
* To run it via web interface:
  access `Orchestrator Dashboard <https://deep-paas.cloud.ba.infn.it/>`_ with DEEP-IAM credentials
* To run it via command-line interface (CLI):

* `oidc-agent <https://github.com/indigo-dc/oidc-agent/releases>`_ installed and configured for `DEEP-IAM <https://iam.deep-hybrid-datacloud.eu/>`_ (see :doc:`rclone howto <howto/oidc-agent>`).
One can either use a general template (https://github.com/indigo-dc/tosca-templ…) …

Orchestrator Dashboard
^^^^^^^^^^^^^^^^^^^^^^
The `PaaS Orchestrator Dashboard <https://deep-paas.cloud.ba.infn.it/>`_ is an easy way to deploy an application and monitor your deployments via a web interface. You log in with DEEP-IAM credentials, select either an application-specific template or the general one, *deep-oc-mesos-webdav.yml*, fill in the web form and submit your job. For more details, please see :doc:`The Dashboard <overview/architecture>`.

.. image:: ../_static/paas-dashboard.png
   :target: https://deep-paas.cloud.ba.infn.it