

Nauta Diagram

The Nauta software provides a multi-user, distributed computing environment for running deep learning model training experiments on Intel® Xeon® Scalable processor-based systems. Results of experiments can be viewed and monitored using a command line interface, a web UI, and/or TensorBoard*. You can use existing data sets, use your own data, or download data from online sources, and create public or private folders to make collaboration among teams easier.

Nauta runs on the industry-leading Kubernetes* and Docker* platforms for scalability and ease of management. Template packs for various deep learning frameworks and tooling are available (and customizable) on the platform; they take the complexity out of creating and running single- and multi-node deep learning training experiments, without the systems overhead and scripting required in standard container environments.
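As a rough illustration of the template-pack workflow, submitting and monitoring a multi-node training experiment from Nauta's `nctl` command line might look like the sketch below. The template name, experiment name, and script are assumptions for the example; consult the Getting Started guide for the exact command syntax and the list of bundled templates.

```shell
# Submit a distributed training experiment using a multi-node template pack
# (template name and flags are illustrative, not authoritative):
nctl experiment submit --template multinode-tf-training-tfjob \
    --name my-experiment mnist_train.py

# Monitor progress from the command line:
nctl experiment list
nctl experiment logs my-experiment

# View results in TensorBoard:
nctl launch tensorboard my-experiment
```

The template pack supplies the Kubernetes pod specifications, so the same training script can be moved between single-node and multi-node runs by switching templates rather than rewriting cluster configuration.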

To test your model, Nauta also supports both batch and streaming inference, all in a single platform.
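The two inference modes above might be exercised from the command line roughly as follows. The subcommand names, flags, and paths here are assumptions for the sake of the sketch; the user documentation is the authoritative reference.

```shell
# Batch inference: run predictions over a stored data set in one job
# (flags and model path are illustrative):
nctl predict batch --name my-batch-prediction \
    --model-location /mnt/output/home/my-model

# Streaming inference: launch a long-running prediction instance,
# then send individual requests to it as data arrives:
nctl predict launch --name my-stream \
    --model-location /mnt/output/home/my-model
nctl predict stream --name my-stream --data sample.json
```

Batch mode suits offline scoring of large data sets, while the streaming instance stays up to serve low-latency, request-at-a-time predictions.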

To build the Nauta installation package and run it smoothly on Google Cloud Platform, please follow our Nauta on Google Cloud Platform - Getting Started guide. More details on building Nauta artifacts can be found in the How to Build guide.

To get things up and running quickly, please take a look at our Getting Started guide.

For more in-depth information please refer to the following documents:


By contributing to the project software, you agree that your contributions will be licensed under the Apache 2.0 license included in the LICENSE file in the root directory of this source tree. The user materials are licensed under CC-BY-ND 4.0.


Submit a GitHub issue to ask a question, submit a request, or report a bug.
