Meta-repository for Backend.AI

Backend.AI is a streamlined backend API server that hosts heterogeneous programming languages and popular AI frameworks. It manages the underlying computing resources for multi-tenant computation sessions, which are spawned and executed instantly on demand.

Server-side Components

If you want to run a Backend.AI cluster on your own, you need to install and configure the following server-side components. All server-side components are licensed under LGPLv3 to promote non-proprietary open innovation in the open-source community.

There is no obligation to open your service/system codes if you just run the server-side components as-is (e.g., just run as daemons or import the components without modification in your codes). Please contact us (contact-at-lablup-com) for commercial consulting and more licensing details/options about individual use-cases.

For details about server installation and configuration, please visit our documentation.

Manager with API Gateway

It routes external API requests from front-end services to individual agents. It also monitors and scales the cluster of multiple agents (a few tens to hundreds).
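The scheduling decision behind this routing can be sketched as follows. This is an illustrative toy policy (pick the agent with the most free computation slots), not the manager's actual implementation; all names are assumptions.

```python
# Hypothetical sketch of the manager's agent-selection step: choose the
# agent with the largest free capacity.  The policy and data shapes are
# illustrative assumptions, not the real scheduler.

def select_agent(agents):
    """Return the id of the agent with the most free slots.

    `agents` maps agent_id -> {"total_slots": int, "used_slots": int}.
    Returns None when no agent has a free slot.
    """
    best_id, best_free = None, 0
    for agent_id, info in agents.items():
        free = info["total_slots"] - info["used_slots"]
        if free > best_free:
            best_id, best_free = agent_id, free
    return best_id
```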


Agent

It manages individual server instances and launches/destroys Docker containers where REPL daemons (kernels) run. Each agent on a new EC2 instance self-registers itself to the instance registry via heartbeats.
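Heartbeat-based self-registration can be sketched in a few lines: an agent is registered by its first heartbeat and considered alive while its latest heartbeat is recent enough. The class and timeout below are illustrative assumptions, not the actual wire protocol.

```python
# Toy model of heartbeat-driven registration: registering and keeping
# alive are the same operation, and liveness is judged by the age of
# the last heartbeat.  Names and timeout are illustrative only.
import time


class InstanceRegistry:
    def __init__(self, timeout=5.0):
        self.timeout = timeout       # seconds before an agent is presumed dead
        self._last_seen = {}         # agent_id -> last heartbeat timestamp

    def heartbeat(self, agent_id, now=None):
        # First heartbeat registers the agent; later ones refresh it.
        self._last_seen[agent_id] = time.monotonic() if now is None else now

    def alive_agents(self, now=None):
        now = time.monotonic() if now is None else now
        return sorted(a for a, t in self._last_seen.items()
                      if now - t <= self.timeout)
```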

Server-side common plugins (for both manager and agents)


Kernels

A set of small ZeroMQ-based REPL daemons in various programming languages and configurations.
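The core job of such a REPL daemon is to execute a received code snippet and ship back its output. A minimal sketch of that execute-and-capture step, with the ZeroMQ transport omitted (all names here are illustrative, not the actual kernel code):

```python
# Minimal sketch of a REPL kernel's per-request work: run a code
# snippet in a persistent namespace and return its captured stdout.
# The real kernels receive requests over ZeroMQ; that part is omitted.
import contextlib
import io


def execute(code, namespace=None):
    """Run `code` in `namespace`, returning whatever it printed."""
    namespace = {} if namespace is None else namespace
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        exec(code, namespace)
    return buf.getvalue()
```

Keeping one namespace across calls is what makes it a REPL rather than a one-shot runner.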


Jail

A programmable sandbox implemented using ptrace-based system call filtering, written in Go.
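Conceptually, the sandbox intercepts each system call and consults a policy before letting it through. A toy allowlist check illustrating that decision (in Python for brevity; the actual jail is a Go program hooking syscalls via ptrace, and this allowlist is invented):

```python
# Toy syscall policy check: the traced process may only perform
# syscalls on the allowlist.  The set below is illustrative, not the
# jail's real policy.
ALLOWED_SYSCALLS = {"read", "write", "mmap", "exit_group"}


def check_syscall(name, allowed=ALLOWED_SYSCALLS):
    """Return True if the sandboxed process may perform `name`."""
    return name in allowed
```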


Hook

A set of libc overrides for resource control and web-based interactive stdin (paired with agents).
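The interactive-stdin part works by redirecting the program's blocking reads on stdin so that lines typed in the web front-end reach it. A rough Python analogue of that redirection (illustrative only; the real hook overrides libc functions via preloading):

```python
# Rough analogue of the web-stdin hook: input lines delivered by the
# front-end are queued, and the program's reads drain the queue.
# The class and method names are illustrative assumptions.
from collections import deque


class WebStdin:
    def __init__(self):
        self._pending = deque()

    def feed(self, line):
        # Called when the front-end delivers a line of user input.
        self._pending.append(line)

    def readline(self):
        # Stand-in for the overridden blocking read on fd 0.
        return self._pending.popleft() if self._pending else ""
```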


Common

A collection of utility modules commonly shared throughout Backend.AI projects.

Client-side Components

Client SDK Libraries

We offer client SDKs in popular programming languages. These SDKs are freely available under the MIT License to ease integration with both commercial and non-commercial software products and services.


Media

The front-end support libraries to handle multi-media outputs (e.g., SVG plots, animated vector graphics).

  • The Python package (lablup) is installed inside kernel containers.
  • To interpret and display media generated by the Python package, you need to load the JavaScript part in the front-end.

Integrations with IDEs and Editors
