vos.ai - compute layer for the agi infrastructure


VOSAI - Compute

VOSAI Compute is a layer between the AGI and the actual back-end compute that executes AGI code.

The reasoning for creating this layer is to enable the AGI to be agnostic to the compute back end while rapidly adapting to the latest technologies. This allows the AGI to deliver better performance and higher-quality output to its consumers with minimal effort and cost.

This effort and cost would typically be borne by the VOSAI organization, with some cost passed through to the end consumer.

System Design

To ensure success while following a gradual iterative approach, the compute layer is implemented in phases.

High Level

This diagram illustrates the high-level view of the entire system with respect to how it interacts with the compute back end.

  • A consumer is a system that leverages the AGI to perform some task. It may use the AGI for conversations with its end users, or for any purpose relevant to its business.
  • The AGI Subsystem is the module(s) responsible for handling inbound requests from the outside world and relaying them accordingly to the AGI internal systems.
  • The Compute Subsystem is where the AGI actually resides. The AGI design and components are outside the scope of this section; they are further explained in the AGI repository.

Phase 1

This diagram illustrates Phase 1 of VOSAI Compute. This is a preliminary design and will evolve over time.

  • Compute API - a RESTful API that handles requests from the AGI subsystem.
  • Compute Router - a daemon responsible for routing each request to the appropriate compute back end.
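The split between an inbound API and a routing daemon could be sketched as follows. This is a minimal illustration of the Phase 1 flow, not the actual implementation; every class, method, and back-end name here is an assumption.

```python
from abc import ABC, abstractmethod

class ComputeBackend(ABC):
    """A compute back end the router can dispatch work to (hypothetical interface)."""

    @abstractmethod
    def execute(self, payload: dict) -> dict:
        ...

class EchoBackend(ComputeBackend):
    """Placeholder back end; real providers (GCP, Golem, ...) arrive in later phases."""

    def execute(self, payload: dict) -> dict:
        return {"backend": "echo", "result": payload}

class ComputeRouter:
    """Routes an inbound request to the appropriate registered back end."""

    def __init__(self) -> None:
        self._backends: dict[str, ComputeBackend] = {}

    def register(self, name: str, backend: ComputeBackend) -> None:
        self._backends[name] = backend

    def route(self, request: dict) -> dict:
        # Pick a back end from an explicit hint, falling back to a default.
        name = request.get("backend", "echo")
        return self._backends[name].execute(request.get("payload", {}))

router = ComputeRouter()
router.register("echo", EchoBackend())
print(router.route({"payload": {"task": "infer"}}))
```

In a real deployment the Compute API would sit in front of the router, translating HTTP requests into dictionaries like the one above; keeping the back-end choice behind `route` is what makes the AGI agnostic to where its code actually runs.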

Phase 2

This diagram illustrates Phase 2 of VOSAI Compute. The intent of this phase is to integrate with one cloud provider as the compute back end. For this back end, we have chosen GCP - Google Cloud Platform. We have found GCP to perform better than AWS and to be friendlier when it comes to leveraging machine learning algorithms.

  • GCP - the first of many cloud providers to integrate as a compute back end
  • Additional components are developed in this phase to allow plugging in new compute providers easily
  • Learn more about GCP
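The "plugging in new compute providers easily" goal might be served by a simple provider registry. The sketch below is illustrative only; the registry, decorator, and provider classes are assumptions, and the GCP/Golem entries are stubs rather than real integrations.

```python
# Hypothetical provider registry: new back ends register themselves
# under a short name, and callers dispatch by name.
PROVIDERS: dict[str, type] = {}

def provider(name: str):
    """Class decorator that registers a compute provider under a short name."""
    def wrap(cls):
        PROVIDERS[name] = cls
        return cls
    return wrap

@provider("gcp")
class GCPProvider:
    """Would wrap GCP machine-learning services in a real system (Phase 2)."""
    def submit(self, job: dict) -> str:
        return f"gcp:{job['id']}"

@provider("golem")
class GolemProvider:
    """Stub for a future decentralized back end (Phase 3)."""
    def submit(self, job: dict) -> str:
        return f"golem:{job['id']}"

def submit(job: dict, backend: str = "gcp") -> str:
    """Submit a job to the named back end via the registry."""
    return PROVIDERS[backend]().submit(job)

print(submit({"id": "train-42"}))           # dispatches to the GCP stub
print(submit({"id": "train-42"}, "golem"))  # same job, different back end
```

Adding a provider then means writing one class and one decorator line, which is the property the additional Phase 2 components are meant to provide.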

Phase 3

This diagram illustrates Phase 3 of VOSAI Compute. The intent of this phase is to integrate with Golem or a similar compute back end that heavily leverages blockchain and a crowd-sourced approach to computing. This integration is subject to change based on the progress of Golem.

  • Similar integration effort here as in Phase 2
  • Port existing applications and frameworks to work appropriately over to Golem (e.g. TensorFlow)
  • Learn more about Golem

Phase 4

This diagram illustrates Phase 4 of VOSAI Compute. The intent of this phase is to create a data center that is highly specialized for both machine learning and mining applications. Today's cloud offerings do not fully provide what is required for these two tasks.

  • ARM / Intel CPUs
  • GPU Agnostic (AMD / NVIDIA)
  • DPU Backed
  • FPGA Possibilities
  • Alternate means of power and cooling

End Phase

This diagram illustrates Phase 5 - the end phase of VOSAI Compute. The intent of this phase is to add additional cloud providers, or remove providers that are no longer useful.


There are a few unknowns when it comes to the underlying hardware for compute - namely, what new technology will become available and which direction to take when designing the systems that run on it.

Super Computers

There are various organizations spanning the globe that provide access to supercomputers. This is early in discussions, but we are considering Oak Ridge National Laboratory.

Quantum Computers

There are only a few organizations worldwide that claim to have a quantum computer. At the time of this writing, quantum computers are still in their infancy and not quite ready for the mainstream.