DeepStack

The World's Leading Cross Platform AI Engine for Edge Devices, with over 3.2 million installs on Docker Hub.


Website: https://deepstack.cc

Documentation: https://docs.deepstack.cc

Forum: https://forum.deepstack.cc

Dev Center: https://dev.deepstack.cc

DeepStack is owned and maintained by DeepQuest AI.

Introduction

DeepStack is an AI API engine that serves pre-built models and custom models on multiple edge devices, locally or on your private cloud. Supported platforms are:

  • Linux via Docker (CPU and NVIDIA GPU support)
  • macOS via Docker
  • Windows 10 (native application)
  • NVIDIA Jetson via Docker

DeepStack runs completely offline, independent of the cloud. You can also install and run DeepStack on any cloud VM with Docker installed to serve as your private, state-of-the-art, real-time AI server.
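As a sketch of that setup, the published Docker Hub image can be pulled and started as below. The image tag and the VISION-DETECTION activation variable follow the DeepStack documentation; the host port 80 and the volume name are illustrative choices:

```shell
# Pull the CPU image and start DeepStack with the object detection API enabled.
# DeepStack listens on port 5000 inside the container; here it is mapped to
# host port 80, and a named volume preserves registered data across restarts.
docker pull deepquestai/deepstack:cpu
docker run -e VISION-DETECTION=True \
    -v localstorage:/datastore \
    -p 80:5000 deepquestai/deepstack:cpu
```

Other APIs can be enabled the same way by passing their activation variables (e.g. face or scene recognition) in additional `-e` flags.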

Installation and Usage

Visit https://docs.deepstack.cc/getting-started for installation instructions. The documentation provides example code for the following programming languages, with more to be added soon.

  • Python
  • C#
  • NodeJS
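To give a flavor of the Python usage, here is a minimal client sketch. The endpoint path and the response shape (a success flag plus a predictions list carrying label, confidence and box coordinates) follow the DeepStack detection docs; the host/port and the `parse_predictions` helper are illustrative assumptions, not part of the official API:

```python
import json

def parse_predictions(response_body):
    """Extract (label, confidence, box) tuples from a DeepStack
    detection response (dict or JSON string)."""
    data = (json.loads(response_body)
            if isinstance(response_body, str) else response_body)
    if not data.get("success"):
        raise RuntimeError("DeepStack returned success=false")
    return [
        (p["label"], p["confidence"],
         (p["x_min"], p["y_min"], p["x_max"], p["y_max"]))
        for p in data["predictions"]
    ]

if __name__ == "__main__":
    # Posting an image requires the `requests` package and a running server,
    # e.g. one started with the Docker command above (host port 80 assumed):
    #   import requests
    #   with open("test.jpg", "rb") as f:
    #       resp = requests.post("http://localhost:80/v1/vision/detection",
    #                            files={"image": f})
    #   print(parse_predictions(resp.json()))
    pass
```

Keeping the parsing separate from the HTTP call makes the response handling easy to exercise without a running server.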

Build from Source (For Docker Version)

  • Install Prerequisites

  • Clone DeepStack Repo

    git clone https://github.com/johnolafenwa/DeepStack.git

  • CD to DeepStack Repo Dir

    cd DeepStack

  • Fetch Repo Files

    git lfs pull

  • Build DeepStack Server

    cd server && go build

  • Build DeepStack CPU Version

    cd .. && sudo docker build -t deepquestai/deepstack:cpu . -f Dockerfile.cpu

  • Build DeepStack GPU Version

    sudo docker build -t deepquestai/deepstack:gpu . -f Dockerfile.gpu

  • Build DeepStack Jetson Version

    sudo docker build -t deepquestai/deepstack:jetpack . -f Dockerfile.gpu-jetpack

  • Running and Testing Locally Without Building

    • Unless you wish to install the requirements system-wide, create a virtual environment with python3.7 -m venv venv and activate it with source venv/bin/activate

    • Install Requirements with pip3 install -r requirements.txt

    • For CPU Version, Install PyTorch with pip3 install torch==1.6.0+cpu torchvision==0.7.0+cpu -f https://download.pytorch.org/whl/torch_stable.html

    • For GPU Version, Install PyTorch with pip3 install torch==1.6.0+cu101 torchvision==0.7.0+cu101 -f https://download.pytorch.org/whl/torch_stable.html

    • Start PowerShell with pwsh

    • For CPU Version, Run .\setup_docker_cpu.ps1

    • For GPU Version, Run .\setup_docker_gpu.ps1

    • CD To Server Dir cd server

    • Build DeepStack Server go build

    • Enable any of the APIs you want to serve, e.g. $env:VISION_DETECTION = "True", $env:VISION_FACE = "True", $env:VISION_SCENE = "True"

    • Run DeepStack .\server

    You can find all logs in a directory in the repo root. Note that DeepStack runs on the default port, 5000.

Integrations and Community

The DeepStack ecosystem includes a number of popular integrations and libraries built to extend the AI engine to IoT, industrial, monitoring and research applications.

Contributors Guide

(coming soon)
