# Tensorflow-On-Deepstream-With-Triton-Server


NVIDIA blog post: https://developer.nvidia.com/blog/deploying-models-from-tensorflow-model-zoo-using-deepstream-and-triton-inference-server/

Forum discussion: https://forums.developer.nvidia.com/t/deploying-models-from-tensorflow-model-zoo-using-nvidia-deepstream-and-nvidia-triton-inference-server/155682

```shell
# Download the files
./001-download.sh
```


```shell
# Run the Docker container
./002-run-docker.sh
```
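The contents of `002-run-docker.sh` are not reproduced here; below is a minimal sketch of the kind of launch command such a script might issue. The image tag, mount point, and flags are assumptions — check NGC for the DeepStream Triton image matching your driver. The command is assembled into the positional parameters and echoed rather than executed, so the sketch runs even without Docker installed.

```shell
# Hypothetical sketch of a DeepStream-Triton container launch. The image tag is
# an assumption; the DeepStream Triton images bundle the Triton server libraries,
# so no separate Triton container is needed.
DS_IMAGE="nvcr.io/nvidia/deepstream:5.0-20.07-triton"

# --gpus all exposes the host GPUs; the bind mount shares this repo with the
# container. The command is only echoed here; replace the echo with "$@" (or
# run the line directly) to actually launch it.
set -- docker run --gpus all -it --rm \
  -v "$PWD:/workspace" -w /workspace "$DS_IMAGE"
echo "$@"
```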


```shell
# Prepare the environment inside the container (copy files, etc.)
# Note: run inside the container
./003-prepare.sh
```
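`003-prepare.sh` itself is not shown here, but the structure it needs to produce is fixed by Triton: a model repository laid out as `<repo>/<model-name>/<version>/model.savedmodel` for TensorFlow SavedModels. The sketch below illustrates that layout; the repository path and model name are assumptions, not the script's actual values.

```shell
# Triton expects <repo>/<model-name>/<version>/model.savedmodel for a
# TensorFlow SavedModel. The path and model name below are assumptions.
REPO=/tmp/trtis_model_repo
mkdir -p "$REPO/ssd_mobilenet_v2/1"
# The SavedModel exported from the TensorFlow model zoo goes into the version
# directory (source path here is hypothetical):
# cp -r ssd_mobilenet_v2/saved_model "$REPO/ssd_mobilenet_v2/1/model.savedmodel"
ls "$REPO"
```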


```shell
# Run the example
# Note: run inside the container
./004-run.sh
```
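On the DeepStream side, the pipeline reaches Triton through the Gst-nvinferserver plugin, which is configured with a protobuf text file. The fragment below is a minimal sketch of such a config, assuming a detector named `ssd_mobilenet_v2` in a model repository at `/workspace/trtis_model_repo`; the model name, paths, and field values are assumptions, not the repository's actual configuration (and the `trt_is` block reflects the DeepStream 5.x schema).

```
infer_config {
  unique_id: 5
  gpu_ids: [0]
  max_batch_size: 1
  backend {
    trt_is {
      model_name: "ssd_mobilenet_v2"         # assumed model name
      version: -1                            # -1 selects the latest version
      model_repo {
        root: "/workspace/trtis_model_repo"  # assumed repository path
        log_level: 2
      }
    }
  }
  preprocess {
    network_format: IMAGE_FORMAT_RGB
    tensor_order: TENSOR_ORDER_NONE
    normalize {
      scale_factor: 1.0
    }
  }
}
```

With CUDA shared memory between DeepStream and the in-process Triton instance, frames are handed to the model without a network round trip, which is the main draw of this deployment style.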


```shell
# Exit the container
exit
```

Run a TensorFlow model in DeepStream using the Triton Inference Server.
