
TensorRT5.0 Test Integration

Project TensorRT_test is a collection of TensorRT library examples integrated into a Windows Visual Studio 2017 solution, which lets machine-learning models run fast at the inference stage.

You can find more information about TensorRT in the TensorRT Developer Guide.

Samples written by myself (not official NVIDIA TensorRT samples):

  • sampleLoadEngineStream: deserializes an engine stream from engineStream.bin, located in the {SolutionDir}/data/mnist/ folder.
  • sampleResNetv2: converts a ResNet v2 .pb file to a .uff file and runs inference.
  • sampleDetection: a defect-detection demo that works around TensorFlow's BatchNormalization operator. TensorRT does not support BN's Switch and Merge nodes, so I take the pb graph, remove the Switch- and Merge-related nodes, merge the related nodes back into the graph, and convert the result to a .uff file that the TensorRT UFF parser can load. Ten defect images are used for inference, so the timing below is the time to infer 10 images.
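The engine-stream loading in sampleLoadEngineStream amounts to reading the serialized bytes into memory and handing them to the TensorRT runtime. The file-reading half is plain C++ and is sketched below (`readEngineStream` is a hypothetical helper name, not the sample's actual function); the deserialization call itself needs the TensorRT SDK and is shown only in a comment:

```cpp
#include <fstream>
#include <string>
#include <vector>

// Read a serialized TensorRT engine (e.g. data/mnist/engineStream.bin)
// into a byte buffer. Returns an empty vector on failure.
std::vector<char> readEngineStream(const std::string& path)
{
    std::ifstream file(path, std::ios::binary | std::ios::ate);
    if (!file)
        return {};
    std::streamsize size = file.tellg();   // opened at end, so tellg() is the file size
    file.seekg(0, std::ios::beg);
    std::vector<char> buffer(static_cast<size_t>(size));
    if (!file.read(buffer.data(), size))
        return {};
    return buffer;
}

// With the TensorRT 5 SDK available, the buffer would then be deserialized:
//   nvinfer1::IRuntime* runtime = nvinfer1::createInferRuntime(gLogger);
//   nvinfer1::ICudaEngine* engine =
//       runtime->deserializeCudaEngine(buffer.data(), buffer.size(), nullptr);
```

Deserializing a pre-built engine like this skips the expensive parse/build stages entirely, which is why the sample stores engineStream.bin instead of rebuilding from the model each run.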

sampleDetection timing:

| TensorFlow (Python), Titan 12 GB | TensorRT (C++), Quadro 4 GB | Conclusion |
| --- | --- | --- |
| pure run time: 1344.3049 ms | pure execution time: 44.5 ms | ~30x faster |
| load data and related tensor nodes: 3473 ms | load data and execute: 171.373 ms | ~20x faster |
| GPU memory: 2 GB | --- | --- |

Table of Contents

Prerequisites

Getting the code

You can clone the project with git:

git clone git@github.com:Milittle/TensorRT_test.git

Project Structure

The following is the integrated project's structure. You can download data and 3rdparty from:

Google Drive: data and 3rdparty download link

Once you have downloaded data and 3rdparty, you can open the TensorRT_test.sln file and run the samples from Visual Studio 2017.

Good luck to you.

TensorRT_test:
|	3rdparty
└---|	TensorRT-5.0.1.3
|	└-------------------
|	common
└---|	windows
|	|	argsParser.h
|	|	BatchStream.h
|	|	buffers.h
|	|	common.h
|	|	dumpTFWts.py
|	|	half.h
|	|	sampleConfig.h
|	└-------------------
|	data
└---|	char-rnn
|	|	example_gif
|	|	faster-rcnn
|	|	googlenet
|	|	mlp
|	|	mnist
|	|	movielens
|	|	nmt
|	|	ssd
|	└-------------------
|	src
└---|	sampleCharRNN
|	|	sampleFasterRCNN
|	|	sampleGoogleNet
|	|	sampleINT8
|	|	sampleMLP
|	|	sampleMNIST
|	|	sampleMNISTAPI
|	|	sampleMovieLens
|	|	sampleNMT
|	|	samplePlugin
|	|	sampleUffMNIST
|	|	sampleUffSSD
|	└--------------------
|	.gitignore
└------------------------
|	README.md
└------------------------
|	TensorRT_test.sln
└------------------------

Run the Examples using VS

sampleUffMNIST

Demo

sampleUffSSD

This example takes a long time to load the model and build the engine, so be patient.

step1: Begin parsing model... End parsing model...

step2: Begin building engine... End building engine...

step3: Begin inference.
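These three steps mirror the standard TensorRT 5 UFF workflow. Below is a minimal sketch, assuming the TensorRT 5 SDK and the `gLogger` defined in the bundled common.h; the registered tensor names and dimensions are placeholders for illustration, not the actual sampleUffSSD values:

```cpp
#include "NvInfer.h"
#include "NvUffParser.h"

using namespace nvinfer1;
using namespace nvuffparser;

// Assumes a gLogger implementing nvinfer1::ILogger, as in common.h.
ICudaEngine* buildEngineFromUff(const char* uffFile)
{
    IBuilder* builder = createInferBuilder(gLogger);
    INetworkDefinition* network = builder->createNetwork();

    // Step 1: parse the UFF model into the network definition.
    IUffParser* parser = createUffParser();
    parser->registerInput("Input", DimsCHW(3, 300, 300), UffInputOrder::kNCHW);  // placeholder name/dims
    parser->registerOutput("MarkOutput_0");                                      // placeholder name
    if (!parser->parse(uffFile, *network, DataType::kFLOAT))
        return nullptr;

    // Step 2: build the engine -- this is the slow part the README warns about.
    builder->setMaxBatchSize(1);
    builder->setMaxWorkspaceSize(1 << 30);
    ICudaEngine* engine = builder->buildCudaEngine(*network);

    parser->destroy();
    network->destroy();
    builder->destroy();
    return engine;
}

// Step 3: inference runs through an execution context:
//   IExecutionContext* context = engine->createExecutionContext();
//   context->execute(batchSize, buffers);  // buffers: device pointers for inputs/outputs
```

The built engine can also be serialized once (engine->serialize()) and reloaded later, as sampleLoadEngineStream does, to avoid repeating steps 1 and 2.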

sampleMNIST

sampleMNISTAPI

sampleSSD

This example has an error: I cannot get the model to parse from its prototxt file.

samplePlugin

sampleCharRNN

sampleFasterRCNN

sampleGoogleNet

sampleINT8

Note: my GPU does not support FP16 or INT8, so:

sampleMLP

sampleMovieLens

sampleNMT

Contact / Getting Help

Email: mizeshuang@gmail.com

QQ: 329804334

Author: Milittle

About

TensorRT samples integrated on the Windows platform with Visual Studio 2017, built against the TensorRT 5.0 release.
