TensorRTInferOp

A plugin for DALI https://github.com/NVIDIA/DALI/ that allows users to include TensorRT engines in DALI pipelines. This lets users apply the same GPU-accelerated DALI data preprocessing pipelines used during training to inference.

Compiling

To compile this library, run the command for your platform:

x86_64-linux

dazel run //plugins/dali/TensorRTInferOp:libtensorrtinferop.so

aarch64-linux

dazel run //plugins/dali/TensorRTInferOp:libtensorrtinferop.so --config=[D5L/L4T]-toolchain

aarch64-qnx

dazel run //plugins/dali/TensorRTInferOp:libtensorrtinferop.so --config=D5Q-toolchain

Usage

Op Name: TensorRTInfer

Performs inference using a TensorRT engine.

Arguments:

Required:
  • input_nodes Vec<string>: Input nodes of the engine

  • output_nodes Vec<string>: Output nodes of the engine

  • engine string: Path to the TensorRT engine file to run inference with

Optional:
  • log_severity int (nvinfer1::Severity): Logging severity for TensorRT

  • plugins Vec<string>: Plugin libraries to load

  • num_outputs int: Number of outputs

  • inference_batch_size int: Batch size to use for inference

  • use_dla_core int: DLA core to run inference on
