The NVIDIA TensorRT™ for RTX plugin is a runtime for Unreal Engine's Neural Network Engine (NNE).
This repository contains UE5_NNETRT_Sample, a sample Unreal Engine project that demonstrates NNE running on the TensorRT for RTX runtime.
The sample project is configured to use the following Unreal Engine plugins:
- NNERuntimeTRT
- NNERuntimeRDG
- NeuralRendering
To use this sample project, ensure the following are available:
- Unreal Engine 5.7 with support for the required plugins
- NVIDIA TensorRT for RTX plugin for Unreal Engine NNE
- An NVIDIA RTX GPU
- A Windows system with DirectX 12 support
- Clone the repository
- Get UE5.7 source.
- Add TensorRT runtime support to the Neural Post-processing as follows:
- Modify `neuralprofile.h`/`.cpp` under the folder `Engine\Source\Runtime\Engine\Classes\Engine\`:
  - In the header file, add `UNNERuntimeRDGTensorRT UMETA(DisplayName = "NNERuntimeTRT"),` to `ENeuralProfileRuntimeType`. The full enum class should look like this:

    ```cpp
    UENUM(BlueprintType)
    enum class ENeuralProfileRuntimeType : uint8
    {
        NNERuntimeORTDml UMETA(DisplayName = "NNERuntimeORTDml"),
        /** Does not have full operator support */
        NNERuntimeRDGHlsl UMETA(DisplayName = "NNERuntimeRDGHlsl"),
        UNNERuntimeRDGTensorRT UMETA(DisplayName = "NNERuntimeTRT"),
        MAX UMETA(Hidden)
    };
    ```
  - In the CPP file, add `, TEXT("NNERuntimeTRT")` to `kRuntimeNames`. The full `kRuntimeNames` array should be:

    ```cpp
    static const TCHAR* const kRuntimeNames[] = {
        TEXT("NNERuntimeORTDml"),
        TEXT("NNERuntimeRDGHlsl"),
        TEXT("NNERuntimeTRT")
    };
    ```
- Compile the engine.
- Launch the project in the compiled Unreal Editor.
- Open the sample level `LVL_PPVStyleTest` included with the project content.
- Play in Standalone.
- Press `S` to show performance stats.
- Press `T` to toggle between DirectML and TensorRT for RTX.
This project is licensed under the MIT License. See LICENSE.txt for details.
The Candy style transfer model is not owned or developed by NVIDIA. See the link to the non-NVIDIA model card. License: BSD-3-Clause.