
WARNING: [TRT]: Unknown embedded device detected. #420

Open
whittenator opened this issue Aug 7, 2023 · 4 comments

Comments


whittenator commented Aug 7, 2023

Specs:
Nvidia Jetson Orin 64GB
Deepstream Version: 6.2
Deepstream-Yolo: Latest

When I convert a YoloV7, YoloV8 or YOLO-NAS ONNX file to an engine file using the Nvidia Jetson Orin, I get the following warning constantly being output:

WARNING: [TRT]: Unknown embedded device detected. Using 59655MiB as the allocation cap for memory on embedded devices.
WARNING: [TRT]: Unknown embedded device detected. Using 59655MiB as the allocation cap for memory on embedded devices.
WARNING: [TRT]: Unknown embedded device detected. Using 59655MiB as the allocation cap for memory on embedded devices.
WARNING: [TRT]: Unknown embedded device detected. Using 59655MiB as the allocation cap for memory on embedded devices.
WARNING: [TRT]: Unknown embedded device detected. Using 59655MiB as the allocation cap for memory on embedded devices.

This is printed to the terminal continuously until the .engine file is created. Is this anything to be concerned about?

At the end I get the following messages:

WARNING: [TRT]: TensorRT encountered issues when converting weights between types and that could affect accuracy.
WARNING: [TRT]: If this is not the desired behavior, please modify the weights or retrain with regularization to adjust the magnitude of the weights.
WARNING: [TRT]: Check verbose logs for the list of affected weights.
WARNING: [TRT]: - 72 weights are affected by this issue: Detected subnormal FP16 values.
WARNING: [TRT]: - 11 weights are affected by this issue: Detected values less than smallest positive FP16 subnormal value and converted them to the FP16 minimum subnormalized value.
Building complete
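For context, a "subnormal FP16" value is one whose magnitude falls below the smallest normal FP16 number (2^-14, about 6.1e-5); such weights lose precision when the network is converted to FP16. This is an illustrative numpy sketch of the kind of check TensorRT is reporting on, not part of Deepstream-Yolo or the TensorRT API:

```python
import numpy as np

# Smallest positive *normal* FP16 value: 2**-14 ≈ 6.10e-5.
# Anything nonzero below this becomes subnormal (or underflows) in FP16.
FP16_MIN_NORMAL = float(2 ** -14)

def count_subnormal_fp16(weights: np.ndarray) -> int:
    """Count weights whose magnitude lands in the FP16 subnormal range."""
    w = np.abs(weights.astype(np.float32))
    return int(np.sum((w > 0) & (w < FP16_MIN_NORMAL)))

weights = np.array([1.0, 3e-5, 1e-9, 0.5], dtype=np.float32)
print(count_subnormal_fp16(weights))  # 3e-5 and 1e-9 are below the normal range → 2
```

If only a handful of weights are affected (72 and 11 here), the accuracy impact is usually negligible, which is consistent with the maintainer's reply below.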

Any concerns here?

Thanks

@marcoslucianops (Owner)

> When I convert a YoloV7, YoloV8 or YOLO-NAS ONNX file to an engine file using the Nvidia Jetson Orin, I get the following warning constantly being output:

This is probably an issue on NVIDIA's side with the JetPack components (mainly TensorRT for the Orin series).

> At the end I get the following messages:

These are normal warnings; you can safely ignore them.


Crear12 commented Sep 17, 2023

How long did it take your Orin to generate the .engine file? I get exactly the same behavior, and it's been printing forever.

@whittenator (Author)

> How long did it take for your Orin to generate the .engine file? I got exactly same behavior.....It's been printing forever....

I am using the Jetson Orin 64GB and it took about 15–20 minutes. If you are using something smaller (e.g. a Nano or the 32GB model), it could take an hour or so.

@marcoslucianops (Owner)

It takes a very long time; it's a TensorRT issue.
