It won't let me install TensorRT #3946
Comments
I'd be glad to assist you with installing TensorRT, even without the specific error message from #3946. Here's a comprehensive guide that incorporates insights from previous responses and addresses potential issues:

**Prerequisites**
- NVIDIA GPU: TensorRT requires an NVIDIA GPU with compute capability 3.0 or higher. You can check your GPU model with `nvidia-smi` in the terminal (Linux) or the NVIDIA Control Panel (Windows).

**Choose an installation method**
- Package manager (recommended): on Linux distributions such as Ubuntu, Debian, or CentOS, you can often install TensorRT through the distribution's package manager.

**Verify the installation**
- Open a Python interpreter and run `import tensorrt`. If there is no error, TensorRT is installed correctly.

**Troubleshooting**
- Incorrect CUDA or driver version: make sure they are compatible with your TensorRT release. Refer to the TensorRT documentation for supported versions.
- If you encounter specific errors, search online for solutions related to that error code.

**Provide more details**
- Share more information about your environment (OS, GPU model, CUDA version, etc.) and the exact error message you're encountering. This will help in providing more tailored assistance.
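The verification step above (`import tensorrt` in a Python interpreter) can be sketched as a small generic helper. The helper itself (`check_module`) is a hypothetical convenience function, not part of TensorRT; only the module name `tensorrt` comes from the comment above:

```python
import importlib


def check_module(name):
    """Try to import a module; return its version string, or None if it is missing."""
    try:
        mod = importlib.import_module(name)
    except ImportError:
        # Covers both a missing package and broken native dependencies
        # surfaced as ModuleNotFoundError (a subclass of ImportError).
        return None
    return getattr(mod, "__version__", "unknown")


# Verify the TensorRT Python bindings are importable:
version = check_module("tensorrt")
if version is None:
    print("tensorrt is not installed or not on this interpreter's path")
else:
    print(f"tensorrt {version} found")
```

If the import fails even after installation, the usual culprits are a mismatched CUDA/driver version or installing the wheel into a different Python environment than the one you are running.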
For the TensorRT installation docs, see https://docs.nvidia.com/deeplearning/tensorrt/install-guide/index.html#installing
Description
Environment
TensorRT Version:
NVIDIA GPU:
NVIDIA Driver Version:
CUDA Version:
CUDNN Version:
Operating System:
Python Version (if applicable):
Tensorflow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (if so, version):
Relevant Files
Model link:
Steps To Reproduce
Commands or scripts:
Have you tried the latest release?:
Can this model run on other frameworks? For example, run the ONNX model with ONNXRuntime:
polygraphy run <model.onnx> --onnxrt