
Tensor-RT LLM RTX2080s - Incompatible #2952

Closed
1AntonioOrlo1 opened this issue May 27, 2024 · 1 comment
Labels: needs info (Not enough info, more logs/data required), type: bug (Something isn't working)

Comments

@1AntonioOrlo1

Turing can work with TensorRT-LLM, but it is marked as incompatible and I can't install the inference engine.

@1AntonioOrlo1 1AntonioOrlo1 added the type: bug Something isn't working label May 27, 2024
@Van-QA Van-QA added the needs info Not enough info, more logs/data required label May 28, 2024
@CameronNg

@1AntonioOrlo1 We're sorry to inform you that we currently have no plans to support Turing for TensorRT-LLM in the Jan app.

@Van-QA Van-QA closed this as not planned Aug 5, 2024
3 participants