bug: can't swap back to llamacpp after using trtllm #2358
Comments
I was able to reproduce this, but I don't think it should block the 0.4.9 release. We can handle this next sprint.
Still experiencing this on 324. Also interesting: when I switch from trt to gguf, there is a brief "starting [the old trt model]" loading state (for about 1 second) before it gets stuck in the "starting [new gguf model]" loading state. If it's a UI glitch, we also need to fix that.
Please attach the app.log @0xSage 🙏
Root cause: the nitro file is missing due to a failed app update.
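To illustrate that failure mode, here is a minimal sketch (not Jan's actual code; the function name, resource-directory argument, and error message are assumptions) of a pre-flight check that would surface a missing nitro binary as an explicit error instead of a model that never leaves the "starting" state:

```typescript
import { existsSync } from "node:fs";
import { join } from "node:path";

// Hypothetical pre-flight check: verify the bundled nitro binary survived
// the last app update before attempting to start a model with it.
export function assertNitroBinary(resourceDir: string): string {
  const bin = join(
    resourceDir,
    process.platform === "win32" ? "nitro.exe" : "nitro"
  );
  if (!existsSync(bin)) {
    throw new Error(
      `nitro binary not found at ${bin}; the last app update may have failed. ` +
        "Reinstalling the app should restore the file."
    );
  }
  return bin;
}
```

Running a check like this before every engine start would turn the "missing file after a failed update" case into an actionable error message rather than a silent hang.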
The main issue is resolved and working fine in Jan v0.4.8-326 ✅. As for the Nitro issue, we will resolve it in this follow-up ticket: janhq/cortex.tensorrt-llm#27
@louis-jan Related to the UI glitches: after observation, we need to correct the behavior of the status message when switching between models.
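As a sketch of the sequencing that would avoid the stale "starting [old trt model]" flash, assuming a hypothetical engine interface and status setter (none of these names come from Jan's codebase): stop the outgoing model first, and only publish a "starting" status once it carries the incoming model's name.

```typescript
// All types and names below are illustrative assumptions, not Jan's API.
type Status =
  | { kind: "idle" }
  | { kind: "starting"; modelId: string }
  | { kind: "running"; modelId: string };

interface Engine {
  stop(): Promise<void>;
  start(modelId: string): Promise<void>;
}

export async function switchModel(
  engine: Engine,
  nextModelId: string,
  setStatus: (s: Status) => void
): Promise<void> {
  // Clear any status that still names the previous model before doing work.
  setStatus({ kind: "idle" });
  await engine.stop();

  // Only now announce which model is starting, so the UI never shows the
  // old model's name during the transition.
  setStatus({ kind: "starting", modelId: nextModelId });
  await engine.start(nextModelId);
  setStatus({ kind: "running", modelId: nextModelId });
}
```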
Tested and looking good as of
Describe the bug
A clear and concise description of what the bug is.
v0.4.8-322
Windows, AMD Ryzen with a 4070
Steps to reproduce
Steps to reproduce the behavior:
Additional things I've tried:
Expected behavior
A clear and concise description of what you expected to happen.
Screenshots
If applicable, add screenshots to help explain your issue.
Environment details
Logs
If the cause of the error is not clear, kindly provide your usage logs:
tail -n 50 ~/jan/logs/app.log if you are using the UI
tail -n 50 ~/jan/logs/server.log if you are using the local API server
Make sure to redact any private information.
Additional context
Add any other context or information that could be helpful in diagnosing the problem.