[Bug]: gpt-3.5-turbo-instruct not supported #411
Comments
cc: @ishaan-jaff
@StanGirard Acknowledged. I'll pick this up and get back to you with a fix in a few hours.
@StanGirard Which version of litellm are you using? gpt-3.5-turbo-instruct was only added a few days ago. Could you bump the version and let me know if that works? I suspect the model list in the older version doesn't contain the instruct model name.
As a potential improvement, we could decouple the model lists from the local package. That would make the lists easy to update and ensure everyone always has the latest version. Thoughts, @ishaan-jaff / @StanGirard?
I'm on litellm==0.1.531, so that's probably why ;) And yes, decoupling the model lists would be amazing.
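For anyone hitting the same error, a quick way to confirm whether an outdated install is the cause is to compare the installed version against a known-good one. This is a minimal sketch using only the standard library; the helper names are illustrative, and the exact minimum version isn't stated in this thread, so `known_good` is a placeholder, not an official litellm cutoff.

```python
# Hedged sketch: check whether the installed litellm predates the release
# that added gpt-3.5-turbo-instruct. "0.1.532" is a placeholder cutoff.
from importlib.metadata import PackageNotFoundError, version


def parse_version(v):
    """Turn a version string like '0.1.531' into a comparable tuple (0, 1, 531)."""
    return tuple(int(part) for part in v.split(".") if part.isdigit())


def needs_upgrade(pkg="litellm", known_good="0.1.532"):
    """Return True if pkg is not installed or is older than known_good."""
    try:
        installed = version(pkg)
    except PackageNotFoundError:
        return True
    return parse_version(installed) < parse_version(known_good)
```

Running `needs_upgrade()` on an environment pinned to 0.1.531 would return True, which matches the resolution in this thread: bumping the package pulled in the newer model list.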
It works |
Sounds good. I'll close this issue and open a new one to track the decoupling idea.
In my PR for integrating LiteLLM with Aider, I'm updating the backup on Aider startup whenever the backup is older than 12 hours. If the backup is newer than 12 hours, I set the
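The 12-hour staleness policy described in that comment can be sketched as follows. This is a minimal illustration under assumptions: the file path, constant, and function names are placeholders, not Aider's or LiteLLM's actual code.

```python
# Hedged sketch of a 12-hour freshness check for a cached model-list backup.
# Names are hypothetical; only the staleness logic mirrors the comment above.
import os
import time

MAX_AGE_SECONDS = 12 * 60 * 60  # 12 hours


def backup_is_stale(path, now=None):
    """Return True if the backup file is missing or older than 12 hours."""
    if not os.path.exists(path):
        return True
    now = time.time() if now is None else now
    return (now - os.path.getmtime(path)) > MAX_AGE_SECONDS
```

On startup the caller would refresh the backup only when `backup_is_stale(path)` is True, avoiding a network fetch on every launch.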
What happened?
Tried to use gpt-3.5-turbo-instruct on Quivr
Relevant log output