Update proxy_cli.py #4325
Conversation
Fixed indentation so we don't get an `UnboundLocalError`. Fixes BerriAI#4324
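For context, this is the class of bug being fixed: a minimal sketch (names are illustrative, not litellm's actual code) of how a mis-indented line triggers `UnboundLocalError`:

```python
# Hypothetical sketch; `run_server` and `database_url` are illustrative
# names, not litellm's actual code.
def run_server(use_db: bool):
    if use_db:
        database_url = "postgresql://localhost/litellm"
    # BUG: this line should be indented under the `if`; because it is not,
    # it runs even when `database_url` was never assigned.
    print(f"connecting to {database_url}")

run_server(use_db=False)  # raises UnboundLocalError for 'database_url'
```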
Hey @vanpelt how were you triggering this? Wondering how none of our testing caught it
@vanpelt - any suggestions on how we can test this code path? We have several tests on starting the proxy + using the docker image (https://app.circleci.com/pipelines/github/BerriAI/litellm/11477/workflows/1b43c6d3-d33f-4828-b9c2-73815a48191a/jobs/26679) - figuring out what we can do better here.
Hey guys, appreciate the enthusiasm around getting this case covered in tests. I wasn't doing anything funky. My use case is simply to start the proxy with a config that adds standard models like Claude or Mistral, authenticated via API keys from the default environment path. I think a simple integration test that loaded a config like the following would do it:

```yaml
model_list:
  - model_name: claude-3-opus
    litellm_params:
      model: claude-3-opus-20240229
  - model_name: claude-3-5-sonnet
    litellm_params:
      model: claude-3-5-sonnet-20240620
  - model_name: claude-3-haiku
    litellm_params:
      model: claude-3-haiku-20240307
```

It might have been the fact that
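A hedged sketch of what the integration test suggested above could look like. The config path, port, and health endpoint below are assumptions, not confirmed litellm internals:

```python
# Hypothetical test; assumes the config above is saved as claude_config.yaml
# and that the `litellm` CLI is on PATH.
import subprocess
import time
import urllib.request

def test_proxy_starts_with_claude_config():
    proc = subprocess.Popen(
        ["litellm", "--config", "claude_config.yaml", "--port", "4000"]
    )
    try:
        time.sleep(10)  # give the server time to boot
        # If the CLI hit an UnboundLocalError on startup, the process is dead.
        assert proc.poll() is None, "proxy exited during startup"
        # Liveness probe; the exact endpoint path is an assumption.
        with urllib.request.urlopen("http://localhost:4000/health/liveliness") as resp:
            assert resp.status == 200
    finally:
        proc.terminate()
        proc.wait()
```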
we found the root cause - adding a test + making sure linting runs on proxy_cli.py to avoid this occurring again
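For reference, one way to catch this class of bug statically is pylint's `used-before-assignment` check; a hedged sketch (the tool choice is an assumption, not necessarily what the maintainers added):

```python
# Hypothetical lint step; the maintainers' actual setup may differ.
import subprocess

result = subprocess.run(
    [
        "pylint",
        "--disable=all",
        "--enable=used-before-assignment,undefined-variable",
        "litellm/proxy/proxy_cli.py",
    ],
    capture_output=True,
    text=True,
)
assert result.returncode == 0, result.stdout
```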
For the record, I'm a big fan and am really impressed with the work you guys are doing. Much gratitude ❤️. This project is making it much easier to develop AI applications and I'm looking forward to seeing it grow! Keep up the great work 🙏
Thanks @vanpelt! Curious - do you use litellm proxy in production?
I'm using it in my fun side project: https://github.com/wandb/openui, it's currently in a branch. I'm gonna release it as a native feature tomorrow. As a founder that sells AI software, happy to meet if you want to geek out on sales motion 😜
Title
Un-bork indentation
Relevant issues
Fixes #4324
Type
🐛 Bug Fix
Changes
Fixed indentation
[REQUIRED] Testing - Attach a screenshot of any new tests passing locally
If UI changes, send a screenshot/GIF of working UI fixes
No testing, did this via the GH UI.