
Update proxy_cli.py #4325

Merged: 1 commit merged into BerriAI:main on Jun 21, 2024

Conversation

vanpelt (Contributor) commented Jun 21, 2024

Fixed indentation so we don't get an UnboundLocalError. Fixes #4324

Title

Un-bork indentation

Relevant issues

Fixes #4324

Type

🐛 Bug Fix

Changes

Fix indentation

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

If UI changes, send a screenshot/GIF of working UI fixes

No testing, did this via the GH UI.

Fixed indentation so we don't get an `UnboundLocalError`.  Fixes BerriAI#4324
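
For readers unfamiliar with the failure mode: in Python, mis-indenting an assignment so that it only runs inside one conditional branch leaves the name unbound on the other path, and the later read raises UnboundLocalError. The snippet below illustrates that pattern only; it is not the actual proxy_cli.py code.

```python
# Illustration of the bug class only -- not the actual proxy_cli.py code.
def load_settings(use_config: bool):
    if use_config:
        # Accidentally indented into the branch, so it only runs when use_config is True.
        settings = {"source": "config"}
    # Raises UnboundLocalError when use_config is False, because `settings`
    # is a local name that was never assigned on this path.
    return settings

load_settings(False)
```
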
vercel bot commented Jun 21, 2024

The latest updates on your projects. Learn more about Vercel for Git ↗︎

| Name | Status | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| litellm | ✅ Ready (Inspect) | Visit Preview | 💬 Add feedback | Jun 21, 2024 0:51am |

krrishdholakia merged commit c36a2cc into BerriAI:main on Jun 21, 2024
2 of 3 checks passed
krrishdholakia (Contributor) commented Jun 21, 2024

Hey @vanpelt, how were you triggering this?

Wondering how none of our testing caught it.

  • Do you use a secret manager today?

ishaan-jaff (Contributor) commented:

@vanpelt - any suggestions on how we can test this code path? We have several tests on starting the proxy + using the docker image: https://app.circleci.com/pipelines/github/BerriAI/litellm/11477/workflows/1b43c6d3-d33f-4828-b9c2-73815a48191a/jobs/26679

Figuring out what we can do better here.

vanpelt (Contributor, Author) commented Jun 21, 2024

Hey guys, appreciate the enthusiasm around getting this case covered in tests. I wasn't doing anything funky. My use case is simply starting the proxy with a config that adds standard models like Claude or Mistral, authenticated with API keys picked up from the default environment variables.

I think a simple integration test that loaded a config like the following would do it:

model_list:
- model_name: claude-3-opus
  litellm_params:
    model: claude-3-opus-20240229
- model_name: claude-3-5-sonnet
  litellm_params:
    model: claude-3-5-sonnet-20240620
- model_name: claude-3-haiku
  litellm_params:
    model: claude-3-haiku-20240307

It might have been the fact that claude-3-5-sonnet-20240620 was released today...
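
A rough sketch of what that integration test could look like (an editorial example, not something from this thread): it writes the config above to a temp file via pytest's tmp_path fixture, starts the proxy with the `litellm --config` CLI, and polls a health endpoint until it answers. The `/health/liveliness` path, the `--port` flag, and the dummy ANTHROPIC_API_KEY are assumptions.

```python
# Sketch of the suggested integration test (assumptions noted above).
import os
import subprocess
import time
import urllib.request

CONFIG = """
model_list:
- model_name: claude-3-opus
  litellm_params:
    model: claude-3-opus-20240229
- model_name: claude-3-5-sonnet
  litellm_params:
    model: claude-3-5-sonnet-20240620
- model_name: claude-3-haiku
  litellm_params:
    model: claude-3-haiku-20240307
"""


def test_proxy_starts_with_config(tmp_path):
    config_path = tmp_path / "config.yaml"
    config_path.write_text(CONFIG)

    # A dummy key should be enough here: the regression was in config parsing at
    # startup, before any request is ever sent to Anthropic.
    env = {**os.environ, "ANTHROPIC_API_KEY": "sk-dummy"}
    proc = subprocess.Popen(
        ["litellm", "--config", str(config_path), "--port", "4010"], env=env
    )
    try:
        # Poll until the proxy answers its liveness probe, or give up after ~30s.
        for _ in range(30):
            try:
                with urllib.request.urlopen(
                    "http://127.0.0.1:4010/health/liveliness", timeout=2
                ) as resp:
                    assert resp.status == 200
                    return
            except OSError:
                time.sleep(1)
        raise AssertionError("proxy never became healthy")
    finally:
        proc.terminate()
        proc.wait(timeout=10)
```
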

vanpelt deleted the patch-3 branch on June 21, 2024 at 03:15
ishaan-jaff (Contributor) commented:

We found the root cause. We're adding a test and making sure linting runs on proxy_cli.py to avoid this occurring again.
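
One way such a lint gate could be wired up (a hypothetical sketch; the thread doesn't name a tool, and the pylint check and file path here are assumptions): pylint's used-before-assignment check (E0601) flags exactly this class of bug, so a small test could fail CI if it fires on proxy_cli.py.

```python
# Hypothetical CI guard (not from the thread): fail if pylint's
# used-before-assignment check fires on proxy_cli.py.
import subprocess


def test_proxy_cli_has_no_unbound_locals():
    result = subprocess.run(
        [
            "pylint",
            "--disable=all",
            "--enable=used-before-assignment",
            "litellm/proxy/proxy_cli.py",  # path assumed; adjust to the repo layout
        ],
        capture_output=True,
        text=True,
    )
    assert result.returncode == 0, result.stdout
```
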

vanpelt (Contributor, Author) commented Jun 21, 2024

For the record, I'm a big fan and am really impressed with the work you guys are doing. Much gratitude ❤️ . This project is making it much easier to develop AI applications and I'm looking forward to seeing it grow! Keep up the great work 🙏

krrishdholakia (Contributor) commented:

Thanks @vanpelt! Curious - do you use litellm proxy in production?

vanpelt (Contributor, Author) commented Jun 21, 2024

I'm using it in my fun side project: https://github.com/wandb/openui. It's currently in a branch; I'm gonna release it as a native feature tomorrow.

As a founder that sells AI software, happy to meet if you want to geek out on sales motion 😜

Linked issue closed by this PR: [Bug]: The latest release broke the proxy command when specifying a config (#4324)