
enable max tokens for anthropic/claude-3-sonnet-20240229 #1094

Closed
bjornjorgensen opened this issue Mar 7, 2024 · 15 comments · Fixed by #1107
Comments

@bjornjorgensen
Contributor

bjornjorgensen commented Mar 7, 2024

Bug Report

Can't change max tokens when running anthropic/claude-3-sonnet-20240229,
so the text is cut off.

Description

[screenshot]

Bug Summary:
[Provide a brief but clear summary of the bug]

Steps to Reproduce:
[Outline the steps to reproduce the bug. Be as detailed as possible.]
Add anthropic/claude-3-sonnet-20240229.

[screenshot]

Ask for something bigger than 2 + 2?

Expected Behavior:
[Describe what you expected to happen.]

Actual Behavior:
[Describe what actually happened.]

Environment

  • Operating System: [e.g., Windows 10, macOS Big Sur, Ubuntu 20.04]
  • k8s I am using dev image. ghcr.io/open-webui/open-webui:dev
  • Browser (if applicable): [e.g., Chrome 100.0, Firefox 98.0]
Brave

Reproduction Details

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.

Logs and Screenshots

Browser Console Logs:
[Include relevant browser console logs, if applicable]

Docker Container Logs:
[Include relevant Docker container logs, if applicable]

Screenshots (if applicable):
[Attach any relevant screenshots to help illustrate the issue]

Installation Method

[Describe the method you used to install the project, e.g., manual installation, Docker, package manager, etc.]

Additional Information

[Include any additional details that may help in understanding and reproducing the issue. This could include specific configurations, error messages, or anything else relevant to the bug.]

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!

@tjbck
Contributor

tjbck commented Mar 8, 2024

I have a suspicion that it's an issue on the LiteLLM-side, could you try isolating the issue by trying with LiteLLM only? Keep us updated!

@justinh-rahb
Collaborator

> I have a suspicion that it's an issue on the LiteLLM-side, could you try isolating the issue by trying with LiteLLM only? Keep us updated!

I'm not so sure that it's LiteLLM to blame here, I've tried now with an older version of it and the same behaviour is happening on Claude 2 as well, from other clients than WebUI. I believe this behavior started the day that Claude 3 was released. By chance @bjornjorgensen are you using an Anthropic developer account like mine? I am beginning to wonder if they simply limit max_tokens now for free dev keys.

@justinh-rahb
Collaborator

Update: now I'm not so sure where the blame lies. Testing in another chat app works fine with Claude 3 endpoints. So it could indeed be a LiteLLM issue, but it also wasn't working with older versions of it that previously did work. Really need someone with actual paid API keys to test this, I think.

@bjornjorgensen
Contributor Author

It is a free key that I'm using.
[screenshot]

It does, however, work on:
[screenshot]

So it's not the key that's the problem.

@justinh-rahb
Collaborator

Yes, I just tried it with Lobechat as well and got a full response. So the ball seems to be back in LiteLLM's court, but I need to do further testing with components in isolation to be really certain of this.

@justinh-rahb
Collaborator

So after further testing... I can only observe this happening when WebUI is involved. So it seems it may be something to do with our code; I just cannot at the moment nail down what it could possibly be. It seemingly only affects the Claude API via LiteLLM in Open WebUI.

@bjornjorgensen
Contributor Author

Are there any configs that override the number of tokens it can print in outputs?

@justinh-rahb
Collaborator

@bjornjorgensen nah, in our testing we've checked that we're not sending anything that would limit the max tokens, but nonetheless the API response says the stop reason is length, which would indicate that it's been given a limit and reached it... very strange. Still being investigated, and I hope we'll have an answer soon!
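For reference, the truncation symptom described here can be detected programmatically. A minimal sketch, assuming an OpenAI-compatible response dict as returned through LiteLLM, where a hit token limit is reported as `finish_reason == "length"` (Anthropic's native API reports `stop_reason == "max_tokens"` for the same condition); the sample responses are hypothetical:

```python
def is_truncated(response):
    """Return True if any choice stopped because it hit the token limit.

    Assumes an OpenAI-compatible response shape, where the stop reason
    surfaces as finish_reason == "length".
    """
    return any(
        choice.get("finish_reason") == "length"
        for choice in response.get("choices", [])
    )

# Hypothetical responses, for illustration only:
cut_off = {"choices": [{"message": {"content": "2 + 2 ="},
                        "finish_reason": "length"}]}
complete = {"choices": [{"message": {"content": "2 + 2 = 4"},
                         "finish_reason": "stop"}]}
print(is_truncated(cut_off))   # True
print(is_truncated(complete))  # False
```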

@justinh-rahb
Collaborator

Ladies and gentlemen, we got 'em. Claude's API now requires that the max_tokens param be sent in the payload, and LiteLLM will set a default of 256 tokens if you don't specify it. Currently the WebUI does not send a max_tokens param when using external APIs, so the proposed fix would be to add that feature, or to allow this parameter override to be set in the LiteLLM configuration UI. For now, it can be worked around by mounting and modifying the config.yaml file as follows:

- litellm_params:
    api_key: your_api_key
    model: anthropic/claude-3-sonnet-20240229
    max_tokens: 4096
  model_info:
    id: 810226a0-61e2-4d97-9de0-822bd4300fcd
  model_name: claude-3-sonnet

Note: the maximum value is 4096, you'll get an error from Anthropic's API if you request more.

@tjbck
Contributor

tjbck commented Mar 8, 2024

@justinh-rahb
Collaborator

justinh-rahb commented Mar 8, 2024

v0.1.111 (not merged to :main yet) has a new field in the LiteLLM UI to configure the max_tokens parameter override, which will make modifying your config.yaml by hand unnecessary. This can be tested now in the :dev branch.

@tjbck tjbck mentioned this issue Mar 9, 2024
@tjbck tjbck linked a pull request Mar 9, 2024 that will close this issue
@tjbck
Contributor

tjbck commented Mar 10, 2024

max_tokens: 4096 should now be explicitly set from the settings!

@bjornjorgensen
Contributor Author

Yes, I had to delete the old one and add it back... but now it works :)
Thanks

@bjornjorgensen
Contributor Author

Hmm... I'm having some issues today with the dev images. I can't see what's wrong there, but when I try main it works.
However, I deleted my storage for Open WebUI and now I have to set everything up again. I added Claude 3 Opus without setting anything other than the model name and API key.
[screenshot]

So must I set max_tokens: 4096 when I use Claude 3 models? If so, it should be documented in a README somewhere.

@justinh-rahb
Collaborator

justinh-rahb commented Mar 22, 2024

@bjornjorgensen I haven't migrated this to the docs site yet; there's a thread:

max_tokens must be 4096 to get the most out of the Claude API, as noted there.
