"max context token" not work for anthropic models. #3447

If you're setting the max context tokens this low, you need to lower the max output tokens to match. It's recommended you use a preset for this as well, so you don't have to set it every time.

From your screenshot, you have max output tokens set to 8192, so with your 2000-token context limit the remaining prompt budget comes out negative: 2000 - 8192 = -6192.

If you want to give the model 1500 response tokens, your settings should be:

Max Context Tokens: 2000
Max Output Tokens: 1500

This in turn limits the user prompt, together with any previous messages, to 500 tokens.
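
To make the arithmetic explicit, here is a minimal sketch (the function name is illustrative, not a LibreChat internal): the prompt budget is the context limit minus the tokens reserved for output, and a negative result means the two settings conflict.

```python
# Illustrative sketch of the token budgeting described above;
# the function name is hypothetical, not part of LibreChat.

def prompt_budget(max_context_tokens: int, max_output_tokens: int) -> int:
    """Tokens left for the user prompt plus any previous messages."""
    return max_context_tokens - max_output_tokens

# Recommended settings: 2000 context - 1500 output = 500 tokens for the prompt.
print(prompt_budget(2000, 1500))  # 500

# Settings from the screenshot: 2000 - 8192 = -6192; the output reservation
# alone exceeds the context limit, so the request can never be satisfied.
print(prompt_budget(2000, 8192))  # -6192
```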

Replies: 1 comment · 4 replies

Answer selected by 2p990i9hpral

This discussion was converted from issue #3442 on July 24, 2024 16:33.