
"Input validation error" on https://huggingface.co/chat when using llama-3-70b #1149

rotemdan opened this issue May 16, 2024 · 1 comment

rotemdan commented May 16, 2024

> Input validation error: `inputs` tokens + `max_new_tokens` must be <= 8192. Given: 6204 `inputs` tokens and 2047 `max_new_tokens`

[Screenshot of the error message: Screenshot_178]

This error occurs again and again, even if I edit the message down to something very short or refresh the page. My initial message was relatively long (source code) and failed with this error. Even after reducing the message to a single short sentence, I still get the exact same error: 6204 input tokens and 2047 `max_new_tokens`.

It seems the conversation got a bit long, but I wouldn't expect that to produce an error for every message I try to send?
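For reference, the check behind this error can be sketched as follows. This is an illustrative reconstruction, not chat-ui's or the inference server's actual code; the 8192 limit and token counts are taken from the error message above. It also illustrates why editing the last message doesn't help: if the full conversation history is sent as the prompt, shortening one message barely changes the input token count.

```python
# Illustrative sketch of the token-budget validation described by the error.
# Assumption: the backend rejects requests where the prompt tokens plus the
# generation budget exceed the model's context window.

MAX_TOTAL_TOKENS = 8192  # context window reported in the error message


def validate_request(input_token_count: int, max_new_tokens: int) -> None:
    """Raise if prompt tokens + generation budget exceed the context window."""
    total = input_token_count + max_new_tokens
    if total > MAX_TOTAL_TOKENS:
        raise ValueError(
            f"Input validation error: inputs tokens + max_new_tokens must be "
            f"<= {MAX_TOTAL_TOKENS}. Given: {input_token_count} inputs tokens "
            f"and {max_new_tokens} max_new_tokens"
        )


# The reported failure: the whole conversation history (6204 tokens) is
# resent on every request, so 6204 + 2047 = 8251 > 8192 and the check fails
# regardless of how short the newest message is.
```

With the reported numbers, `validate_request(6204, 2047)` raises, while a fresh conversation with a short prompt, e.g. `validate_request(50, 2047)`, passes.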

@adolfousier

This has actually been under discussion for a while:
https://huggingface.co/spaces/huggingchat/chat-ui/discussions/430
