[Bug] Huggingface backend seems to be broken #42
I've identified an issue with the Huggingface backend's generation function: the output unexpectedly retains the prompt portion because special tokens are not omitted. To resolve this, truncating the generated text at the length of the initial input prompt corrects the output, yielding results that align more closely with those from vLLM-backed models.
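The fix described above can be sketched as a small post-processing step; this is an illustrative helper (the function name `strip_prompt` is hypothetical, not from the project's codebase), assuming the decoded generation starts with the original prompt text:

```python
def strip_prompt(prompt: str, generated: str) -> str:
    """Drop the echoed prompt from a decoded HF generation.

    Huggingface's generate() returns token sequences that include the
    input prompt; if the decoded string still begins with the prompt,
    slice it off at the prompt's length, as the issue suggests.
    """
    if generated.startswith(prompt):
        return generated[len(prompt):]
    # Already prompt-free (e.g. a vLLM-style backend); return unchanged.
    return generated
```

An equivalent token-level approach is to slice the output tensor at the input length (`outputs[:, input_ids.shape[-1]:]`) before decoding, which avoids string-matching edge cases.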
After the change:
I can submit a PR for you guys if you wish.
Excellent! A PR will be much appreciated!
@zyzzzz-123 Thanks for the investigation and I have fixed that. :)
#41