Generation config commented out? #38
The main reason is that I copied it from HuggingFace's transformers library code, but either (1) no supported model uses it, or (2) it hasn't been implemented yet. The reason it's commented out (and not, well, deleted) is that I plan to add it :) For example, the …
Got it, feel free to close this issue then. It would be great to be able to use the params you mentioned!
Is there a rough timeline for implementing additional config options, like the three you mentioned above? Just curious, since it would be very useful for my project. I could also take a look at implementing them if no work has started on that yet. I saw the Python code here: https://huggingface.co/transformers/v4.1.1/internal/generation_utils.html#transformers.RepetitionPenaltyLogitsProcessor
It would be awesome if you could try implementing it! I'm currently adding new model types, so I'd only be able to get to this in about a week or so. Here's the original implementation code from HF: https://github.com/huggingface/transformers/blob/main/src/transformers/generation/logits_process.py and this library's implementation is here: https://github.com/xenova/transformers.js/blob/main/src/generation.js The conversion shouldn't be too difficult, but if you have any questions, feel free to ask!
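For reference, a port of HF's repetition penalty processor to JavaScript could look roughly like this. This is only a sketch following the Python RepetitionPenaltyLogitsProcessor semantics; the class and method names here are illustrative, not this library's actual API:

```javascript
// Hypothetical sketch of a repetition penalty logits processor, modeled on
// HF's Python RepetitionPenaltyLogitsProcessor. Names are illustrative.
class RepetitionPenaltyLogitsProcessor {
    constructor(penalty) {
        if (!(typeof penalty === 'number' && penalty > 0)) {
            throw new Error('penalty must be a strictly positive number');
        }
        this.penalty = penalty;
    }

    // For each token id already generated, make it less likely to repeat:
    // divide a positive logit by the penalty, multiply a negative one.
    call(inputIds, logits) {
        for (const id of new Set(inputIds)) {
            logits[id] = logits[id] < 0
                ? logits[id] * this.penalty
                : logits[id] / this.penalty;
        }
        return logits;
    }
}
```

With penalty 2.0 and previously generated ids [1, 2], a logit of 4.0 at index 1 becomes 2.0 and a logit of -3.0 at index 2 becomes -6.0, while untouched positions keep their values.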
Great, I'll give it a shot. |
Original issue: Is there a reason why usage of several generation config params is commented out in the _get_logits_processor function on line 370 of models.js?
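To make the question concrete: a _get_logits_processor-style helper typically inspects the generation config, collects the processors whose parameters are set, and applies them in sequence to the logits. A minimal hypothetical sketch of that pattern, with illustrative names that are not the real models.js code:

```javascript
// Hypothetical sketch of how a _get_logits_processor-style helper might
// assemble active processors from a generation config. Names are
// illustrative, not the actual transformers.js implementation.
class LogitsProcessorList {
    constructor() { this.processors = []; }
    push(p) { this.processors.push(p); }
    // Apply each processor in order to the raw logits.
    call(inputIds, logits) {
        return this.processors.reduce((l, p) => p.call(inputIds, l), logits);
    }
}

class MinLengthLogitsProcessor {
    constructor(minLength, eosTokenId) {
        this.minLength = minLength;
        this.eosTokenId = eosTokenId;
    }
    // Forbid the EOS token until the sequence reaches minLength.
    call(inputIds, logits) {
        if (inputIds.length < this.minLength) {
            logits[this.eosTokenId] = -Infinity;
        }
        return logits;
    }
}

function getLogitsProcessor(config) {
    const list = new LogitsProcessorList();
    if (config.min_length !== undefined && config.eos_token_id !== undefined) {
        list.push(new MinLengthLogitsProcessor(config.min_length, config.eos_token_id));
    }
    // Commented-out params (repetition penalty, no-repeat n-gram, ...)
    // would each get an `if (config.xxx) list.push(...)` branch here
    // once implemented.
    return list;
}
```

Each commented-out config param in the question would correspond to one such conditional branch that pushes its processor onto the list.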