
Generation config commented out? #38

Closed
spencekim opened this issue Mar 21, 2023 · 5 comments

Comments

@spencekim

Is there a reason why several generation config params are commented out in the _get_logits_processor function on line 370 of models.js?

_get_logits_processor(
        generation_config,
        input_ids_seq_length,
        // encoder_input_ids, TODO
        // prefix_allowed_tokens_fn, TODO
        logits_processor = null
    ) {
        const processors = new LogitsProcessorList();

        // if (generation_config.diversity_penalty !== null && generation_config.diversity_penalty > 0.0) {
        //     processors.push(new HammingDiversityLogitsProcessor(
        //         generation_config.diversity_penalty,
        //         generation_config.num_beams,
        //         generation_config.num_beam_groups
        //     ));
        // }
@xenova
Owner

xenova commented Mar 21, 2023

The main reason is that I copied this code from Hugging Face's transformers library, but either (1) no supported model uses it, or (2) it hasn't been implemented yet.

The reason it's commented out (and not, well, deleted), is because I plan to add it :) For example, the RepetitionPenaltyLogitsProcessor, NoRepeatNGramLogitsProcessor, and MinLengthLogitsProcessor classes will be coming soon!
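For a sense of what one of those processors involves, here is a minimal sketch of the repetition-penalty rule from Hugging Face's RepetitionPenaltyLogitsProcessor, written against plain arrays. The function name and signature are illustrative assumptions, not this library's actual API:

```javascript
// Sketch of the repetition-penalty rule (hypothetical helper, not
// the transformers.js API): tokens that already appear in the input
// are made less likely by dividing positive logits and multiplying
// negative ones by the penalty, so both move toward lower probability.
function applyRepetitionPenalty(inputIds, logits, penalty) {
    // Use a Set so a token repeated in inputIds is only penalized once,
    // matching the gather/scatter behavior of the Python implementation.
    for (const tokenId of new Set(inputIds)) {
        if (logits[tokenId] < 0) {
            logits[tokenId] *= penalty;
        } else {
            logits[tokenId] /= penalty;
        }
    }
    return logits;
}
```

With penalty = 2, a previously generated token with logit 2 drops to 1, and one with logit -2 drops to -4.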

@spencekim
Author

Got it, feel free to close this issue then. Would be great to be able to use the params you mentioned!

@xenova xenova closed this as completed Mar 21, 2023
@spencekim
Author

spencekim commented Mar 22, 2023

Is there a rough timeline for implementing additional config options, like the three you mentioned above? Just curious, since they would be very useful for my project. I could also take a look at implementing them if no work has started on that.

I saw the Python code here: https://huggingface.co/transformers/v4.1.1/internal/generation_utils.html#transformers.RepetitionPenaltyLogitsProcessor

@xenova
Owner

xenova commented Mar 22, 2023

That would be awesome if you could try implementing it! I'm currently adding new model types, so I'd only be able to get to this in about a week.

Here's the original implementation code from HF: https://github.com/huggingface/transformers/blob/main/src/transformers/generation/logits_process.py

and this library's implementation is here: https://github.com/xenova/transformers.js/blob/main/src/generation.js

The conversion shouldn't be too difficult, but if you have any questions, feel free to ask!
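As an example of how small such a port can be, here is a hedged sketch of the logic behind Hugging Face's MinLengthLogitsProcessor in plain JavaScript. The function name and parameters are assumptions for illustration, not the signatures used in generation.js:

```javascript
// Sketch of the min-length rule (hypothetical helper, not the
// transformers.js API): while the generated sequence is shorter than
// minLength, forbid the end-of-sequence token by setting its logit
// to -Infinity, so sampling/argmax can never pick it.
function applyMinLength(inputIdsLength, logits, minLength, eosTokenId) {
    if (inputIdsLength < minLength) {
        logits[eosTokenId] = -Infinity;
    }
    return logits;
}
```

The Python original does the same thing with a tensor assignment; the port is mostly a matter of replacing tensor ops with loops or typed-array indexing.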

@spencekim
Author

Great, I'll give it a shot.
