Issues: huggingface/transformers

- Caching Past Key values of any length for Vision LLMs [Feature request] (#31096, opened May 28, 2024 by saikoneru)
- Python suffix stripping in transformers.dynamic_model_utils.get_class_in_module strips additional characters [bug, Should Fix] (#31061, opened May 27, 2024 by gmastrapas; 2 of 4 tasks complete)
- Checkpoint saving by different evaluation criteria [Feature request] (#31049, opened May 27, 2024 by daehuikim)
- Add sanity validation steps [Feature request] (#31047, opened May 26, 2024 by dhruvbpai)
- Whether OutEffHop can be supported in Transformers [Feature request] (#31046, opened May 26, 2024 by robinzixuan)
- Llama tokenizer inconsistency for the newline character in convert_tokens_to_ids [Core: Tokenization] (#31030, opened May 25, 2024 by JackCai1206; 2 of 4 tasks complete)