Issues: huggingface/transformers
Community contribution: Adding Flash Attention 2 support for more architectures
Good Second Issue
#26350 opened Sep 22, 2023 by younesbelkada · 19 of 24 tasks
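For context, a minimal sketch of what enabling Flash Attention 2 looks like once an architecture supports it in recent transformers versions; the checkpoint name is only illustrative, and the flash-attn package plus a supported GPU are assumed.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-hf"  # illustrative; any architecture with FA2 support

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,                 # FA2 requires fp16 or bf16
    attn_implementation="flash_attention_2",   # opt in to the Flash Attention 2 path
).to("cuda")

inputs = tokenizer("Hello, my name is", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))
```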
[Community Event] Doc Tests Sprint
Good First Issue
#16292 opened Mar 21, 2022 by patrickvonplaten · 100+ comments
Transformers documentation translation to Italian
Documentation, WIP
#17459 opened May 27, 2022 by mfumanelli · 13 of 41 tasks
Model Parallelism and Big Models
Model Parallel, WIP
#8771 opened Nov 24, 2020 by alexorona
Uniform kwargs for processors
contributions-welcome, Good Second Issue
#31911 opened Jul 11, 2024 by zucchini-nlp · 39 of 40 tasks
Community contribution: enable dynamic resolution input for more vision models.
Good First Issue, Vision
#30579 opened Apr 30, 2024 by amyeroberts · 10 of 11 tasks
Finetuning Whisper with prompts
Feature request
#24272 opened Jun 14, 2023 by AvivSham
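For context, Whisper already exposes prompting at inference time; this issue asks for the same mechanism during fine-tuning. A minimal sketch of the existing inference-side API, assuming the datasets library is available for a sample audio clip:

```python
from datasets import load_dataset
from transformers import WhisperForConditionalGeneration, WhisperProcessor

processor = WhisperProcessor.from_pretrained("openai/whisper-tiny")
model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny")

# Grab one audio sample from a small test dataset (illustrative only).
sample = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")[0]
input_features = processor(
    sample["audio"]["array"], sampling_rate=16_000, return_tensors="pt"
).input_features

# Condition decoding on a prompt (e.g. domain vocabulary or proper nouns).
prompt_ids = processor.tokenizer.get_prompt_ids("Mister Quilter", return_tensors="pt")
generated = model.generate(input_features, prompt_ids=prompt_ids)
print(processor.batch_decode(generated, skip_special_tokens=True))
```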
Running the run_mlm_flax on TPU v4 pods
WIP
#20252 opened Nov 16, 2022 by peregilk · 4 tasks
Fine-tuning GPT-J-6B in colab: 8-bit weights with low-rank adaptors
New model, Quantization
#14839 opened Dec 19, 2021 by dvmazur
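This issue predates the library's built-in quantization and adapter integrations; a rough sketch of how the same idea looks today with bitsandbytes and peft (both assumed installed, along with accelerate and a GPU):

```python
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

# Load GPT-J-6B with 8-bit weights (bitsandbytes; accelerate handles device placement).
model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/gpt-j-6B",
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),
    device_map="auto",
)

# Attach low-rank adapters to the attention projections.
lora_config = LoraConfig(
    r=8,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],  # GPT-J attention projection names
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```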
Community contribution: Adding GGUF support for more architectures
Feature request, Good Second Issue
#33260 opened Sep 2, 2024 by SunMarc · 12 of 15 tasks
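For context, a minimal sketch of the GGUF loading path this issue extends to more architectures; the repository and file names are illustrative, and the gguf package is assumed to be installed:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF"          # illustrative GGUF repo
gguf_file = "tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf"           # illustrative quantized file

# The gguf_file argument tells from_pretrained to dequantize the GGUF checkpoint
# into a regular transformers model and tokenizer.
tokenizer = AutoTokenizer.from_pretrained(repo_id, gguf_file=gguf_file)
model = AutoModelForCausalLM.from_pretrained(repo_id, gguf_file=gguf_file)
```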
Difference in LlamaAttention & LlamaFlashAttention2 attn_output
WIP
#27050 opened Oct 24, 2023 by ringohoffman · 4 tasks
Adding RelationExtraction head to layoutLMv2 and layoutXLM models
New model
#15451 opened Feb 1, 2022 by R0bk
Trying to add support for GPT2 as decoder in EncoderDecoder model
Core: Encoder-Decoder, Good First Issue
#4483 opened May 20, 2020 by dimi1357
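For reference, the setup this issue worked toward is now expressible directly with EncoderDecoderModel; a minimal sketch, with checkpoint names chosen only for illustration:

```python
from transformers import AutoTokenizer, EncoderDecoderModel

# BERT encoder + GPT-2 decoder; cross-attention layers are added to the decoder automatically.
model = EncoderDecoderModel.from_encoder_decoder_pretrained("bert-base-uncased", "gpt2")

# Generation needs a decoder start token; GPT-2's BOS token is the usual choice,
# and GPT-2 has no pad token, so EOS is commonly reused for padding.
decoder_tokenizer = AutoTokenizer.from_pretrained("gpt2")
model.config.decoder_start_token_id = decoder_tokenizer.bos_token_id
model.config.pad_token_id = decoder_tokenizer.eos_token_id
```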
Is there any plan to add kosmos-2 to transformers?
New model
#24671 opened Jul 5, 2023 by BIGBALLON · 2 tasks done
The same situation as #31377 occurred when using Qwen/Qwen2-VL-7B-Instruct
bug, Cache, Multimodal
#33399 opened Sep 10, 2024 by toondata · 3 of 4 tasks
Add in-layer TF Tokenizer to BPE tokenizers
WIP
#19992 opened Oct 31, 2022 by piEsposito
[Whisper] TypeError: '<=' not supported between instances of 'NoneType' and 'float'
Audio, bug, Core: Tokenization
#33552 opened Sep 18, 2024 by felipehertzer · 4 tasks
Add training support for EnCodec
Feature request, Good Second Issue
#24295 opened Jun 15, 2023 by ArthurZucker
New and better T5 checkpoints from scaling transformers paper
New model
#15467 opened Feb 1, 2022 by Xirider · 3 tasks done
safetensor/mmap memory leak when per-layer weights are converted to other dtypes
bug, contributions-welcome, Core: Modeling, Quantization
#34366 opened Oct 24, 2024 by Qubitium · 2 of 4 tasks
Add support for BLIP and GIT in image-to-text and VQA pipelines
Good First Issue
#21110 opened Jan 13, 2023 by NielsRogge
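For context, a minimal sketch of the pipeline support this issue tracked, shown with BLIP checkpoints; the image URL is illustrative:

```python
from transformers import pipeline

url = "http://images.cocodataset.org/val2017/000000039769.jpg"  # illustrative test image

# Image captioning via the image-to-text pipeline.
captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")
print(captioner(url))

# Visual question answering via the VQA pipeline.
vqa = pipeline("visual-question-answering", model="Salesforce/blip-vqa-base")
print(vqa(image=url, question="How many cats are there?"))
```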