Issues: huggingface/transformers
macOS: register_pytree_node got an unexpected keyword argument 'flatten_with_keys_fn'
bug · #36906 · opened Mar 22, 2025 by CloseChoice
PixtralVisionModel does not support Flash Attention 2.0 yet
Feature request · #36904 · opened Mar 22, 2025 by xihuai18
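A minimal sketch of the reported limitation; the checkpoint id is an assumed stand-in, and only the class name and error come from the title:

```python
from transformers import PixtralVisionModel

# Requesting Flash Attention 2.0 on a model class that does not implement it
# is expected to raise a ValueError at load time.
model = PixtralVisionModel.from_pretrained(
    "mistral-community/pixtral-12b",  # hypothetical stand-in checkpoint
    attn_implementation="flash_attention_2",
)
```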
Warning: "No label_names provided for PeftModel" persists despite dataset containing "labels" column
bug
#36902
opened Mar 22, 2025 by
Septemberlemon
2 of 4 tasks
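A sketch of the commonly suggested workaround: pass label_names explicitly, since the Trainer normally infers them from the model's forward signature, which fails for wrapped PEFT models. Whether this actually silences the warning in the reporter's setup is what the issue questions:

```python
from transformers import TrainingArguments

# Passing label_names explicitly bypasses the signature-based inference
# that emits the warning for PeftModel wrappers (output_dir is a placeholder).
args = TrainingArguments(output_dir="out", label_names=["labels"])
```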
GPT2Model output inconsistency between different transformers versions
bug · #36897 · opened Mar 22, 2025 by wenzhong2005
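A sketch of the kind of check behind such a report: run the same deterministic forward pass under each installed version and diff the saved tensors (the input text and file naming are illustrative, not from the issue):

```python
import torch
import transformers
from transformers import GPT2Model, GPT2Tokenizer

tok = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2Model.from_pretrained("gpt2").eval()

inputs = tok("hello world", return_tensors="pt")
with torch.no_grad():
    out = model(**inputs)

# Save hidden states so runs under different versions can be compared offline.
torch.save(out.last_hidden_state, f"gpt2_hidden_{transformers.__version__}.pt")
```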
Forced to hit UserWarning when generating with temperature=0
bug · #36896 · opened Mar 21, 2025 by jamesbraza
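A minimal sketch of the reported situation, assuming a generic causal LM (gpt2 here purely for illustration):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tok("Hello", return_tensors="pt")
# temperature=0 expresses "be deterministic", but generate() reportedly emits a
# UserWarning about the unused sampling flag regardless, which is the complaint.
out = model.generate(**inputs, do_sample=False, temperature=0.0, max_new_tokens=5)
print(tok.decode(out[0]))
```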
Florence2 stopped working after upgrade to 4.50.0 ("Unrecognized configuration class")
bug · #36886 · opened Mar 21, 2025 by senarvi
Qwen2-VL-7B-Instruct shape error when using TP=4
bug · #36875 · opened Mar 21, 2025 by KimmiShi
Optimize tokenizer.decode() Performance for List[int] Inputs
Feature request · #36872 · opened Mar 21, 2025 by n0gu-furiosa
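A sketch of the call path the request targets: decoding a plain List[int] (the gpt2 tokenizer and sample text are assumptions for illustration):

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
ids = tok.encode("The quick brown fox jumps over the lazy dog")  # -> List[int]

# The feature request concerns the performance of exactly this decode path.
text = tok.decode(ids, skip_special_tokens=True)
print(text)
```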
Facing runtime AttributeError while running different Flax models for RoFormer
bug · Flax · #36854 · opened Mar 20, 2025 by ctr-pmuruganTT
Unable to load google/siglip2-so400m-patch14-384/
bug · #36845 · opened Mar 20, 2025 by SHYuanBest
GOT-OCR2 docs indicate model can produce markdown, but it only produces LaTeX.
#36836 · opened Mar 19, 2025 by piercelamb
Build for Windows and VS 2022 does not compile CUDA sources
bug · #36830 · opened Mar 19, 2025 by JRGit4UE
Need Option to Disable Flash Attention in VideoLLaMA2.1-7B-AV (SiglipVisionModel)
#36819 · opened Mar 19, 2025 by harshmoothat
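One possible workaround sketch: request a non-flash attention backend at load time. "eager" and "sdpa" are the standard alternatives accepted by from_pretrained; the checkpoint id is a stand-in, and whether the reporter's model honors this is the open question:

```python
from transformers import AutoModel

# Load with the plain eager attention implementation instead of flash attention.
model = AutoModel.from_pretrained(
    "google/siglip-so400m-patch14-384",  # hypothetical stand-in checkpoint
    attn_implementation="eager",
)
```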
Gemma3 can't be fine-tuned on multi-image examples
bug · #36816 · opened Mar 19, 2025 by FredrikNoren
When using BF16 or FP16 for LoRA fine-tuning of Gemma-3-12B-it, saving a checkpoint raises an error, but FP32 works normally
bug · #36814 · opened Mar 19, 2025 by Fluchw