typing: add rule 14 - checks for tie_word_embeddings presence#44988
tarekziade wants to merge 6 commits into main
Conversation
Force-pushed from b0005e2 to bf1e985
run-slow: ibert

This comment contains models: ["models/ibert"]
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Force-pushed from bf1e985 to e3c3d20
Cyrilvallez left a comment:
Just tried it in the real world and it does not work correctly all the time!
E.g. if _tied_weights_keys is in the main model but tie_word_embeddings is in the text config, it passes even though it should not!
Also, when it correctly finds the missing pattern, it would be nice for the error to say which config type should be updated, instead of the general: TRF014: AlignVisionModel defines _tied_weights_keys but configuration_align.py does not declare tie_word_embeddings. Add 'tie_word_embeddings: bool = True' to the config class.
It would be super nice to have the reverse as well: if you have tie_word_embeddings in a config, the model using THIS AND ONLY THIS config type should have _tied_weights_keys; otherwise let's remove tie_word_embeddings.
ah! that's the
That would be a very nice addition indeed. I will see how we could templatize the error messages.
👍
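The "reverse" check proposed above (a config declaring tie_word_embeddings whose model never defines _tied_weights_keys) could be sketched roughly as follows. This is a minimal illustration using the ast module, not the PR's implementation; all function names here are hypothetical.

```python
# Hypothetical sketch of the proposed "twin" rule: if a config class declares
# tie_word_embeddings, a model class using that exact config type should
# define _tied_weights_keys; otherwise suggest removing the flag.
import ast


def configs_declaring_tie(config_source: str) -> list[str]:
    """Names of config classes whose __init__ accepts tie_word_embeddings."""
    out = []
    for node in ast.walk(ast.parse(config_source)):
        if isinstance(node, ast.ClassDef):
            for stmt in node.body:
                if isinstance(stmt, ast.FunctionDef) and stmt.name == "__init__":
                    all_args = stmt.args.args + stmt.args.kwonlyargs
                    if any(a.arg == "tie_word_embeddings" for a in all_args):
                        out.append(node.name)
    return out


def model_ties_weights_for_config(model_source: str, config_name: str) -> bool:
    """True if a class with config_class = <config_name> defines _tied_weights_keys."""
    for node in ast.walk(ast.parse(model_source)):
        if isinstance(node, ast.ClassDef):
            uses_config = has_tied = False
            for stmt in node.body:
                if isinstance(stmt, ast.Assign):
                    for t in stmt.targets:
                        if isinstance(t, ast.Name) and t.id == "config_class" \
                                and isinstance(stmt.value, ast.Name) \
                                and stmt.value.id == config_name:
                            uses_config = True
                        if isinstance(t, ast.Name) and t.id == "_tied_weights_keys":
                            has_tied = True
            if uses_config and has_tied:
                return True
    return False


def reverse_rule(model_source: str, config_source: str) -> list[str]:
    """Config classes that declare tie_word_embeddings without a tying model."""
    return [cfg for cfg in configs_declaring_tie(config_source)
            if not model_ties_weights_for_config(model_source, cfg)]
```

As discussed below, this would live in a separate twin rule rather than being folded into Rule 14 itself.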
Force-pushed from 3721886 to 9aa263e
[For maintainers] Suggested jobs to run (before merge) run-slow: blip, ibert, janus, kosmos2, kosmos2_5, pe_audio_video, perception_lm, qwen3_omni_moe
Let's do this in a separate twin rule; it would be too complex to add in this one.
What does this PR do?
Adds Rule 14 (TRF014), which checks that any model defining _tied_weights_keys has tie_word_embeddings declared in its configuration file.
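A minimal sketch of what such a check could look like, built on the ast module. This is illustrative only: names like check_trf014 are hypothetical and the PR's actual implementation may differ.

```python
# Hypothetical sketch of Rule 14 (TRF014): flag model classes that define
# _tied_weights_keys when the matching configuration source never mentions
# tie_word_embeddings. Function names are illustrative, not from the PR.
import ast


def model_classes_with_tied_weights(model_source: str) -> list[str]:
    """Names of classes that assign _tied_weights_keys."""
    hits = []
    for node in ast.walk(ast.parse(model_source)):
        if isinstance(node, ast.ClassDef):
            for stmt in node.body:
                targets = []
                if isinstance(stmt, ast.Assign):
                    targets = stmt.targets
                elif isinstance(stmt, ast.AnnAssign):
                    targets = [stmt.target]
                if any(isinstance(t, ast.Name) and t.id == "_tied_weights_keys"
                       for t in targets):
                    hits.append(node.name)
    return hits


def config_declares_tie(config_source: str) -> bool:
    """True if the config source mentions tie_word_embeddings anywhere
    (as an __init__ argument, a name, or an attribute access)."""
    for node in ast.walk(ast.parse(config_source)):
        if isinstance(node, ast.arg) and node.arg == "tie_word_embeddings":
            return True
        if isinstance(node, ast.Name) and node.id == "tie_word_embeddings":
            return True
        if isinstance(node, ast.Attribute) and node.attr == "tie_word_embeddings":
            return True
    return False


def check_trf014(model_source: str, config_source: str,
                 config_name: str = "configuration_x.py") -> list[str]:
    """Return one TRF014 message per offending model class."""
    offenders = model_classes_with_tied_weights(model_source)
    if offenders and not config_declares_tie(config_source):
        return [f"TRF014: {cls} defines _tied_weights_keys but {config_name} "
                f"does not declare tie_word_embeddings. Add "
                f"'tie_word_embeddings: bool = True' to the config class."
                for cls in offenders]
    return []
```

Note that, per the review above, a check at this granularity still has to match the model against the right sub-config (e.g. a text config nested in a multimodal config), which a whole-file scan like this does not capture.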