
Conversation

ydshieh (Collaborator) commented Sep 4, 2025

What does this PR do?

See https://app.circleci.com/pipelines/github/huggingface/transformers/144943/workflows/f4bc060c-b21f-4528-9393-9fadc27fe49e/jobs/1916024

This is specific to T5GemmaForTokenClassification (and maybe T5GemmaForSequenceClassification too), as they specify

self.model(use_cache=False, ...)

and go through

        if attention_mask is None and past_key_values is None:
            attention_mask = make_default_2d_attention_mask(input_ids, inputs_embeds, self.config.pad_token_id)

The logic is correct, but a sequence with no position to attend to is problematic for the numerical-equivalence check between eager and SDPA attention.
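A minimal sketch (plain Python standing in for the attention softmax, not the transformers code) of why a fully masked row is unstable: when every score in a row is -inf, every softmax term becomes NaN, so eager and SDPA paths have no well-defined value to agree on.

```python
import math

def softmax(xs):
    # Standard max-subtracted softmax. With an all -inf row, the max is
    # -inf, so x - m is (-inf) - (-inf) = nan and every term is nan.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

masked_row = [float("-inf")] * 4  # a row with no position to attend to
probs = softmax(masked_row)
print(probs)  # every entry is nan
```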

@ydshieh ydshieh changed the title from Avoid `` being flaky to Avoid T5GemmaModelTest::test_eager_matches_sdpa_inference being flaky on Sep 4, 2025
        # Avoid leading PAD tokens from inputs.
        input_ids = torch.where(input_ids == self.bos_token_id, 42, input_ids)
        decoder_input_ids = torch.where(decoder_input_ids == self.bos_token_id, 42, decoder_input_ids)
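The fix can be sketched in isolation (a hypothetical pure-Python analogue of the torch.where lines above; the concrete token ids are made up for illustration): any BOS id in the random test inputs is remapped to an ordinary id, so no sequence starts with a token the default attention mask could treat as padding.

```python
BOS_TOKEN_ID = 1  # hypothetical id; in the tester this is self.bos_token_id

def remap_bos(ids, replacement=42):
    # Replace every BOS id with a harmless ordinary token id,
    # mirroring torch.where(ids == bos_token_id, 42, ids).
    return [replacement if tok == BOS_TOKEN_ID else tok for tok in ids]

print(remap_bos([1, 5, 7, 1, 9]))  # -> [42, 5, 7, 42, 9]
```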
ydshieh (Collaborator, Author) commented:

we should probably force this for all model tester classes


github-actions bot commented Sep 4, 2025

[For maintainers] Suggested jobs to run (before merge)

run-slow: t5gemma

manueldeprada (Contributor) left a comment:

😮 nice detective work!! ofc, argmax over -inf mask is very unstable, I remember chasing something similar.

I agree we should search for similar cases. Maybe add a check in PretrainedModel.forward() that the sum of key projections is >>> -inf? And then locally run the test suite and see if that condition triggers?
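A hypothetical sketch of such a check (plain Python over raw attention scores; the function name and data shape are assumptions, not the transformers API): flag any score row that is entirely -inf before the softmax, since that is the pattern behind this flakiness.

```python
import math

def has_fully_masked_row(scores):
    # scores: list of attention-score rows (one row per query position).
    # A row that is entirely -inf has no position to attend to, and
    # softmax over it yields NaN.
    return any(
        all(math.isinf(s) and s < 0 for s in row)
        for row in scores
    )

rows = [
    [0.1, float("-inf"), 0.3],  # fine: at least one finite score
    [float("-inf")] * 3,        # problematic: fully masked
]
print(has_fully_masked_row(rows))  # -> True
```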

ydshieh (Collaborator, Author) commented Sep 4, 2025

Leave as a TODO, got to sleep now 😴

Thanks a lot for approving.

@ydshieh ydshieh enabled auto-merge (squash) September 4, 2025 20:44
@ydshieh ydshieh merged commit 16b821c into main Sep 4, 2025
18 checks passed
@ydshieh ydshieh deleted the fix_t5gemma_3 branch September 4, 2025 20:44
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
