
Fix decoder_attention_mask None handling in generation utils#45985

Open

damodharg6 wants to merge 1 commit into huggingface:main from damodharg6:fix-decoder-attention-mask-bug

Conversation

@damodharg6

Summary

This PR fixes a potential issue in generation/utils.py where decoder_attention_mask could be accessed before proper validation.

Changes

  • Replaced direct dictionary access with:
    model_kwargs.get("decoder_attention_mask", None)
  • Added a None check before applying torch.cat(...)
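The guarded-access pattern described in the two bullets above can be sketched as follows. This is a minimal illustration, not the actual code in generation/utils.py: the helper name update_decoder_attention_mask and the one-token mask extension are assumptions made for the example.

```python
import torch


def update_decoder_attention_mask(model_kwargs):
    # Use .get() so a missing key yields None instead of raising KeyError.
    decoder_attention_mask = model_kwargs.get("decoder_attention_mask", None)

    # Only extend the mask when it was actually provided.
    if decoder_attention_mask is not None:
        # Append a column of ones for the newly generated token position.
        model_kwargs["decoder_attention_mask"] = torch.cat(
            [
                decoder_attention_mask,
                decoder_attention_mask.new_ones(
                    (decoder_attention_mask.shape[0], 1)
                ),
            ],
            dim=-1,
        )
    return model_kwargs
```

With this guard, calling the helper without a decoder_attention_mask entry is a no-op, while a provided mask of shape (batch, seq_len) grows to (batch, seq_len + 1).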

Motivation

This prevents failures during generation/evaluation workflows when decoder_attention_mask is not provided.

@Rocketknight1
Member

Hey, can you give us some sample code that shows how the issue can be triggered? We get a lot of agent fixes that don't fix actual bugs, so we'd like to see a reproducer!

@damodharg6
Author

damodharg6 commented May 15, 2026 via email

@Rocketknight1
Member

That code snippet runs fine for me.

@damodharg6
Author

damodharg6 commented May 15, 2026 via email



3 participants