
Attention with past and no unidirectional mask#5557

Merged
tianleiwu merged 5 commits into master from tlwu/no_unidir_mask
Oct 22, 2020

Conversation

@tianleiwu
Contributor

@tianleiwu tianleiwu commented Oct 21, 2020

Description: Some user models have past state, but the mask is not unidirectional (lower triangular); for example, all elements of the mask are 1. Update the Attention op and fusion to support this case.
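A minimal numpy sketch (not the actual ORT kernel; function name, shapes, and parameters are illustrative assumptions) contrasting a unidirectional (lower-triangular) mask with the all-ones mask case when past key/value state is present:

```python
# Illustrative sketch, not the ONNX Runtime implementation.
# Contrasts a unidirectional (lower-triangular) attention mask with an
# all-ones mask when `past_len` cached positions precede the new tokens.
import numpy as np

def attention_mask(seq_len, past_len, unidirectional):
    total = past_len + seq_len
    if unidirectional:
        # Each new token attends only to positions up to itself
        # (including all past positions): lower-triangular pattern,
        # sliced to the rows for the new tokens.
        return np.tril(np.ones((total, total)))[past_len:, :]
    # All-ones mask: every new token attends to every position,
    # past and current alike -- the case this PR adds support for.
    return np.ones((seq_len, total))

print(attention_mask(seq_len=2, past_len=3, unidirectional=True))
print(attention_mask(seq_len=2, past_len=3, unidirectional=False))
```

With `seq_len=2` and `past_len=3`, the unidirectional mask zeroes out the upper-right entry (the first new token cannot see the second), while the all-ones mask allows full attention across all five positions.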

Also, handle the subgraph pattern in which multiple Shape nodes are merged into one.

Motivation and Context

  • Why is this change required? What problem does it solve?
  • If it fixes an open issue, please link to the issue here.

@tianleiwu tianleiwu requested a review from a team as a code owner October 21, 2020 00:18
@tianleiwu tianleiwu marked this pull request as draft October 21, 2020 00:19
@tianleiwu tianleiwu changed the title WIP: support attention with past and not unidirectional mask Attention with past and no unidirectional mask Oct 21, 2020
@tianleiwu tianleiwu marked this pull request as ready for review October 21, 2020 06:53
@tianleiwu tianleiwu merged commit 1f304fb into master Oct 22, 2020
@tianleiwu tianleiwu deleted the tlwu/no_unidir_mask branch October 22, 2020 03:12