
Make the self-attention operator replaceable in Transformer #334

Merged: 3 commits merged into dev from (refactor)flexible_transformer_layers on Apr 2, 2024

Conversation

WenjieDu
Owner

@WenjieDu commented on Apr 2, 2024

What does this PR do?

  1. Fixes Make Transformer layers more flexible #333; see the sketch below for the intended design.
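
For context, here is a minimal sketch of the replaceable-operator pattern this PR enables. The class and argument names (`AttentionOperator`, `ScaledDotProductAttention`, `attn_opt`) are illustrative assumptions and may not match the exact API in pypots/nn/modules/transformer:

```python
# Illustrative sketch only: names and signatures are assumptions, not necessarily
# the exact PyPOTS API touched by this PR.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionOperator(nn.Module):
    """Base class: maps (q, k, v, attn_mask) to (output, attention weights)."""

    def forward(self, q, k, v, attn_mask=None, **kwargs):
        raise NotImplementedError


class ScaledDotProductAttention(AttentionOperator):
    """The default operator: vanilla scaled dot-product attention."""

    def __init__(self, temperature: float, attn_dropout: float = 0.1):
        super().__init__()
        self.temperature = temperature
        self.dropout = nn.Dropout(attn_dropout)

    def forward(self, q, k, v, attn_mask=None, **kwargs):
        # q, k: [batch, n_heads, seq_len, d_k], v: [batch, n_heads, seq_len, d_v]
        attn = torch.matmul(q / self.temperature, k.transpose(-2, -1))
        if attn_mask is not None:
            attn = attn.masked_fill(attn_mask == 0, -1e9)
        attn = self.dropout(F.softmax(attn, dim=-1))
        return torch.matmul(attn, v), attn


class MultiHeadAttention(nn.Module):
    """Multi-head wrapper that receives the attention operator via its constructor,
    so a custom operator (e.g. a diagonally-masked or sparse variant) can be swapped
    in without rewriting the projection logic."""

    def __init__(self, attn_opt: AttentionOperator, d_model: int, n_heads: int, d_k: int, d_v: int):
        super().__init__()
        self.n_heads, self.d_k, self.d_v = n_heads, d_k, d_v
        self.w_qs = nn.Linear(d_model, n_heads * d_k, bias=False)
        self.w_ks = nn.Linear(d_model, n_heads * d_k, bias=False)
        self.w_vs = nn.Linear(d_model, n_heads * d_v, bias=False)
        self.attention = attn_opt  # the replaceable part
        self.fc = nn.Linear(n_heads * d_v, d_model, bias=False)

    def forward(self, q, k, v, attn_mask=None):
        batch_size, q_len = q.size(0), q.size(1)
        q = self.w_qs(q).view(batch_size, q_len, self.n_heads, self.d_k).transpose(1, 2)
        k = self.w_ks(k).view(batch_size, k.size(1), self.n_heads, self.d_k).transpose(1, 2)
        v = self.w_vs(v).view(batch_size, v.size(1), self.n_heads, self.d_v).transpose(1, 2)
        out, attn = self.attention(q, k, v, attn_mask=attn_mask)
        out = out.transpose(1, 2).contiguous().view(batch_size, q_len, -1)
        return self.fc(out), attn
```

With this layout, a model would construct its layers as, for example, `MultiHeadAttention(ScaledDotProductAttention(d_k ** 0.5), d_model, n_heads, d_k, d_v)`, or pass any other `AttentionOperator` subclass instead, which is presumably the flexibility asked for in #333.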

Before submitting

  • This PR is made to fix a typo or improve the docs (you can dismiss the other checks if this is the case).
  • Was this discussed/approved via a GitHub issue? Please add a link to it if that's the case.
  • I have commented my code, particularly in hard-to-understand areas.
  • I have written the necessary tests and already run them locally.

@coveralls
Collaborator

coveralls commented on Apr 2, 2024

Pull Request Test Coverage Report for Build 8517997717

Details

  • 19 of the 22 (86.36%) changed or added relevant lines across 7 files are covered.
  • 2 unchanged lines in 1 file lost coverage.
  • Overall coverage increased (+0.03%) to 80.64%

Changes missing coverage (covered lines / changed or added lines, %):

  • pypots/nn/modules/transformer/attention.py: 10 / 11 (90.91%)
  • pypots/nn/modules/transformer/layers.py: 2 / 4 (50.0%)

Files with coverage reduction (new missed lines, %):

  • pypots/nn/modules/transformer/layers.py: 2 (83.67%)

Totals:

  • Change from base Build 8510512186: +0.03%
  • Covered Lines: 6598
  • Relevant Lines: 8182

💛 - Coveralls

@WenjieDu merged commit f7425e7 into dev on Apr 2, 2024
15 of 16 checks passed
@WenjieDu deleted the (refactor)flexible_transformer_layers branch on April 2, 2024 at 08:38