
feat: add Transformer encoder, decoder, and decoder_layer; #172

Merged
WenjieDu merged 1 commit into dev from complete_transformer_moduels on Aug 19, 2023

Conversation

WenjieDu
Owner

What does this PR do?

Fixing #171

Before submitting

  • Was this discussed/approved via a GitHub issue? Please add a link to it if that's the case.
  • I have commented my code, particularly in hard-to-understand areas.
  • I have written necessary tests and already run them locally.
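
For context, the decoder layer this PR adds to pypots/modules/self_attention.py follows the standard Transformer design: masked self-attention, cross-attention over the encoder output, and a position-wise feed-forward network, each with residual connections and layer normalization. The snippet below is a minimal illustrative sketch only, not the actual PyPOTS implementation; the class name `DecoderLayer`, the dimensions, and the use of `torch.nn.MultiheadAttention` are assumptions for illustration.

```python
import torch
import torch.nn as nn


class DecoderLayer(nn.Module):
    """Illustrative sketch of a Transformer decoder layer (not the PyPOTS code)."""

    def __init__(self, d_model: int = 256, n_heads: int = 4, d_ffn: int = 512, dropout: float = 0.1):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, dropout=dropout, batch_first=True)
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, dropout=dropout, batch_first=True)
        self.ffn = nn.Sequential(
            nn.Linear(d_model, d_ffn),
            nn.ReLU(),
            nn.Linear(d_ffn, d_model),
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.norm3 = nn.LayerNorm(d_model)
        self.dropout = nn.Dropout(dropout)

    def forward(self, tgt, memory, tgt_mask=None):
        # masked self-attention over the decoder input
        attn_out, _ = self.self_attn(tgt, tgt, tgt, attn_mask=tgt_mask)
        tgt = self.norm1(tgt + self.dropout(attn_out))
        # cross-attention: queries from the decoder, keys/values from the encoder output
        attn_out, _ = self.cross_attn(tgt, memory, memory)
        tgt = self.norm2(tgt + self.dropout(attn_out))
        # position-wise feed-forward block
        tgt = self.norm3(tgt + self.dropout(self.ffn(tgt)))
        return tgt


# Example usage (shapes are arbitrary):
layer = DecoderLayer(d_model=64, n_heads=4, d_ffn=128)
tgt = torch.randn(8, 24, 64)     # (batch, target steps, d_model)
memory = torch.randn(8, 48, 64)  # encoder output
out = layer(tgt, memory)         # -> (8, 24, 64)
```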

@coveralls
Collaborator

coveralls commented Aug 19, 2023

Pull Request Test Coverage Report for Build 5910317953

  • 57 of 90 (63.33%) changed or added relevant lines in 2 files are covered.
  • 4 unchanged lines in 2 files lost coverage.
  • Overall coverage decreased (-0.8%) to 83.383%

Changes Missing Coverage            Covered Lines   Changed/Added Lines   %
pypots/modules/self_attention.py    38              71                    53.52%

Files with Coverage Reduction       New Missed Lines   %
pypots/modules/self_attention.py    1                  73.85%
pypots/cli/doc.py                   3                  82.28%

Totals
Change from base Build 5899489151: -0.8%
Covered Lines: 3106
Relevant Lines: 3725

💛 - Coveralls

WenjieDu closed this Aug 19, 2023
WenjieDu deleted the complete_transformer_moduels branch August 19, 2023 08:40
WenjieDu restored the complete_transformer_moduels branch August 19, 2023 08:40
WenjieDu reopened this Aug 19, 2023
WenjieDu merged commit 69bc79b into dev Aug 19, 2023
16 of 30 checks passed
WenjieDu deleted the complete_transformer_moduels branch August 19, 2023 08:41