
Fix AttentionExplainer for AttentiveFP #8244

Merged: 3 commits into master on Oct 22, 2023
Conversation

@rusty1s (Member) commented Oct 22, 2023

No description provided.
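The PR title says the change fixes `AttentionExplainer` for `AttentiveFP`, i.e. it lets the explainer read out the model's attention coefficients to build an edge importance mask. As a conceptual illustration only (not PyG's actual implementation; the function name and the `max`/`mean` reductions here are assumptions), an attention-based edge mask can be derived by reducing per-layer attention scores edge-wise:

```python
# Hedged sketch: reduce per-layer attention coefficients into a single
# edge mask, in the spirit of AttentionExplainer. The function name and
# the choice of reductions are illustrative assumptions, not PyG's API.

def edge_mask_from_attention(alphas, reduce="max"):
    """Collapse attention coefficients into one score per edge.

    `alphas` is a list with one entry per attention layer; each entry is
    a list of floats aligned with a shared edge ordering.
    """
    num_edges = len(alphas[0])
    assert all(len(layer) == num_edges for layer in alphas)
    mask = []
    for e in range(num_edges):
        scores = [layer[e] for layer in alphas]
        if reduce == "max":
            mask.append(max(scores))
        elif reduce == "mean":
            mask.append(sum(scores) / len(scores))
        else:
            raise ValueError(f"unknown reduction: {reduce}")
    return mask

# Two attention layers over three edges:
alphas = [[0.1, 0.7, 0.2], [0.3, 0.5, 0.9]]
print(edge_mask_from_attention(alphas))  # [0.3, 0.7, 0.9]
```

The `max` reduction keeps an edge important if any layer attended to it strongly; `mean` trades that for smoother scores across layers.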

codecov bot commented Oct 22, 2023

Codecov Report

Merging #8244 (2e1c1cc) into master (114ddca) will decrease coverage by 0.69%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master    #8244      +/-   ##
==========================================
- Coverage   88.09%   87.40%   -0.69%     
==========================================
  Files         473      473              
  Lines       28635    28634       -1     
==========================================
- Hits        25225    25028     -197     
- Misses       3410     3606     +196     
Files Coverage Δ
...geometric/explain/algorithm/attention_explainer.py 93.22% <100.00%> (+1.69%) ⬆️
torch_geometric/nn/models/attentive_fp.py 96.00% <100.00%> (-4.00%) ⬇️

... and 36 files with indirect coverage changes


@rusty1s rusty1s merged commit 405ef2c into master Oct 22, 2023
16 checks passed
@rusty1s rusty1s deleted the update_attention_expl branch October 22, 2023 07:55