
Fix Pre-trained DimeNet++ performance on QM9 #8239

Merged: 3 commits merged into pyg-team:master on Oct 21, 2023

Conversation

@xnuohz (Contributor) commented Oct 21, 2023

Fix #4698

Target: 00, MAE: 0.02975 ± 0.05869
Target: 01, MAE: 0.04322 ± 0.15740
Target: 02, MAE: 24.43286 ± 35.56775
Target: 03, MAE: 19.42164 ± 31.41084
Target: 05, MAE: 0.28941 ± 0.66366
Target: 06, MAE: 1.21997 ± 2.13234
Target: 07, MAE: 6.15220 ± 13.26018
Target: 08, MAE: 6.20371 ± 12.61268
Target: 09, MAE: 6.48553 ± 14.85413
Target: 10, MAE: 7.41481 ± 14.50145
Target: 11, MAE: 0.02268 ± 0.02604

@codecov (codecov bot) commented Oct 21, 2023

Codecov Report

Merging #8239 (dfe371e) into master (cd6d1f2) will decrease coverage by 0.69%.
Report is 2 commits behind head on master.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master    #8239      +/-   ##
==========================================
- Coverage   88.07%   87.39%   -0.69%     
==========================================
  Files         473      473              
  Lines       28615    28617       +2     
==========================================
- Hits        25203    25010     -193     
- Misses       3412     3607     +195     
| Files | Coverage Δ |
|---|---|
| torch_geometric/nn/models/dimenet.py | 98.18% <100.00%> (+0.02%) ⬆️ |

... and 37 files with indirect coverage changes


@rusty1s (Member) commented Oct 21, 2023

Thank you. I would assume this change will break DimeNet, given the difference in their implementations?

@xnuohz (Contributor, Author) commented Oct 21, 2023

Actually, it shouldn't. I'll keep working on the fix.

@xnuohz (Contributor, Author) commented Oct 21, 2023

@rusty1s I checked the original code again (see dimenet and dimenet++), and the author indeed used different angle calculations for the two models. This seems inconsistent with the paper, but we simply align with the author's code.

@rusty1s (Member) commented Oct 21, 2023

Yeah, can we somehow condition that on which DimeNet version we are using?
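A minimal, dependency-free sketch of what such version-conditioned angle computation could look like. The vector choices and the `plus_plus` flag below are illustrative assumptions, not the actual PyG API: the two variants are assumed to differ only in which reference vector the triplet angle (i, j, k) is measured against, with the standard atan2-of-cross-and-dot formulation.

```python
import math

def sub(u, v):
    """Component-wise difference u - v of two 3D points."""
    return tuple(a - b for a, b in zip(u, v))

def dot(u, v):
    """Dot product of two 3D vectors."""
    return sum(a * b for a, b in zip(u, v))

def cross_norm(u, v):
    """Euclidean norm of the cross product u x v."""
    cx = u[1] * v[2] - u[2] * v[1]
    cy = u[2] * v[0] - u[0] * v[2]
    cz = u[0] * v[1] - u[1] * v[0]
    return math.sqrt(cx * cx + cy * cy + cz * cz)

def triplet_angle(pos_i, pos_j, pos_k, plus_plus: bool) -> float:
    """Angle for the atom triplet (i, j, k), conditioned on the model
    variant. The exact vector choices here are a hypothetical sketch of
    how one branch per DimeNet version could look, not PyG's actual code."""
    pos_ji = sub(pos_j, pos_i)
    if plus_plus:
        ref = sub(pos_k, pos_j)  # assumed DimeNet++-style reference vector
    else:
        ref = sub(pos_k, pos_i)  # assumed DimeNet-style reference vector
    # atan2(|a x b|, a . b) is numerically stabler than acos of the
    # normalized dot product near 0 and pi.
    return math.atan2(cross_norm(pos_ji, ref), dot(pos_ji, ref))
```

For a right-angle configuration such as i=(0,0,0), j=(1,0,0), k=(0,1,0), the two branches return different angles (π/2 vs 3π/4 under the vector choices above), which illustrates why conditioning on the model version matters for matching pre-trained weights.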

@rusty1s rusty1s merged commit 9632694 into pyg-team:master Oct 21, 2023
16 checks passed
@xnuohz xnuohz deleted the fix-dimenetpp branch October 22, 2023 04:47

Linked issue (closed by this pull request): Bug: Pre-trained DimeNet++ performance on QM9 (#4698)