Remove pytorch warning message #828
Conversation
…ning message when running cbmr.
Codecov Report

Patch coverage — additional details and impacted files:

@@ Coverage Diff @@
##             main     #828      +/-   ##
==========================================
+ Coverage   88.67%   88.84%   +0.17%
==========================================
  Files          47       47
  Lines        6020     5997      -23
==========================================
- Hits         5338     5328      -10
+ Misses        682      669      -13

☔ View full report in Codecov by Sentry.
nimare/diagnostics.py (outdated diff):
-     if len(np.where((mask_ijk == coord).all(axis=1))[0])
- ]
+ # Only retain coordinates inside the brain mask
+ keep_idx = [i for i, coord in enumerate(dset_ijk) if masker_array[coord[0], coord[1], coord[2]] == 1]
nice! this is several orders of magnitude faster! 🚀
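The speedup comes from replacing a linear search (`np.where` over every mask voxel, repeated once per coordinate) with a direct O(1) index into the 3-D mask array. A minimal sketch of the idea, using toy data (the array names here are illustrative, not NiMARE's actual variables):

```python
import numpy as np

# Toy binary "brain" mask: 1 inside the brain, 0 outside.
mask = np.zeros((4, 4, 4), dtype=int)
mask[1:3, 1:3, 1:3] = 1

# Candidate voxel coordinates (i, j, k).
coords = np.array([[0, 0, 0], [1, 1, 1], [2, 2, 2], [3, 3, 3]])

# Direct indexing: one constant-time lookup per coordinate, instead of
# scanning the full list of in-mask voxel indices for each coordinate.
keep_idx = [i for i, c in enumerate(coords) if mask[c[0], c[1], c[2]] == 1]
# keep_idx -> [1, 2]
```

For N coordinates and M mask voxels this drops the cost from O(N·M) to O(N), which is where the orders-of-magnitude difference comes from.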
n_iter=2000,
lr=1,
lr_decay=0.999,
what's the rationale for the changes to the defaults? is it because of the changes to the optimization procedure?
the rationale is to take a full step on the initial iteration; the learning-rate decay then exponentially reduces the step size.
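The schedule described above can be sketched as a simple exponential decay (a hypothetical helper, not NiMARE's actual code): with `lr=1` the first update takes a full step, and each subsequent step is scaled by `lr_decay`:

```python
def step_size(lr=1.0, lr_decay=0.999, iteration=0):
    """Effective step size at a given iteration under exponential decay."""
    return lr * lr_decay ** iteration

# iteration 0 takes the full step; later iterations shrink geometrically.
print(step_size(iteration=0))    # 1.0
print(step_size(iteration=1))    # 0.999
```

With `lr_decay=0.999`, the step size halves roughly every 693 iterations (since 0.999^693 ≈ 0.5), so `n_iter=2000` still leaves meaningful steps late in the optimization.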
  lr=1,
  lr_decay=0.999,
- n_iter=1000,
- tol=1e-2,
+ n_iter=2000,
+ tol=1e-9,
same here, how were defaults decided?
LGTM!
Closes # .
Changes proposed in this pull request:
Replace `functorch.hessian` with `torch.func.hessian` in the CBMR code, to remove the warning message when the PyTorch version is >= 2.0.
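Background on the change: PyTorch 2.0 folded the standalone `functorch` package into `torch.func`, and importing the old `functorch` module emits a deprecation warning. A minimal, hedged sketch of how code can pick the right API from a version string (the helper name is hypothetical, and this is deliberately written without importing torch so the selection logic stands alone):

```python
def pick_hessian_api(torch_version: str) -> str:
    """Return the hessian API path appropriate for a given torch version.

    torch.func.hessian is available from PyTorch 2.0 onward; earlier
    releases provide the equivalent function as functorch.hessian.
    """
    major = int(torch_version.split(".")[0])
    return "torch.func.hessian" if major >= 2 else "functorch.hessian"

print(pick_hessian_api("2.1.0"))   # torch.func.hessian
print(pick_hessian_api("1.13.1"))  # functorch.hessian
```

In practice, a project that only supports PyTorch >= 2.0 (as this PR assumes) can simply call `torch.func.hessian` directly and drop the `functorch` import entirely.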