0.1.4.1 #504
Conversation
Codecov Report
@@ Coverage Diff @@
## master #504 +/- ##
==========================================
- Coverage 85.32% 85.28% -0.05%
==========================================
Files 199 199
Lines 9736 9750 +14
==========================================
+ Hits 8307 8315 +8
- Misses 1429 1435 +6
Continue to review full report at Codecov.
fluctuation = \
    np.float_power(np.mean(np.float_power(var, q / 2), axis=1), 1 / q.T)
np.seterr(**old_setting)
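For context, a minimal sketch of how the warning can arise in the line above. The shapes here are assumed for illustration only (`var` as per-window variances, `q` as a column of moment orders); a window that is detrended perfectly gives `var == 0`, and a negative `q` then makes `float_power` evaluate `0 ** -1`:

```python
import warnings

import numpy as np

# Illustrative inputs (not the actual NeuroKit2 data): one window with
# zero variance, and both a negative and a positive moment order q.
var = np.array([0.0, 1.0, 4.0])
q = np.array([[-2.0], [2.0]])

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    # 0.0 ** -1.0 inside float_power emits a "divide by zero" RuntimeWarning
    fluctuation = np.float_power(
        np.mean(np.float_power(var, q / 2), axis=1), 1 / q.T
    )

print([str(w.message) for w in caught])
```

The warning is only a warning: the computation still runs and produces `inf` for the negative-`q` branch, which is why it can go unnoticed on some datasets and flood the output on others.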
Suggested change:
- np.seterr(**old_setting)
+ np.seterr(**old_setting)  # restore previous setting
@@ -446,8 +448,11 @@ def _fractal_dfa_fluctuation(segments, trends, multifractal=False, q=2):
    var = np.var(detrended, axis=1)
    # obtain the fluctuation function, which is a function of the windows
    # and of q
    # ignore division by 0 warning
    old_setting = np.seterr(divide="ignore", invalid="ignore")
It's quite dangerous to play with global settings, though. Isn't there any other way?
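One scoped alternative (a sketch, not what the PR uses): NumPy's `np.errstate` context manager applies the same `divide`/`invalid` settings but restores the previous state automatically when the block exits, even if an exception is raised, so no manual `seterr`/restore pair is needed:

```python
import numpy as np

# np.errstate limits the change to the `with` block; the prior
# floating-point error settings are restored automatically on exit.
with np.errstate(divide="ignore", invalid="ignore"):
    result = np.log2(np.array([0.0, 1.0, 8.0]))  # no warning inside the block

# Outside the block, the default warning behaviour is back in effect.
```

This avoids the risk of leaving the global state modified if the code between `seterr` and the restore call raises.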
I didn't really see any warnings when running with individual data, but I saw some when running with a batch of data from multiple participants. I can't pinpoint the problem, though, so I thought about turning off the warning, and I restore the setting immediately afterwards. If you think we shouldn't fiddle with the settings, I can revert it. It's just a few too many warnings 😅
After reviewing the warnings, I noticed that the "division by zero" warning is reported at
NeuroKit/neurokit2/complexity/fractal_dfa.py, line 368 in abfd08d:
slopes[i] = np.polyfit(np.log2(windows), np.log2(fluctuations[:, i]), 1)[0]
even though it actually originates at
NeuroKit/neurokit2/complexity/fractal_dfa.py, line 450 in abfd08d:
np.float_power(np.mean(np.float_power(var, q / 2), axis=1), 1 / q.T)
So perhaps some instances return detrended values equal to zero, making the fluctuation zero (float_power of zero is undefined for negative exponents). That in turn leads to an undefined log value: passing zero to the log function apparently raises a "division by zero" warning too, as discussed here.
So instead of suppressing the warning, we could straight away return zero when the input is zero 😅
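The "return zero when the input is zero" idea could be sketched with a small helper like the one below. This is a hypothetical illustration, not the merged NeuroKit2 code: the helper name and the choice of 0.0 as the sentinel value are assumptions for the sketch.

```python
import numpy as np

def safe_log2(x):
    """Elementwise log2, mapping zero (and negative) inputs to 0.0.

    Hypothetical sketch of the 'return zero when input is zero' idea
    discussed above; it avoids np.log2(0) and its divide-by-zero warning
    without touching any global NumPy settings.
    """
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    positive = x > 0
    out[positive] = np.log2(x[positive])
    return out

fluctuations = np.array([0.0, 0.5, 2.0, 8.0])
print(safe_log2(fluctuations))  # zeros stay zero; no warning is emitted
```

One caveat of this approach: a zero fluctuation then becomes indistinguishable from a fluctuation of exactly 1 (whose log2 is also 0), so masking those windows out of the polyfit instead might be the cleaner fix.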
fix fractal_higuchi
Optimizing parameters for a list of signals
Sanitize maxbeat before creating window for dfa alpha2
@DominiqueMakowski do make a review before the patch version is merged
Co-authored-by: Dominique Makowski <dom.mak19@gmail.com>
Code Climate has analyzed commit 2fc854e and detected 0 issues on this pull request. View more on Code Climate.
No description provided.