
0.1.4.1 #504

Merged: merged 22 commits into master from dev on Aug 30, 2021
Conversation

DominiqueMakowski (Member)

No description provided.

codecov-commenter commented Aug 24, 2021

Codecov Report

Merging #504 (2fc854e) into master (abfd08d) will decrease coverage by 0.04%.
The diff coverage is 93.02%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master     #504      +/-   ##
==========================================
- Coverage   85.32%   85.28%   -0.05%     
==========================================
  Files         199      199              
  Lines        9736     9750      +14     
==========================================
+ Hits         8307     8315       +8     
- Misses       1429     1435       +6     
Impacted Files Coverage Δ
neurokit2/complexity/__init__.py 100.00% <ø> (ø)
neurokit2/signal/__init__.py 100.00% <ø> (ø)
neurokit2/complexity/complexity_optimize.py 48.68% <66.66%> (ø)
neurokit2/complexity/fractal_higuchi.py 94.84% <90.90%> (+0.59%) ⬆️
neurokit2/__init__.py 84.61% <100.00%> (ø)
neurokit2/complexity/fractal_dfa.py 93.24% <100.00%> (+0.18%) ⬆️
neurokit2/complexity/utils.py 91.13% <100.00%> (ø)
neurokit2/hrv/hrv_nonlinear.py 96.78% <100.00%> (ø)
neurokit2/rsp/rsp_simulate.py 94.64% <0.00%> (-5.36%) ⬇️

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update abfd08d...2fc854e.

fluctuation = \
np.float_power(np.mean(np.float_power(var, q / 2), axis=1), 1 / q.T)
np.seterr(**old_setting)
DominiqueMakowski (Member, Author)

Suggested change
np.seterr(**old_setting)
np.seterr(**old_setting) # restore previous setting

@@ -446,8 +448,11 @@ def _fractal_dfa_fluctuation(segments, trends, multifractal=False, q=2):
var = np.var(detrended, axis=1)
# obtain the fluctuation function, which is a function of the windows
# and of q
# ignore division by 0 warning
old_setting = np.seterr(divide="ignore", invalid="ignore")
DominiqueMakowski (Member, Author)

it's quite dangerous to play with settings tho, isn't there any other way?
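One alternative worth considering here (a sketch, not the code from this PR): NumPy's `np.errstate` context manager scopes the error-handling change to a single block and restores the previous settings automatically on exit, even if an exception is raised, so there is no risk of leaving the global state modified the way a forgotten `np.seterr` restore would:

```python
import numpy as np

var = np.array([0.0, 1.0, 4.0])  # example variances, just for illustration
q = 2.0

# np.errstate limits the error-handling change to the `with` block and
# restores the previous settings on exit, even if an exception is raised.
with np.errstate(divide="ignore", invalid="ignore"):
    fluctuation = np.float_power(np.mean(np.float_power(var, q / 2)), 1 / q)

print(fluctuation)  # sqrt of the mean variance, ~1.291 here
```

Functionally this is equivalent to the save/ignore/restore dance with `np.seterr`, but the restore cannot be skipped by an early return or an exception raised mid-computation.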

Tam-Pham (Member)

I didn't see any warnings when running with individual data, but I saw some when running with a batch of data from multiple participants. I can't pinpoint the problem tho, so I thought abt turning off the warning, and I restore the setting immediately after. Tho if you think we shouldn't fiddle with the settings, I can revert it. It's just a little too many warnings 😅

@Tam-Pham (Member) commented Aug 24, 2021

After reviewing the warnings, I noticed that the "division by zero" warning appeared in

slopes[i] = np.polyfit(np.log2(windows), np.log2(fluctuations[:, i]), 1)[0]

whenever the warning also occurred in

np.float_power(np.mean(np.float_power(var, q / 2), axis=1), 1 / q.T)

So perhaps some instances return detrended values equal to zero, the fluctuation is then returned as zero, and that leads to an undefined log value (passing zero to the log function apparently raises the "division by zero" warning too, as discussed here).

So instead of suppressing the warning, we can also straight away return zero when the input is zero 😅
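The return-zero-on-zero idea could be sketched like this (a hypothetical helper, mirroring the suggestion above; not the code that was merged): mask zero inputs before taking the log, so the warning never fires and no global settings are touched:

```python
import numpy as np

def safe_log2(x):
    """Elementwise log2(x), but 0 where x == 0.

    Hypothetical helper illustrating the suggestion above, not the
    code merged in this PR.
    """
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    nonzero = x > 0
    # log2 is only evaluated where it is defined, so no warning is emitted
    out[nonzero] = np.log2(x[nonzero])
    return out

print(safe_log2(np.array([0.0, 1.0, 8.0])))  # [0. 0. 3.]
```

One trade-off to keep in mind: this maps a zero fluctuation to the same value as a fluctuation of 1, so windows with zero variance silently contribute 0 to the `np.polyfit` slope; excluding those windows from the fit entirely may be the cleaner option.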

@pull-request-size pull-request-size bot added size/M and removed size/S labels Aug 26, 2021
Optimizing parameters for a list of signals
@pull-request-size pull-request-size bot added size/L and removed size/M labels Aug 26, 2021
Tam-Pham and others added 2 commits August 26, 2021 12:15
@DominiqueMakowski DominiqueMakowski changed the title 0.1.5 0.1.4.1 Aug 26, 2021
@Tam-Pham (Member)

@DominiqueMakowski do make a review before the patch version is merged

NEWS.rst (review comment outdated, resolved)
Co-authored-by: Dominique Makowski <dom.mak19@gmail.com>
codeclimate bot commented Aug 30, 2021

Code Climate has analyzed commit 2fc854e and detected 0 issues on this pull request.

View more on Code Climate.

@Tam-Pham Tam-Pham merged commit c110438 into master Aug 30, 2021
@Tam-Pham Tam-Pham deleted the dev branch August 30, 2021 04:48
@DominiqueMakowski DominiqueMakowski moved this from To do to Done in 📝 Documentation website Jun 22, 2022
@DominiqueMakowski DominiqueMakowski moved this from To do to Done in Complexity Jun 22, 2022
Development

Successfully merging this pull request may close these issues.

Error when computing nonlinear HRV parameters bug in hrv_nonlinear
4 participants