[MRG+1] more closely match the BayesShrink paper in _wavelet_threshold #2241
Conversation
Current coverage is 90.63% (diff: 96.92%)

```
@@           master   #2241   diff @@
======================================
  Files         304     304
  Lines       21475   21521    +46
  Methods         0       0
  Messages        0       0
  Branches     1846    1850     +4
======================================
+ Hits        19458   19505    +47
  Misses       1663    1663
+ Partials      354     353     -1
```
```python
# BayesShrink variance estimation doesn't work well on levels with
# extremely small coefficient arrays, so skip a few of the coarsest
# levels.
# Note: ref [1] used a fixed wavelet_levels = 4
```
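For context on the variance estimation this comment refers to: BayesShrink picks a separate threshold for each detail sub-band, T = σ²_noise / σ_signal, where the signal standard deviation is estimated by subtracting the noise variance from the observed sub-band variance. A minimal numpy sketch (the function name and the all-noise fallback are illustrative, not scikit-image's internals):

```python
import numpy as np

def bayes_shrink_threshold(details, noise_var):
    """BayesShrink threshold for one detail sub-band:
    T = noise_var / sigma_signal."""
    # Observed variance is approximately signal variance + noise
    # variance, so subtract and clip at zero.
    signal_var = max(np.mean(details ** 2) - noise_var, 0.0)
    if signal_var == 0.0:
        # Sub-band is estimated to be pure noise: threshold everything.
        return np.max(np.abs(details))
    return noise_var / np.sqrt(signal_var)

# Toy sub-band: unit-variance "signal" plus sigma=0.5 noise.
rng = np.random.RandomState(0)
subband = rng.normal(scale=1.0, size=1024) + rng.normal(scale=0.5, size=1024)
T = bayes_shrink_threshold(subband, noise_var=0.25)  # roughly 0.25
```

Because the estimate uses the sample variance of the sub-band, it becomes unreliable exactly when the coefficient array is tiny, which is why the coarsest levels are skipped above.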
Please, move this to the docstring Notes section.
The Travis failure was just a time-out on OS X.
@grlee77 Could you rebase please?
Force-pushed from ee45575 to c32ef87
```python
def _wavelet_threshold(img, wavelet, threshold=None, sigma=None, mode='soft',
                       wavelet_levels=None):
```
Shouldn't we expose this parameter in the outermost function?
I've made some remarks, but this PR looks good to me in general.
This matches what is done in the Chang et al. reference from the docstring.
…efficient arrays. This is necessary because these arrays may be extremely small, leading to unreliable variance estimation and noticeable block-like artifacts in the denoised image.
```python
    return pywt.waverecn(denoised_coeffs, wavelet)
```
```diff
 def denoise_wavelet(img, sigma=None, wavelet='db1', mode='soft',
-                    multichannel=False):
+                    multichannel=False, wavelet_levels=None):
     """Performs wavelet denoising on an image.
```
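For context, the `mode='soft'` default in this signature refers to soft (shrinkage) thresholding, which `pywt.threshold` also implements. A minimal numpy sketch of what is applied to each coefficient:

```python
import numpy as np

def soft_threshold(coeffs, T):
    # Shrink every coefficient toward zero by T; values with
    # magnitude below T become exactly zero. This matches the
    # behavior of pywt.threshold(..., mode='soft').
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - T, 0.0)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
soft_threshold(x, 1.0)  # -> [-1.,  0.,  0.,  0.,  1.]
```

Soft thresholding is the variant BayesShrink was derived for; hard thresholding keeps surviving coefficients unchanged instead of shrinking them.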
@grlee77 Please, set multichannel to be the last argument.
One more comment, then 👍.
Done. I also avoided duplicate code for the mean-absolute-deviation estimate of sigma by reusing […]. Is there a policy/convention that […]? Also, on another tangent, I noticed that all of the denoising tests use […].
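For readers following along: the sigma estimate mentioned above is conventionally the Donoho–Johnstone robust estimator based on the *median* absolute deviation of the finest-scale detail coefficients (despite the "mean-absolute-deviation" phrasing in the comment). A minimal numpy sketch, with an illustrative function name:

```python
import numpy as np

def estimate_sigma_mad(detail_coeffs):
    # Robust Gaussian noise estimate (Donoho & Johnstone): the median
    # absolute deviation of the finest detail coefficients, divided by
    # 0.6745 (the MAD of a zero-mean, unit-variance Gaussian).
    return np.median(np.abs(detail_coeffs)) / 0.6745

# On pure Gaussian noise the estimate should recover sigma closely.
rng = np.random.RandomState(42)
sigma = estimate_sigma_mad(rng.normal(scale=0.3, size=100000))  # ~0.3
```

The median makes the estimate insensitive to the sparse, large signal coefficients that would badly bias a plain standard deviation.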
@grlee77 Perfect, thanks! Regarding […]
@grlee77 thank you!
Description
pinging @stsievert, @JDWarner, @sciunto, @soupault from the discussion in #2190 for feedback on this one.
This small PR modifies the recently merged #2190 to more closely match the implementation of the Chang et al. IEEE paper mentioned in the docstring. I apologize for not being more active in reviewing that PR before it was merged. I think we either need to make these changes to match the reference, or change the docstring to match what is currently implemented.
I realize there is also ongoing work in #2240 related to color denoising.
There are no API changes proposed, but the resulting denoised images look much different than before.
The primary changes made to match the referenced article are:

- a separate BayesShrink threshold is computed for each detail sub-band, rather than a single threshold for all levels
- the coarsest decomposition levels, whose coefficient arrays are too small for reliable variance estimation, are skipped

I have not tested extensively whether using separate thresholds per sub-band is truly better, but if we cite the BayesShrink paper in the References we should more closely match the implementation in the citation.
Demonstration

Here is a quick demo for which I have pasted the image for current master as opposed to this PR.

Result with the code in this PR:

*(image)*

or when providing `sigma=0.9*sigma` as an argument to `_wavelet_threshold`:

*(image)*

It is still possible to supply `threshold` to get a single threshold used at all levels:

*(image)*

The default level of denoising produced by the master branch is much lower:

*(image)*