Bnb/bc smoothing #151
Conversation
@@ -64,6 +65,12 @@ def local_linear_bc(input, feature_name, bias_fp, lr_padded_slice,
        source shape will be used.
    out_range : None | tuple
        Option to set floor/ceiling values on the output data.
    smoothing : float
Why did you add this to the transforms module? Just so we don't have to recalc the bc factors, or so you can test a bunch of smoothing factors at runtime?
Also, the docstring isn't accurate: there's no inside/outside spatial domain or threshold input here.
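For reference, a hedged sketch of a docstring entry that would match what this function actually does (wording is hypothetical, based only on the diff context):

```python
smoothing : float
    Sigma value for a gaussian filter applied across the spatial
    dimensions of the bias correction factors at runtime. Applied
    uniformly; no inside/outside domain distinction is made here.
    0 disables smoothing.
```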
Yeah, so we don't have to recalc, which also allows me to test. I wasn't sure if we should just keep the smoothing here and remove the other.
One argument for only keeping the smoothing at the bias calc step: we'll be less likely to double-smooth. Consider that we will probably smooth the extrapolated extent, and it will be hard not to duplicate that smoothing in the transforms module.
For testing I would advocate for just writing a script outside of the repo that duplicates the bias correction files with different smoothing factors, as sketched below.
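A minimal sketch of what such a script might look like, assuming the bias correction factors are stored as (lat, lon, time) datasets in an HDF5 file; the paths, dataset names, and smoothing factors here are all hypothetical:

```python
import shutil

import h5py
from scipy.ndimage import gaussian_filter

# Hypothetical paths and dataset names: adjust to the real bias
# correction files.
src_file = './bias_correction.h5'
dset_names = ['ghi_scalar', 'ghi_adder']
smoothing_factors = [0.5, 1.0, 2.0]

for sigma in smoothing_factors:
    # One smoothed copy of the bias correction file per sigma value
    dst_file = src_file.replace('.h5', f'_smooth_{sigma}.h5')
    shutil.copy(src_file, dst_file)
    with h5py.File(dst_file, 'a') as fh:
        for name in dset_names:
            arr = fh[name][...]
            # Smooth each timestep's spatial field independently
            for idt in range(arr.shape[-1]):
                arr[..., idt] = gaussian_filter(arr[..., idt], sigma,
                                                mode='nearest')
            fh[name][...] = arr
```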
Agreed on the double smoothing, although I'm not sure how much this matters outside the valid domain. I would argue for keeping the smoothing in the transforms, since I don't see a reason to have to rerun bias calc just to change smoothing. Keeping the stored bias correction factors as "true" as possible seems reasonable.
Yeah, you just lose the ability to smooth inside the valid domain vs. outside... That's fine if that's your preference, but I'd say keep both, because the ability to smooth inside vs. outside is important, especially for sup3rcc.
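To make the inside/outside distinction concrete, here is an illustrative sketch of smoothing only the gap-filled (originally NaN) extent while leaving the interior factors untouched; the masking and fill logic here are placeholders, not the repo's implementation:

```python
import numpy as np
from scipy.ndimage import gaussian_filter


def smooth_outside_only(arr, smoothing):
    """Illustrative sketch: smooth only the cells that were NaN before
    fill/extend (outside the valid spatial domain). The mean fill is a
    placeholder for the repo's real fill/extend logic."""
    nan_mask = np.isnan(arr)
    filled = np.where(nan_mask, np.nanmean(arr), arr)
    smoothed = gaussian_filter(filled, smoothing, mode='nearest')
    # Interior keeps the original values; only the extrapolated
    # extent takes the smoothed values.
    return np.where(nan_mask, smoothed, filled)
```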
Force-pushed from a95ccf2 to 5cdef6b
arr_smooth = arr[..., idt]

needs_fill = (np.isnan(arr_smooth).any() and fill_extend
              or smooth_interior > 0)
You should be cautious with these kinds of operations; logical operators have an order-of-operations priority that is not intuitive. It's fine as-is, but I would use parentheses to make the order of operations explicit.
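For reference, `and` binds more tightly than `or` in Python, so the expression above is equivalent to this explicitly parenthesized form (a sketch of the suggested change, not the committed code):

```python
needs_fill = ((np.isnan(arr_smooth).any() and fill_extend)
              or smooth_interior > 0)
```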
Yeah, good call. It's bitten me in the past.