Image sharpening is dependent on image size #418
Comments
@danielmwatkins Have you thought of a specific way to do this? An easy way would be to use the gcd = gcd(width, height) of the image's height and width, and choose
@cpaniaguam That's basically the approach I was thinking about, depending on what gcd stands for in this case. There is likely some best-performing partition size that depends on the spatial scale of the features we are trying to see. We'll want some level of tolerance for the image size not being exactly divisible by the block size. I could also see it being important to make sure that if a segment is all ice, we don't stretch the histogram the same amount as if it were a mix of ice and water. I wonder whether the block-based method sometimes results in discontinuities in the level of sharpening?
@danielmwatkins gcd = greatest common divisor
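(For concreteness, here is what the gcd idea would give for a hypothetical image size; the dimensions below are made up for illustration.)

julia> width, height = 3840, 2160;

julia> gcd(width, height)  # largest tile side length that divides both dimensions evenly
240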
Gotcha. Actually I was thinking more along the lines of saying that the parameter would be something like block_size=200, and we'd have something that estimates the number of blocks that would make the actual block size closest to the desired size.
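(A rough sketch of that estimate, assuming block_size is a target block side length in pixels; estimate_blocks is a hypothetical helper, not existing IceFloeTracker code.)

# Estimate the number of row/column blocks so that the resulting block
# side lengths are as close as possible to a target block_size in pixels.
function estimate_blocks(height, width; block_size=200)
    rblocks = max(1, round(Int, height / block_size))
    cblocks = max(1, round(Int, width / block_size))
    return rblocks, cblocks
end

estimate_blocks(1500, 2200)  # (8, 11): blocks of roughly 188 x 200 pixels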
@cpaniaguam Monica, Minki, and I spent some time looking through the Matlab code. We determined that the Matlab version does the approach I was suggesting: it takes a set pixel dimension, then determines the number of blocks for the adaptive equalization. There’s also a step that calculates entropy within a block to determine whether there is enough variability in pixel brightness to apply the adaptive equalization. I think with these two adjustments we could get rid of a lot of the oversharpening issue.
@danielmwatkins Thanks for this! Could you point to the relevant blocks of code where these operations are performed?
Yes, in the version that's on the Wilhelmus lab git repo, look at the code near line 435 of MASTER.m. It calculates entropy for each tile, then only applies the histogram equalization if the entropy is larger than a threshold. |
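(A minimal Julia sketch of that entropy-gated equalization, assuming Images.jl; the threshold value and the function name are illustrative, not taken from MASTER.m.)

using Images

# Apply histogram equalization to a tile only when its entropy suggests
# enough brightness variability (e.g. a mix of ice and water); leave
# low-variability tiles (e.g. all ice) untouched.
function maybe_equalize(tile; entropy_threshold=4.0)
    entropy(tile) > entropy_threshold || return tile
    return adjust_histogram(tile, Equalization(nbins=256))
end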
@danielmwatkins @mmwilhelmus @hollandjg @mirestrepo @mkkim400

julia> num_pixels_tile = 200;

julia> tile_size = round(Int, sqrt(num_pixels_tile)) # 196 pixels/square is best here
14

julia> num_pixels_tile = 220;

julia> tile_size = round(Int, sqrt(num_pixels_tile)) # 225 pixels is the best fit
15

If the tiles of the chosen size don't cover the whole image, one gets clippings; the package provides these leftover tiles also.

julia> A = rand(5, 5);

julia> tile_size = (2, 2);

julia> tiles = collect(TileIterator(axes(A), tile_size))
3×3 Matrix{Tuple{UnitRange{Int64}, UnitRange{Int64}}}:
 (1:2, 1:2)  (1:2, 3:4)  (1:2, 5:5)
 (3:4, 1:2)  (3:4, 3:4)  (3:4, 5:5)
 (5:5, 1:2)  (5:5, 3:4)  (5:5, 5:5)

Matlab seems to have the same behavior:

>> A = rand(5, 5);
>> mat2tiles(A, [2,2])

ans =

  3×3 cell array

    {2×2 double     }    {2×2 double     }    {2×1 double}
    {2×2 double     }    {2×2 double     }    {2×1 double}
    {[0.5681 0.7043]}    {[0.9093 0.5946]}    {[  0.5030]}

Questions:

- How to handle the potential clippings? @danielmwatkins mentioned the possibility of making the edge tiles slightly bigger than the rest (I am guessing by splicing the adjacent clippings onto the whole ones).
- How to handle the corner tile? Perhaps if the clippings are large enough, nothing needs to be done?
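(One possible way to handle the clippings along the lines suggested above, i.e. splice each leftover onto the adjacent full tile; this is a hypothetical Julia sketch, not existing IceFloeTracker or TiledIteration behavior. Applying it along both the row and column axes would also take care of the corner tile.)

# Tile ranges of side `s` along a dimension of length `n`, with any
# leftover clipping absorbed into the last full tile instead of
# becoming its own undersized tile.
function edge_absorbing_ranges(n, s)
    ranges = [start:min(start + s - 1, n) for start in 1:s:n]
    if length(ranges) > 1 && length(ranges[end]) < s
        ranges[end-1] = first(ranges[end-1]):n  # splice the clipping onto the previous tile
        pop!(ranges)
    end
    return ranges
end

edge_absorbing_ranges(5, 2)  # [1:2, 3:5] -- the 1-pixel clipping joins the last tile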
The default settings of IceFloeTracker.imsharpen include parameters rblocks, cblocks that divide an image into a set number of row and column blocks. For a small image, the default values result in many blocks with few pixels each, and for a large image, the blocks may be so large that the image is undersharpened. For consistent results, rblocks and cblocks should depend on the image size and produce approximately square blocks.
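(To make the size dependence concrete: with a fixed number of blocks, the pixel size of each block scales directly with the image. The image dimensions and the 8×8 grid below are illustrative, not the actual defaults.)

julia> (450, 600) .÷ (8, 8)     # small image, fixed 8x8 grid: 56 x 75 px blocks
(56, 75)

julia> (4500, 6000) .÷ (8, 8)   # 10x larger image, same grid: 562 x 750 px blocks
(562, 750)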