
Handle a corner case in hex_aperture for even and odd npix #111

Closed
mperrin opened this issue Aug 23, 2018 · 9 comments

mperrin commented Aug 23, 2018

Issue by josePhoenix
Thursday Aug 27, 2015 at 18:17 GMT
Originally opened as mperrin/poppy#111


hex_aperture was behaving differently depending on whether npix was even or odd.

The best option we could come up with was to include only pixels entirely enclosed in the hexagon. Now even and odd npix values alike produce the same behavior: the corners on the non-flat sides lack a pixel that would take them to the edge of the array. (In other words, hex_aperture arrays have a zero column or row between the corner and the edge of the array.)
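
As an illustration only (not the PR code linked below), a minimal sketch of the "only pixels entirely enclosed" criterion, assuming a flat-top hexagon with unit circumradius and testing all four corners of each pixel:

```python
import numpy as np

def inside_hexagon(x, y, radius=1.0):
    """True where points (x, y) lie inside a flat-top hexagon of
    circumradius `radius` centered on the origin (vertices on the x axis)."""
    return ((np.abs(y) <= np.sqrt(3) / 2 * radius) &
            (np.abs(y) <= np.sqrt(3) * (radius - np.abs(x))))

def hex_aperture_fully_enclosed(npix=101, radius=1.0):
    """Aperture mask keeping only pixels whose entire area lies inside the
    hexagon, by requiring all four pixel corners to pass the inside test."""
    step = 2 * radius / npix                       # pixel size: diameter / npix
    centers = (np.arange(npix) - (npix - 1) / 2) * step
    y, x = np.meshgrid(centers, centers, indexing="ij")
    half = step / 2
    mask = np.ones((npix, npix), dtype=bool)
    for dx in (-half, half):
        for dy in (-half, half):
            mask &= inside_hexagon(x + dx, y + dy, radius)
    return mask.astype(float)
```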


josePhoenix included the following code: https://github.com/mperrin/poppy/pull/111/commits


mperrin commented Aug 23, 2018

Comment by mperrin
Friday Aug 28, 2015 at 04:24 GMT


Can you explain a bit more the motivation / use case behind this one? Under what circumstances does the presence or absence of zero rows or columns matter in this way?

I'm both trying to understand more clearly exactly what trouble the current code was causing, and also wondering whether this needs to be handled consistently across other aperture shapes. For instance, I don't think you will always get zero rows for circular or square apertures.

What would be the correct behavior in the case of having subpixel antialiased values instead?

mperrin added this to the 0.4 milestone Aug 23, 2018

mperrin commented Aug 23, 2018

Comment by josePhoenix
Friday Aug 28, 2015 at 13:14 GMT


Alex Greenbaum pointed out that the behavior was inconsistent depending on the parity of npix, in that sometimes (for odd npix values greater than about 63) there would be 1.0 values in columns 0 and -1, and other times (for even npix values at all sizes) those two columns would be all zeroes.

The best I could do fiddling with the inequalities in hex_aperture was to make the behavior consistently like the even-npix case, but after sleeping on it I thought of another way to rearrange them which might ensure they reach the edge (instead of using less than or equal to, use strictly greater than and invert the sense of the mask to identify pixels entirely outside).
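
One possible reading of that rearrangement, as a minimal sketch on the same assumed flat-top unit hexagon (not the actual hex_aperture internals): flag only pixels that are entirely outside, then invert, so partially covered corner pixels survive out to the edge of the array.

```python
import numpy as np

def hex_aperture_keep_partial(npix=101, radius=1.0):
    """Drop a pixel only when all four of its corners fall strictly outside
    the hexagon, then invert, so partially covered pixels stay in the mask.
    (Sketch only; it ignores the rare case of a hexagon vertex landing inside
    a pixel whose four corners are all outside.)"""
    def outside(px, py):
        return ((np.abs(py) > np.sqrt(3) / 2 * radius) |
                (np.abs(py) > np.sqrt(3) * (radius - np.abs(px))))

    step = 2 * radius / npix
    centers = (np.arange(npix) - (npix - 1) / 2) * step
    y, x = np.meshgrid(centers, centers, indexing="ij")
    half = step / 2
    entirely_outside = np.ones((npix, npix), dtype=bool)
    for dx in (-half, half):
        for dy in (-half, half):
            entirely_outside &= outside(x + dx, y + dy)
    return (~entirely_outside).astype(float)
```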

I'm not actually sure what issues Alex had with it that prompted her report.


mperrin commented Aug 23, 2018

Comment by josePhoenix
Friday Aug 28, 2015 at 13:24 GMT


What would be the correct behavior in the case of having subpixel antialiased values instead?

I think it would be closer to the behavior in this PR, since the existing behavior for odd npix values is essentially "rounding up" on corner pixels that should be some shade of gray.
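
For comparison, fractional ("gray") pixel values could be approximated by supersampling each pixel and averaging the inside/outside test; this is only an illustration of the idea, not how poppy computes antialiased apertures:

```python
import numpy as np

def hex_aperture_gray(npix=100, radius=1.0, oversample=8):
    """Approximate each pixel's fractional coverage of a flat-top hexagon by
    evaluating the inside test on an oversampled grid and block-averaging."""
    n = npix * oversample
    step = 2 * radius / n
    centers = (np.arange(n) - (n - 1) / 2) * step
    y, x = np.meshgrid(centers, centers, indexing="ij")
    inside = ((np.abs(y) <= np.sqrt(3) / 2 * radius) &
              (np.abs(y) <= np.sqrt(3) * (radius - np.abs(x))))
    # Average each oversample x oversample block down to one output pixel.
    return inside.reshape(npix, oversample, npix, oversample).mean(axis=(1, 3))
```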


mperrin commented Aug 23, 2018

Comment by mperrin
Monday Nov 16, 2015 at 17:13 GMT


For the record, I have been ignoring / leaving aside this PR for now, because I still don't entirely understand the root issue or the desired behavior. So I am not sure how to assess if this fix is the right one, nor whether other apertures would need similar fixes.

Perhaps this will all just be obviated once we have properly gray-scaled apertures?


mperrin commented Aug 23, 2018

Comment by josePhoenix
Monday Nov 16, 2015 at 17:29 GMT


Basically it's a question of "should pixels not entirely within the aperture be marked as having 100% transmission", from what I remember. The current behavior (I think) erroneously marks some corner pixels as being entirely within the hex aperture.


mperrin commented Aug 23, 2018

Comment by agreenbaum
Monday Nov 16, 2015 at 20:38 GMT


The issue I encountered (calling zernike.hex_aperture) was that with an odd number of pixels the aperture extended to the edge of the array, while with an even number it was padded by a column of zeros. It seems that in one case I got an aperture of the full npix size and in the other a slightly undersized one; does that make sense? I thought this might be a (small) problem in defining the support for generating zernikes/hexikes.
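
A quick check along these lines might look like the sketch below (it assumes zernike.hex_aperture accepts an npix keyword and returns a 2D array with zeros or NaNs outside the aperture; behavior may differ between poppy versions):

```python
import numpy as np
from poppy import zernike

# Compare the outermost columns of the aperture for even vs. odd npix.
for npix in (100, 101):
    aperture = zernike.hex_aperture(npix=npix)
    edge_max = np.nanmax(aperture[:, [0, -1]])
    print(f"npix={npix}: max value in first/last columns = {edge_max}")
```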


mperrin commented Aug 23, 2018

Comment by mperrin
Tuesday Nov 17, 2015 at 17:36 GMT


Oh, somehow I missed that this was about zernike.hex_aperture as opposed to optics.HexagonAperture.

I've just done some testing and confirmed that those two functions do not give consistent outputs. The hexagon is often about 1 pixel smaller as returned from hex_aperture. The definitions of the hexagon aperture in terms of the rectangle and left and right triangles are identical between the two; both use <= for the comparisons.

However, it turns out they are using different coordinate calculations for the pixel x,y values. poppy.Wavefront (and by extension all optical element classes) takes the diameter as the total distance across the array: if you have a 1 meter radius and 10 pixels, then it is 0.2 meters per pixel, and the outermost pixel centers are at 0.9 m from the center of the array (i.e. that pixel extends across 0.8 - 1.0 meters from the center). hex_aperture takes the radius (implicitly 1) as the distance to the outermost pixel centers, so those centers sit at exactly 1.0 m, and each pixel is then 2/9 ≈ 0.222 meters across.
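
For the npix = 10, radius = 1 m example above, the two conventions give pixel-center coordinates like this (a sketch with hypothetical variable names):

```python
import numpy as np

npix, radius = 10, 1.0

# Wavefront-style: the diameter (2 * radius) spans the full array width,
# so pixels are 0.2 m wide and the outermost pixel centers sit at +/- 0.9 m.
step = 2 * radius / npix
centers_wavefront = (np.arange(npix) - (npix - 1) / 2) * step
print(centers_wavefront)   # [-0.9 -0.7 ... 0.7 0.9]

# hex_aperture-style (as described above): the radius is the distance to the
# outermost pixel centers, so the spacing is 2 / (npix - 1) = 0.222... m.
centers_hex = np.linspace(-radius, radius, npix)
print(centers_hex)         # [-1. -0.778 ... 0.778 1.]
```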

So I think the fix here is going to be to make the coordinates inside hex_aperture consistent with the convention used by poppy.Wavefront.image_coordinates.


mperrin commented Aug 23, 2018

Comment by mperrin
Tuesday Nov 17, 2015 at 17:52 GMT


Addressed by 36395f0 but I (again) put the wrong issue number in the commit message.


mperrin commented Aug 23, 2018

Comment by mperrin
Tuesday Nov 17, 2015 at 17:58 GMT


I think the fix I just committed in 36395f0 is the proper way of addressing this. Alex, can you take a look? In part it has the opposite effect of Joseph's changes, in that the first column of pixels will now always have a nonzero pixel rather than always a zero pixel, but the end result is the same: more consistent output from the function. And now there is a test case that enforces consistency between hex_aperture and HexagonAperture with hexagon side length = 1.
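
A consistency check along those lines might look like the sketch below; the HexagonAperture(side=1) keyword and the sample() call are assumptions about the poppy API rather than a copy of the actual test in 36395f0:

```python
import numpy as np
import poppy
from poppy import zernike

npix = 128
hex_from_zernike = np.nan_to_num(zernike.hex_aperture(npix=npix))

# Sample a HexagonAperture with side length 1 onto a grid of the same size.
# The circumradius of a hexagon equals its side, so a 2 m grid just spans it.
optic = poppy.HexagonAperture(side=1.0)
hex_from_optic = optic.sample(npix=npix, grid_size=2.0, what='amplitude')

assert np.allclose(hex_from_zernike, hex_from_optic), \
    "zernike.hex_aperture and HexagonAperture disagree"
```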

Given that, I am inclined to set aside the code changes in this PR. Does that sound OK?
