recon image has shape (width+1,height+1) #34

Closed
stggh opened this issue Nov 25, 2015 · 23 comments

stggh commented Nov 25, 2015

$ ipython
Python 2.7.10 (default, Oct 14 2015, 16:09:02) 

IPython 2.3.0 -- An enhanced Interactive Python.

In [1]: import numpy as np

In [2]: from abel.basex import BASEX

In [3]: im = np.loadtxt("O-ANU2048.txt")

In [4]: (n,m) = np.shape(im)

In [5]: print (n,m)
(2048, 2048)

In [6]: recon,speed = BASEX(im,center=(int(n/2),int(m/2)),n=n,basis_dir='./',verbose=True,calc_speeds=True)
Loading basis sets...           
0.72 seconds
Reconstructing image...         
16.19 seconds
Generating speed distribution...
2.07 seconds

In [7]: print (np.shape(recon))
(2049, 2049)

PS: Great work guys!

@DhrubajyotiDas (Member) commented:

Could you retry with an image whose width is an odd integer, either by cropping or by padding the original image with zeros?
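For a 2048x2048 image like the one in this issue, either option is a one-liner in NumPy. A minimal sketch (the file name is taken from the session above):

import numpy as np

im = np.loadtxt("O-ANU2048.txt")          # 2048 x 2048, as in the session above

# Option 1: crop one row and one column -> 2047 x 2047 (odd)
im_cropped = im[:-1, :-1]

# Option 2: pad one row and one column of zeros -> 2049 x 2049 (odd)
im_padded = np.pad(im, ((0, 1), (0, 1)), mode='constant')

print(im_cropped.shape, im_padded.shape)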

rth added this to the Version 0.6 milestone Nov 25, 2015

rth commented Nov 25, 2015

Hi Steve,

thanks for pointing this out.

As far as I understand, the current BASEX implementation only supports odd n, and will raise an error if that is not the case. When given an image with an odd n, the BASEX transform produces a result of the same shape, as checked by this test. What must be happening here is that the centering function pads the original image to make it odd, which results in a transform of shape (2049, 2049).

What I'm wondering is whether there is actually a physical reason why we can't generate basis sets for even n. If so, we should probably ensure that the centering function produces a result of shape (n x n), and raise an error in the BASEX function itself (not just in the generate_basis_sets routine) when n is not odd. What do you think?

rth added the bug label Nov 25, 2015

stggh commented Nov 26, 2015

Thanks for the very fast response. The odd size image is a good point. I have trimmed the image by one pixel and the new basis calculation is in progress.

Unfortunately, I am unfamiliar with the basex algorithm and do not know any reason for the odd pixel image size.

My group uses the Hansen and Law method [J. Opt. Soc. Am. A 2, 510-520 (1985)]. I have written a version that could be included as an alternative to the basex method (and onion peeling). I am just waiting on approval for the upload from my colleague. The agreement is excellent, apart from a small displacement in the radial position of the speed profile.
[figure: hansenlaw comparison showing the agreement described above]

[edit] I should clarify that your great work in converting basex.exe to Python has made basex accessible for my larger velocity-map images; basex.exe was previously restricted to 1000x1000 pixels and did not allow the reconstructed image pixel values to be negative.

@DanHickstein (Member) commented:

Oh, this is weird, previously we would always produce an image that had an even size! We must have switched something when we were messing with the requirements for n and nbf.

I don't see a conceptual reason why there should be any odd/even requirement for n, though I would expect that having nbf be the same odd/even character as n makes the most sense.

For computing the speeds, it makes the most sense not to have a "zero pixel", which we would have if n is odd. (I think that for the speed calculation to be strictly correct, we should be multiplying by a radial factor to take the volume into account. We currently do a naive angular integration of the 2D image that does not take into account the real 3D shape.)
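For what it's worth, here is a rough sketch of what a radius-weighted angular integration could look like. This is not the current PyAbel code; the use of the spherical volume element r**2 * sin(theta), with theta measured from the vertical symmetry axis, is my assumption about what "taking the volume into account" would mean here:

import numpy as np

def speed_distribution(recon, origin):
    # recon  : 2D reconstructed slice of the 3D distribution
    # origin : (row, col) of the symmetry centre
    ny, nx = recon.shape
    y = np.arange(ny)[:, None] - origin[0]
    x = np.arange(nx)[None, :] - origin[1]
    r = np.sqrt(x**2 + y**2)
    # sin(theta) = distance from the vertical symmetry axis divided by r
    with np.errstate(invalid='ignore', divide='ignore'):
        sin_theta = np.abs(x) / r
    sin_theta[r == 0] = 0.0
    # binning pixels into annuli already supplies one factor of r, so
    # weighting each pixel by r*sin(theta) gives ~ r**2 sin(theta) overall
    weights = recon * r * sin_theta
    bins = np.arange(int(r.max()) + 2)
    speeds, _ = np.histogram(r.ravel(), bins=bins, weights=weights.ravel())
    return bins[:-1], speeds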

We would be very happy to have the Hansen and Law iterative method incorporated into PyAbel. It seems to produce a bit less noise (especially centerline noise) than BASEX. Does it take a long time? I recall using one of Vrakking's iterative methods several years ago, and the results were good, but it took several minutes even for 1000x1000 pixels. I suppose downsampling the image to a smaller size could improve the speed...

Either way, it would still be fun to experiment with! Having more methods to compare will just make things better.

@DanHickstein (Member) commented:

Oh, and thanks for pointing out that basex.exe cannot handle negative values. I did not realize this. We will need to point this out (along with the ability to use large images) if we write a "Features of PyAbel" section! :)

One more thought about the 2048->2049 bug: it could just be the centering algorithm. If the image dimension is an even number, then it's not actually clear where the center should be. For example, if your image is 2 pixels wide, then the center is actually between pixel 0 and pixel 1.

What happens if you set the center to be 1025 instead of 1024 as you are currently doing?


stggh commented Nov 26, 2015

Excellent, open source wins.

The Hansen and Law algorithm core is ~13 lines of code. The inversion of a 2048x2048 pixel image takes 42 s compared with 16 s for basex (with an already constructed basis), so roughly half the basex speed, but very usable. It is great to have other algorithms to use and compare, which helps track down implementation faults, possibly centring in my case.

Good comments about the definition of the image centre. While on this topic, IMO centring, and the vital image circularization, should be external to the inversion algorithms; otherwise each new inversion algorithm will need to duplicate the centre-offset code.

My images are centred by comparing the left vs right and top vs bottom radial profiles, and are self-consistent with my own code (of course). For PyAbel I played with different centre offsets, with the result that the spectral features broadened, so the image centre appears to be consistent with PyAbel.

For images with an even number of pixels, the centre is not a pixel but the grid point between pixels. It appears to be more complicated for images with an odd number of pixels.
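Purely to make the two conventions concrete (illustrative index arithmetic only, not PyAbel code):

# Odd width: the centre coincides with a pixel.
n_odd = 2047
centre_odd = (n_odd - 1) // 2       # pixel index 1023

# Even width: the centre falls on the grid line between two pixels.
n_even = 2048
centre_even = (n_even - 1) / 2.0    # 1023.5, between pixels 1023 and 1024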

@DanHickstein (Member) commented:

Thanks for the details Steve. 42 seconds is not so bad for a huge image like that. The reduction in the centerline noise could be worth it depending on the situation. So yes, it would be great to incorporate it.

The centering code actually lives in abel.tools, and you can use abel.basex.basex_transform if you just want the transform without any centering. The abel.BASEX function simply wraps the centering, the basex transform, and the angular integration (speeds generation) into a single function for people who want a one-liner for processing their data.
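For example, something along these lines should work; the exact call signature of abel.basex.basex_transform may differ from what is sketched here, so treat it as an outline rather than documentation:

import numpy as np
import abel
from abel.basex import BASEX

im = np.loadtxt("O-ANU2048.txt")
im_odd = im[:-1, :-1]                  # make the shape odd (centring questions aside)

# transform only: no centering, no speed integration
recon = abel.basex.basex_transform(im_odd)

# or the all-in-one wrapper, as in the session at the top of this issue
recon, speed = BASEX(im_odd, center=(1023, 1023), n=2047,
                     basis_dir='./', verbose=True, calc_speeds=True)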

@DanHickstein (Member) commented:

So, here is a minimal example that reproduces a problem with the centering function:

import numpy as np
import matplotlib.pyplot as plt
import abel

# 5x5 test image with a single bright pixel at the exact centre
data = np.zeros((5, 5))
data[2, 2] = 1

fig = plt.figure(figsize=(12, 6))
ax1 = plt.subplot(121)
ax2 = plt.subplot(122)

# centre the image on pixel (2, 2); ideally this should leave it unchanged
d2 = abel.tools.center_image(data, center=(2, 2), n=5)

ax1.imshow(data, interpolation='nearest')
ax2.imshow(d2, interpolation='nearest')

ax1.set_title('Raw data')
ax2.set_title('Center=(2,2)')

plt.show()

[figure: 'Raw data' vs 'Center=(2,2)' output of the script above]

In the case of an odd-sized image, the center is clearly defined. But for an even-sized image, we need to make a formal decision about whether the center is at the left or right edge of the specified pixel.


stggh commented Nov 26, 2015

Hah, as an even-size image person, I tend to think that the even image centre is clearly defined, as the bottom left corner of the pixel (or the grid). This also facilitates the use of Python slicing functions.
In reality the pixels are irrelevant. The image centre is defined with respect to the centre of symmetry of the measurement.

[May not be on topic]
I have not looked at the centre issue in detail, but the basex algorithm appears to do a great job for even-size images, so excluding even-sized images from the basex inversion is too stringent a condition.


rth commented Nov 26, 2015

To add to the discussion, independently of the centering function, it looks like BASEX currently doesn't handle even image sizes well.

For instance, here is a reconstruction of a Gaussian with n=21, nbf=11, which is correct except at the origin:
[figure 1: reconstructed Gaussian, n=21, nbf=11]
Now if I disable all the checks in the basis generation and force it to run with n=20, nbf=10, I get the following image, which is wrong:
[figure 2: reconstructed Gaussian, n=20, nbf=10]

Therefore, even if we agree on a way to fix the centering function (which we should at some point), the BASEX algorithm will still produce incorrect results for even sizes (and somebody would need to redo the math to understand why).

Well, essentially this is only a problem when analyzing small images (e.g. 20x20); for, say, 1024x1024 images, setting the center with +/- 1 px precision should not matter (particularly as the noise is amplified on the axis), no? Can't we just keep the processing the way it is now (i.e. a centering function that produces odd shapes, processed with abel) and then simply remove one pixel row/column to get back to the size of the original image? That way we would support both odd and even images, and although it is neither ideal nor very clean, for all practical applications it shouldn't be an issue, don't you think?

EDIT: sorry, yes, as pointed out by @stggh, for even n the BASEX transform appears to be correct but shifted by 1/2 pixel to the right (I initially mistook the red curve for the blue one). So either way of dealing with it should work.
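A rough sketch of that workaround as a hypothetical wrapper (the name basex_keep_shape and the zero padding at the edge are my own choices, not anything in the library; whether edge padding is physically acceptable depends on the data):

import numpy as np
from abel.basex import BASEX

def basex_keep_shape(im, **kwargs):
    # pad even dimensions with one row/column of zeros so BASEX sees an odd image
    n0, m0 = im.shape
    pad_n = 1 if n0 % 2 == 0 else 0
    pad_m = 1 if m0 % 2 == 0 else 0
    im_odd = np.pad(im, ((0, pad_n), (0, pad_m)), mode='constant')

    n, m = im_odd.shape
    recon, speed = BASEX(im_odd, center=(n // 2, m // 2), n=n,
                         calc_speeds=True, **kwargs)

    # crop the padded row/column so the output matches the input shape
    return recon[:n0, :m0], speed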


stggh commented Nov 26, 2015

The even basex result appears to be correct, apart from the displacement off-centre. I agree that "for all practical applications this shouldn't be an issue". Definitely, allow basex processing of even images.

@DanHickstein (Member) commented:

@stggh, when you say that you define the center as the bottom left of the pixel, you have the origin at the bottom left, correct? So, the center is at the "lower index" side of the pixel in both the horizontal and vertical directions. This definition makes sense to me, so I would recommend that in the case of even images, we adopt the convention that the center is located at the low-index side of the defined pixel. This means that the defined center has different meanings in the case of odd and even sized images, but this seems unavoidable and as long as we document it, users should not get too confused.

@rth, in the short run it looks like we should slide the even-sized image transform over a pixel in order to get it to line up. In the long run, we probably need to re-write the basex code for better readability and agreement with some sort of analytical math, either the Dribinski Rev. Sci. Instrum. paper or something that we write ourselves. We can put that off for a while.


stggh commented Nov 28, 2015

[figure: centre definition sketch]
Agreed, for the even case, although counting pixels from the bottom or the top depends on the image format definition, and is not important for (velocity-map?) images, which are z-symmetric.

Having not previously encountered an odd-pixel image, I find that the even centre definition makes perfect sense.

I have just realised that the hansenlaw recursion will work with odd images; it is only the image slicing that breaks. In principle, HL does not care about pixels: each row is just a 1D grid. However, I prefer the (more common?) grid definition of the centre. I also think that we should somehow all use the same centre definition, otherwise confusion will reign.

[edit] Reduced image size (50 dpi).
On second thoughts, our definition is OK: centre = centre of the image relative to the physical boundary.

@DanHickstein (Member) commented:

Now that I think about it more, the basex output for the even-sized image is not really correct. It contains a "central pixel", which an even-sized image should not have. So our basex algorithm only seems to make sense for an odd-sized image. I think that @rth is correct that the "quick fix" is to just make the original image odd-sized. In this case, it seems like the most honest thing to do would be to also return an odd-sized image to the user.


stggh commented Nov 29, 2015

This is tricky. I would prefer that recon have the same size as the input image. Your plan for even images is to add one row and column AND shift the centre by 1/2 a pixel, then apply the basex inversion, returning an odd-sized image. I just want to check: is the centre also relocated? Thanks.

@DhrubajyotiDas (Member) commented:

@DanHickstein would it make sense to eliminate the "first" basis function (the k = 0 case, Fig. 1 of the Dribinski et al. paper) in the case of even-sized images? That way, there wouldn't be a need for a central pixel.

@DanHickstein (Member) commented:

Hmmmm, okay, maybe I need to go back and read the paper and figure out what's going on with the basis set generation. I think that you're correct that we need to eliminate the center basis function in the case of even-sized images. I have some travel and other obligations for the next few days, so if one of you has time to think about this, that would be great. Otherwise I can work more on this later this week.

@DhrubajyotiDas (Member) commented:

@DanHickstein another possibility to think about is creating a "fake" column/row/both in even-sized images (possibly by taking the average of the two columns/rows on either side of the "real" center), inserting it into the even raw_data, and doing the transform.

We could then return the transform as-is to the user, or remove the added fake column/row/both and return a transform of the same size as raw_data.
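A quick sketch of what that insertion could look like (a hypothetical helper, not library code; np.delete with the same index would remove the fake row/column again after the transform):

import numpy as np

def insert_fake_centre(im):
    # For an even number of columns, insert a column that is the average
    # of the two columns straddling the geometric centre; likewise for rows.
    out = im
    ny, nx = out.shape
    if nx % 2 == 0:
        c = nx // 2
        out = np.insert(out, c, 0.5 * (out[:, c - 1] + out[:, c]), axis=1)
    ny, nx = out.shape
    if ny % 2 == 0:
        r = ny // 2
        out = np.insert(out, r, 0.5 * (out[r - 1, :] + out[r, :]), axis=0)
    return out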


stggh commented Nov 30, 2015

@DhrubajyotiDas: the image centre would also need to be shifted, which was my query.

Edit: Oops! I misinterpreted your comment; it is explained by @rth's Approach 3 below. My apologies.


rth commented Nov 30, 2015

So just to summarize the conversation above.

Problem 1. I think we all agree on the definition of the center of an image, as illustrated in @stggh's comment. The centering function should be fixed properly in PRs #38 and #37. This can be done in v0.6.

Problem 2. The BASEX transform currently does not work with even image sizes.

  • Approach 1. Return an odd reconstruction even if the input image is even => I think that's not ideal (shapes should be preserved), and besides, since the PRs "Add unit tests for the centering function" #38 and "Fixed centering function" #37 fix the centering function, it will no longer produce odd images for an even input.
  • Approach 2. Comment out the odd-size checks in BASEX, pass the even image through, then shift the output by 1/2 pixel (i.e. interpolate the result on a new grid).
  • Approach 3. If the image is even, add a fake central row, pass the image to BASEX, then remove the added row (@DhrubajyotiDas).
  • Approach 4. If the image has an even shape, eliminate the "first" basis function. Read the paper, etc.

I think that the image size should not be changed by the BASEX transform. If we want a quick solution, I would rather go with Approach 3, as interpolating on a new grid to shift by 1/2 pixel (Approach 2) would also have other undesired side effects (not sure what happens to the noise, etc.). We could do this in v0.6. The clean solution would be Approach 4, which would require reading the paper in more detail. Personally, I won't be able to look into it this week.

So maybe we can implement Approach 3, which won't be very hard, and finally release v0.6? Then we can work on fixing this problem more thoroughly in v0.7 with Approach 4?


stggh commented Nov 30, 2015

Excellent summary and solution. I agree. The resulting speed data may also need correction?

@DanHickstein (Member) commented:

Today, on the call we arrived at the following conclusions about even vs odd images:

  1. Each transform function should just complete the transform for odd images. (This works for all of the transform methods except for the onion peeling method, but it's possible that it can be modified.)

  2. The abel.iabel/abel.fabel methods will allow for centering with a center specified by the central pixel, which generates an odd image. The array size returned by the function will be odd, so if an even image is passed, the image shape may not be preserved.

  3. We will provide code that allows users who have even-sized images (say from a CCD camera) to re-interpolate them to odd-sized images through a half-pixel shift. Currently, this is implemented in the tools.vmi.find_image_center_by_slice() function, though I wonder if it would be better to incorporate this directly into the centering function. We could allow users to specify the center in terms of decimal pixels and re-interpolate the image by a fraction of a pixel, ultimately generating an odd-sized image (a sketch of the half-pixel shift follows below).
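To make item 3 concrete, here is an illustrative sketch of the half-pixel re-interpolation idea (this is not the find_image_center_by_slice() implementation; it assumes a square, even-sized image whose centre of symmetry sits exactly between the four central pixels, and it uses scipy's spline-based shift):

import numpy as np
from scipy.ndimage import shift

def even_to_odd(im):
    # shift the image content by half a pixel so the between-pixel centre
    # lands on a pixel, then drop the first row and column -> odd shape
    assert im.shape[0] == im.shape[1] and im.shape[0] % 2 == 0
    shifted = shift(im, (0.5, 0.5), order=3, mode='nearest')
    return shifted[1:, 1:]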

@DanHickstein (Member) commented:

We have now implemented an abel.tools.center.center_image() function that uses interpolation to always produce an odd image. So, I think that we can close this issue.
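For anyone landing on this issue later, usage should look roughly like the following; the exact arguments accepted by center_image() are not spelled out here, so check the current documentation:

import numpy as np
import abel

im = np.loadtxt("O-ANU2048.txt")                 # 2048 x 2048
im_odd = abel.tools.center.center_image(im)      # interpolates to an odd shape
print(im_odd.shape)                              # both dimensions now odd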
