antsApplyTransformsToPoints loads CSV points as nan #733

Closed
Borda opened this issue Mar 10, 2019 · 30 comments

Borda commented Mar 10, 2019

Describe the bug
I have some difficulty with transforming points according to the estimated registration transform.

To Reproduce
Steps to reproduce the behaviour:

  1. using the binary ANTs-1.9.v4-Linux.tar.gz from https://sourceforge.net/projects/advants/files/ANTS/ANTS_Latest/
  2. the image registration works fine, as well as warping points
  3. the input CSV file
x,y
62,290
75,408
78,470
119,226
125,321
...
  4. warping command
/Applications/ANTs-1.9.v4-Linux/bin/antsApplyTransformsToPoints     --dimensionality 2     --input /results/BmANTs/0/Rat_Kidney_PanCytokeratin.csv     --output /results/BmANTs/0/Rat_Kidney_PanCytokeratin.csv     --transform [/results/BmANTs/0/trans0GenericAffine.mat, 1]  -t /results/BmANTs/0/trans1InverseWarp.nii.gz
  5. logging message
Input csv file: /results/BmANTs/0/Rat_Kidney_PanCytokeratin.csv
=============================================================================
The composite transform is comprised of the following transforms (in order): 
  1. /results/BmANTs/0/trans1InverseWarp.nii.gz (type = DisplacementFieldTransform)
  2. inverse of /results/BmANTs/0/trans0GenericAffine.mat (type = AffineTransform)
=============================================================================
 point-in = [nan, nan] point-out = [nan, nan]
 point-in = [nan, nan] point-out = [nan, nan]
 point-in = [nan, nan] point-out = [nan, nan]
 point-in = [nan, nan] point-out = [nan, nan]
 point-in = [nan, nan] point-out = [nan, nan]
...

Expected behavior
I am wondering why the loaded CSV has only NaN values as input. What is the supported CSV format?

stnava commented Mar 10, 2019 via email

Borda commented Mar 10, 2019

Thanks for your fast reply. I have tried the sample code from the repo:
chicken-master$ ~/Applications/ANTs-1.9.v4-Linux/bin/antsApplyTransformsToPoints -d 2 -i ./data/chicken-3-ref.csv -o testq2.csv -t [ data/chicken3to4-ref.mat , 1]
and got very similar results

Input csv file: ./data/chicken-3-ref.csv
=============================================================================
The composite transform is comprised of the following transforms (in order): 
  1. inverse of data/chicken3to4-ref.mat (type = AffineTransform)
=============================================================================
 point-in = [nan, nan] point-out = [nan, nan]
 point-in = [nan, nan] point-out = [nan, nan]
Output warped points to csv file: testq2.csv

any idea what I am doing wrong?

stnava commented Mar 10, 2019 via email

Borda closed this as completed Mar 11, 2019
Borda commented Mar 11, 2019

@stnava could you please extend the example with points warping, in particular the transformation order?
For warping points from the moving frame to the fixed frame we need to use the inverse transform (as opposed to warping the moving image to the fixed frame, where we use the direct transform), right?
Then the notation [ trans0GenericAffine.mat, 1 ] denotes the inverse, right?

My case with registration:

/antsbin/bin/antsRegistration
     --dimensionality 2
     --initial-moving-transform [ /results/BmANTs/1/Izd2-29-041-w35-He.nii, /results/BmANTs/1/Izd2-29-041-w35-proSPC.nii , 1]
     --metric mattes[ /results/BmANTs/1/Izd2-29-041-w35-He.nii, /results/BmANTs/1/Izd2-29-041-w35-proSPC.nii, 1 , 32, regular, 0.2 ]
     --transform affine[ 0.1 ]
     --convergence [500x500x0,1.e-6,20]
     --smoothing-sigmas 4x2x1mm
     --shrink-factors 3x2x1
     --use-estimate-learning-rate-once 1
     --metric cc[ /results/BmANTs/1/Izd2-29-041-w35-He.nii, /results/BmANTs/1/Izd2-29-041-w35-proSPC.nii, 1 , 2 ]
     --transform SyN[ .1, 3, 0.0 ]
     --convergence [ 50x50x0,0,5 ]
     --smoothing-sigmas 1x0.5x0mm
     --shrink-factors 4x2x1 -l 1 -u 1 -z 1
     --output [/results/BmANTs/1/trans]

and warping the image and points

/antsbin/bin/antsApplyTransforms
     --dimensionality 2
     --input /results/BmANTs/1/Izd2-29-041-w35-proSPC.nii
     --output /results/BmANTs/1/Izd2-29-041-w35-proSPC.nii
     --reference-image /results/BmANTs/1/Izd2-29-041-w35-He.nii
     --transform /results/BmANTs/1/trans1Warp.nii.gz -t /results/BmANTs/1/trans0GenericAffine.mat
     --interpolation Linear
/antsbin/bin/antsApplyTransformsToPoints
     --dimensionality 2
     --input /results/BmANTs/1/Izd2-29-041-w35-proSPC.csv
     --output /results/BmANTs/1/Izd2-29-041-w35-proSPC.csv 
     --transform [ /results/BmANTs/1/trans0GenericAffine.mat , 1] -t /results/BmANTs/1/trans1InverseWarp.nii.gz

@ntustison

Hi @Borda ,

Try running the runthis.sh script. It contains calls to both antsApplyTransforms and antsApplyTransformsToPoints demonstrating the proper order for both calls.

Borda mentioned this issue Mar 11, 2019
Borda commented Mar 11, 2019

The script is fine for showing the capabilities, but it is not very educative since it does not describe the options or whether they are the only ones... It seems that the input image can also be a JPG, but the output can only be NIfTI.
I have changed the line https://github.com/stnava/chicken/blob/master/runthis.sh#L42, but for JPG it does not create any file and for PNG it creates an empty file.

cookpa commented Mar 11, 2019

/antsbin/bin/antsApplyTransforms
--dimensionality 2
--input /results/BmANTs/1/Izd2-29-041-w35-proSPC.nii
--output /results/BmANTs/1/Izd2-29-041-w35-proSPC.nii
--reference-image /results/BmANTs/1/Izd2-29-041-w35-He.nii
--transform /results/BmANTs/1/trans1Warp.nii.gz -t /results/BmANTs/1/trans0GenericAffine.mat
--interpolation Linear
/antsbin/bin/antsApplyTransformsToPoints
--dimensionality 2
--input /results/BmANTs/1/Izd2-29-041-w35-proSPC.csv
--output /results/BmANTs/1/Izd2-29-041-w35-proSPC.csv
--transform [ /results/BmANTs/1/trans0GenericAffine.mat , 1] -t /results/BmANTs/1/trans1InverseWarp.nii.gz

This looks correct to me, except that you are overwriting the input with the output, which I would not recommend.

The input points should be in ITK physical space and in the format "x,y,z,t". Since your data is 2D, the last two dimensions here would be zero.

If you open your image in ITK-SNAP, you can go to Tools -> Image Information, which will show you the ITK coordinates of a point in the image.
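
For reference, a minimal sketch of building such a CSV from pixel coordinates, assuming SimpleITK is available (it is not part of this thread's toolchain); the pixel values come from the CSV at the top of this issue, the image name is taken from the registration call below, and the output file name is made up:

    import csv
    import SimpleITK as sitk

    # Image the points were picked on.
    img = sitk.ReadImage("Rat_Kidney_PanCytokeratin.nii")
    pixel_points = [(62, 290), (75, 408), (78, 470)]  # (x, y) pixel indices

    with open("Rat_Kidney_PanCytokeratin_points.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["x", "y", "z", "t"])  # the header layout described above
        for px, py in pixel_points:
            # Pad the index in case the .nii is read as a 3D image with a single slice.
            idx = [float(px), float(py)] + [0.0] * (img.GetDimension() - 2)
            # Convert pixel index -> ITK physical (LPS) point; handles origin/spacing/direction.
            phys = img.TransformContinuousIndexToPhysicalPoint(idx)
            writer.writerow([phys[0], phys[1], 0, 0])  # 2D data: z and t are zero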

stnava commented Mar 11, 2019 via email

Borda commented Mar 11, 2019

I played with ANTsPy, but unfortunately it requires CMake 3.11 and the latest version for Ubuntu is 3.10, so I am not able to run it on the university computation grid with my restricted access... :(

gdevenyi commented Mar 11, 2019

Upgrading CMake is super easy; they have a bootstrap build, or you can download a pre-compiled binary.

There's no need to install into the "system"; you can just extract it and add the directory to your PATH.

https://cmake.org/install/

Borda commented Mar 11, 2019

Yes, I have noticed that the X,Y coordinates are swapped compared to the "standard" image coordinates in matplotlib.
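
A minimal illustration of that axis-order difference, assuming a numpy/matplotlib workflow (the array shape and values are made up, the pixel taken from the CSV above):

    import numpy as np

    img = np.zeros((600, 400))   # numpy/matplotlib index order is img[row, col], i.e. img[y, x]
    row, col = 290, 62           # a pixel position as (row, col)
    point_xy = (col, row)        # ITK/ANTs point order is (x, y), so the pair is swapped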

Borda commented Mar 11, 2019

apt install cmake --upgrade installs 3.10 on Ubuntu 18.04, and if I run pip install git+https:... there is no trivial way of pointing it to an alternative/local cmake, which could simply be downloaded...

Borda commented Mar 12, 2019

This is the image and points before the transformation:
[fig1]
and after the transformation with these commands #733 (comment):
[fig2]

stnava commented Mar 12, 2019 via email

Borda commented Mar 12, 2019

that using the estimated transformation (direct for the image and inverse for the points) does not give the same positions on the image after warping, compared to the initial state

stnava commented Mar 12, 2019 via email

Borda commented Mar 12, 2019

Thank you, here are the two input images and the landmarks - Rat_Kidney_.zip

cookpa commented Mar 12, 2019

I think the problem is the input points are not in ITK physical space coordinates.

I did

antsRegistrationSyNQuick.sh -d 2 -f Rat_Kidney_HE.nii -m Rat_Kidney_PanCytokeratin.nii -t s -o panToHE

Then placed two points by hand

[screenshot]

x,y,z,t
-827,-218,0,0
-316,-482,0,0

Then

antsApplyTransformsToPoints -d 2 -i testPoints.csv -o testPointsHE.csv -t [ panToHE0GenericAffine.mat, 1 ] -t panToHE1InverseWarp.nii.gz 

which produces

x,y,z,t
-848.452,-218.052,0,0
-325.176,-520.195,0,0

which looks to be correct:

[screenshot]
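
One way to check the warped points numerically, again assuming SimpleITK (not part of the thread's actual workflow): map the physical coordinates in testPointsHE.csv back to pixel indices on the fixed image and overlay them there.

    import csv
    import SimpleITK as sitk

    fixed = sitk.ReadImage("Rat_Kidney_HE.nii")

    with open("testPointsHE.csv") as f:
        for row in csv.DictReader(f):
            # Pad the point in case the .nii is read as a 3D image with a single slice.
            point = [float(row["x"]), float(row["y"])] + [0.0] * (fixed.GetDimension() - 2)
            idx = fixed.TransformPhysicalPointToContinuousIndex(point)
            # idx[0], idx[1] are the (x, y) pixel coordinates to plot on the fixed image.
            print("physical", point[:2], "-> pixel", [round(v, 1) for v in idx[:2]])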

stnava commented Mar 12, 2019 via email

Borda commented Mar 12, 2019

So it means that the point coordinates have to be negative and the origin is in the bottom-right corner, correct? Which coordinates are used, ITK or NIfTI?

stnava commented Mar 13, 2019 via email

Borda commented Mar 13, 2019

I am using the same convention as ImageJ/Fiji, where the origin is in the top-left corner and the position grows going down/right... LPS is Left, Posterior, Superior, which is used in DICOM images and by the ITK toolkit. It means that the origin of the coordinate system is in the bottom-left corner, right?
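
Rather than reasoning about where the origin "should" be, it may be easier to print the geometry ITK actually assigns to a given image; a minimal sketch, again assuming SimpleITK, using one of the images attached above:

    import SimpleITK as sitk

    img = sitk.ReadImage("Rat_Kidney_HE.nii")
    print("size     :", img.GetSize())
    print("origin   :", img.GetOrigin())
    print("spacing  :", img.GetSpacing())
    print("direction:", img.GetDirection())

    # Physical (LPS) coordinates of the first and last pixel show which corner the origin
    # sits in and which way the axes run for this particular header.
    first = img.TransformIndexToPhysicalPoint((0,) * img.GetDimension())
    last = img.TransformIndexToPhysicalPoint(tuple(s - 1 for s in img.GetSize()))
    print("index (0,0)     ->", first)
    print("opposite corner ->", last)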

stnava commented Mar 13, 2019 via email

@ntustison

I would recommend using ITK-SNAP as its coordinate system is consistent with ANTs usage.

Borda commented Jul 23, 2019

@stnava @ntustison @cookpa @gdevenyi may I kindly ask you to recommend what registration parameters I should use for images as large as those in the ANHIR challenge, where the image size is up to 18k x 18k pixels... I have started with the following:

    --initial-moving-transform [ %(target-image)s, %(source-image)s , 1]
    --metric mattes[ %(target-image)s, %(source-image)s, 1 , 32, regular, 0.2 ]
    --transform affine[ 0.1 ]
    --convergence [500x500x0,1.e-6,20]
    --smoothing-sigmas 4x2x1mm
    --shrink-factors 3x2x1
    --use-estimate-learning-rate-once 1
    --metric cc[ %(target-image)s, %(source-image)s, 1 , 2 ]
    --transform SyN[ .1, 3, 0.0 ]
    --convergence [ 50x50x0,0,5 ]
    --smoothing-sigmas 1x0.5x0mm
    --shrink-factors 4x2x1 -l 1 -u 1 -z 1

@ntustison

Most of our collective experience is outside histological applications. That being said, I would start by downsampling the images to something like 512x512 voxels which would probably give you the ability to optimize the linear transformations and much of the deformable part relatively quickly. Then you can work towards refinement by slowly incorporating the full resolution of your images and adding image registration "stages." One bit of caution is to keep in mind the spacing of the images as I see you're using mm to specify certain parameters and performance is going to depend on what information is (or isn't) in the image headers.
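
A possible way to do that offline downsampling, sketched with SimpleITK (the 512 target comes from the suggestion above; file names are made up and interpolation is a free choice):

    import SimpleITK as sitk

    img = sitk.ReadImage("full_resolution_slide.nii")   # hypothetical full-resolution input
    target = 512                                        # rough target size suggested above
    factor = max(img.GetSize()) / float(target)

    new_size = [max(1, int(round(s / factor))) for s in img.GetSize()]
    # Scale the spacing so the physical extent of the image stays the same.
    new_spacing = [sp * factor for sp in img.GetSpacing()]

    small = sitk.Resample(img, new_size, sitk.Transform(), sitk.sitkLinear,
                          img.GetOrigin(), new_spacing, img.GetDirection(),
                          0, img.GetPixelID())
    sitk.WriteImage(small, "downsampled_slide.nii")     # hypothetical output path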

Borda commented Jul 23, 2019

The images are PNG/JPEG, so there is no information about spacing...

@ntustison

Right, which is why I advised caution. ITK will automatically read in a spacing value when no information is available, I think 1 mm isotropic. And when you downsample/upsample and do registration, performance will be reflected in the assigned spacing values.
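
To see what spacing actually gets assigned, and to set it explicitly if the real pixel size is known, a small sketch with SimpleITK (file names and the pixel size are made up):

    import SimpleITK as sitk

    img = sitk.ReadImage("slide.png")     # PNG/JPEG usually carry no spacing information
    print(img.GetSpacing())               # typically (1.0, 1.0), ITK's fallback value

    # If the true pixel size is known (e.g. from the scanner), set it before registering,
    # so that options given in mm mean what you expect.
    img.SetSpacing((0.002, 0.002))        # hypothetical pixel size in mm
    sitk.WriteImage(img, "slide.nii.gz")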

Borda commented Jul 23, 2019

I am just passing the antsRegistration options and there is no down-sampling/down-scaling, which means that I have to do it in advance, right? Also, I am wondering if there is anything like a built-in scaling pyramid, which is quite a common option for image registration...

cookpa commented Jul 23, 2019

I think if you replace "mm" with "vox" in the smoothing it will take care of the smoothing part.

There is downsampling of the image within the registration, that is what the --shrink-factors option does. But some operations still take place at full resolution (I think the initialization is one), and the warp fields would be written out at the resolution of the smallest shrink factor.

By downsampling offline, you can test registration parameters much more efficiently. Up to you though.
