# Affine registration of similar images #1833

Closed
opened this issue May 14, 2019 · 7 comments

## Description

Hi dipy team and users,

I am having some trouble with the results of the `dipy.align.imaffine` functions when I try to register images with very little to no difference between them.
First, when I try to register an image to itself, the translation, rigid and affine transformations give me results quite different from the identity. I tested this by registering the MNI template to itself with different combinations of transformations; when I compute the norm of the translation, or the distance of the other transformations to the identity, it varies between roughly 0.0001 and 0.01.

I know it is not very common to register an image to itself. However, I am trying to write a function that builds a template from longitudinal MRI data, with an algorithm similar to what FreeSurfer does, but entirely in Python.
I basically average a dataset of images of the same subject, register the images to this average, average the newly registered images, register them to the new average, and so on until convergence. Therefore, at some point, the images are almost identical. When I used the DIPY registration functions, the similarity between the images never converged: it oscillated, and sometimes the distance even increased for several rounds. I could choose a much larger threshold, but it would be too large to trust.
I tried the same algorithm using FSL's FLIRT, and I reached convergence with a threshold of 1e-15, so this is a huge difference.
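The iterative template-building loop described above can be sketched as follows. This is pure numpy and the names are mine; `register` stands in for whatever registration step is used (e.g. the DIPY affine pipeline), and it is assumed to return the image resampled into the template space:

```python
import numpy as np

def build_template(images, register, n_iter=10, tol=1e-6):
    """Iteratively average images into a template.

    `register(img, template)` is a placeholder for any rigid/affine
    registration step; it must return `img` resampled into the
    template space.
    """
    template = np.mean(images, axis=0)
    for _ in range(n_iter):
        registered = [register(img, template) for img in images]
        new_template = np.mean(registered, axis=0)
        # Convergence: change between successive templates.
        if np.linalg.norm(new_template - template) < tol:
            return new_template
        template = new_template
    return template

# Toy usage: with an identity "registration" the loop converges
# right after the first averaging pass.
imgs = [np.ones((4, 4)) * k for k in (1.0, 2.0, 3.0)]
tmpl = build_template(imgs, register=lambda img, template: img)
```

The point where the thread's problem appears is the `tol` check: if each `register` call returns a slightly different residual transform even for near-identical inputs, `new_template` keeps moving and the loop never passes the threshold.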

Do you have any idea how I could obtain a transformation closer to the identity when the images are very similar?

Maybe it's because I used the default parameters?

```
nbins=32,
sampling_prop=None,
level_iters=[10000, 1000, 100],
sigmas=[3.0, 1.0, 0.0],
factors=[4, 2, 1]
```

Thank you in advance and have a great day,
Chris Foulon

Tested with both Python 2 and 3, DIPY version 0.16.0:

```python
import nibabel as nib
import numpy as np
from dipy.align.imaffine import (MutualInformationMetric, AffineRegistration)
from dipy.align.transforms import (TranslationTransform3D,
                                   RigidTransform3D,
                                   AffineTransform3D)

mni = '/Users/cf27246/test/MNI152_T1_3mm_brain.nii.gz'
img = nib.load(mni)
moving = img.get_data()
static_grid2world = img.affine

metric = MutualInformationMetric(32, None)
affreg = AffineRegistration(metric=metric,
                            level_iters=[10000, 1000, 100],
                            sigmas=[3.0, 1.0, 1.0],
                            factors=[4, 2, 1])

transform = TranslationTransform3D()
params0 = None
translation = affreg.optimize(moving, moving, transform, params0,
                              static_grid2world, static_grid2world)
transformation_matrix = translation.affine
# transformed = translation.transform(moving, interp='linear')

transform = RigidTransform3D()
params0 = None
rigid = affreg.optimize(moving, moving, transform, params0,
                        static_grid2world, static_grid2world,
                        starting_affine=transformation_matrix)
transformation_matrix = rigid.affine
# transformed = rigid.transform(moving, interp='linear')

transform = AffineTransform3D()
params0 = None
affine = affreg.optimize(moving, moving, transform, params0,
                         static_grid2world, static_grid2world,
                         starting_affine=transformation_matrix)
transformation_matrix = affine.affine
# transformed = affine.transform(moving, interp='linear')

# Translation vector
translation = transformation_matrix[0:3, 3]
# 3x3 matrix of rotation and the other rigid and affine components
oth_affine_transform = transformation_matrix[0:3, 0:3]
tr_norm = np.linalg.norm(translation)
affine_norm = np.linalg.norm(oth_affine_transform - np.identity(3), 'fro')
# The criterion used to test convergence in FreeSurfer
# (if I understood it correctly, of course)
print(pow(tr_norm, 2) + pow(affine_norm, 2))
```

In this example the final norm is around 0.009294 with the translation + rigid + affine transformations.
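The FreeSurfer-style convergence criterion computed at the end of the script can be packaged as a small helper (pure numpy; the function name is mine):

```python
import numpy as np

def distance_to_identity(affine4x4):
    """FreeSurfer-style criterion: squared norm of the translation plus
    squared Frobenius distance of the 3x3 block to the identity."""
    t = affine4x4[:3, 3]
    a = affine4x4[:3, :3]
    return np.linalg.norm(t) ** 2 + np.linalg.norm(a - np.eye(3), 'fro') ** 2

# A perfect self-registration gives exactly 0.
print(distance_to_identity(np.eye(4)))  # prints 0.0

# A small residual transform gives a small but nonzero value.
m = np.eye(4)
m[0, 3] = 0.01  # residual 0.01 mm translation along x
print(distance_to_identity(m))  # small nonzero value, ~1e-4
```

A value of 0.009294 therefore corresponds to a residual transform that is small but clearly distinguishable from the identity.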

I followed the examples, but maybe the default parameters are not suited to this particular case; I don't know.

Member

### skoudoro commented May 15, 2019 • edited

Hi @chrisfoulon, thank you for this complete report. For this particular case, I suppose you do not really need the multiscale registration, so I would use these parameters:

```
level_iters=[100], sigmas=[0.0], factors=[1]
```

It should give you a better result for this case. I need to play a bit with what you are pointing out, but like most of the DIPY team, we are currently at ISMRM and quite busy... any opinion @omarocegueda? @Borda?

Contributor

### Borda commented May 16, 2019

@skoudoro could you please reformat the initial question (enable code formatting) so it is easier to read?

I do not think that using multi-level would cause any problem; at least in theory it should not.

@chrisfoulon could you please share the sample image (probably some minimal version)? Or could it also be shown on a synthetic image generated with simple geometric shapes? https://github.com/Borda/BIRL/blob/master/data_images/images/artificial_reference.jpg
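As a side note, a minimal synthetic image with simple geometric shapes, in the spirit of Borda's suggestion, could be generated in a few lines of numpy (this toy image is my own stand-in, not the linked `artificial_reference.jpg`):

```python
import numpy as np

def synthetic_shapes(shape=(64, 64)):
    """Toy test image: one disc and one rectangle on a zero background."""
    img = np.zeros(shape)
    yy, xx = np.mgrid[:shape[0], :shape[1]]
    img[(yy - 20) ** 2 + (xx - 20) ** 2 < 100] = 1.0  # disc, radius 10
    img[40:55, 35:55] = 0.5                           # rectangle
    return img

img = synthetic_shapes()
```

Registering such an image to itself (or to a slightly translated copy) makes it easy to check whether the recovered transform is close to the identity.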
Member

### skoudoro commented May 16, 2019

> could you please reformat the initial question

done

> I do not think that using multi-level would cause any problem, at least in theory it should not

In theory, it should not, I agree.

@chrisfoulon, after a bit more thinking: the default stopping tolerance for the optimizer is 1e-4 (`opt_tol`). I suppose you need to tighten the convergence for this specific case. You can see below how to do it. As @Borda suggested, it would be good if you could share sample data.

```python
options = {'gtol': 1e-15, 'disp': False}
affreg = AffineRegistration(metric=metric,
                            level_iters=[10000, 1000, 100],
                            sigmas=[3.0, 1.0, 1.0],
                            factors=[4, 2, 1],
                            options=options)
```
Author

### chrisfoulon commented May 16, 2019

Thank you for your answers. I attached some sample data I used for this example: the first one is the MNI152 at 3 mm and the second the MNI152 at 1 mm, both from FSL.

MNI152_T1_3mm_brain.nii.gz
MNI152.nii.gz

I had the same kind of results with both images, and with several other T1 and mean EPI images. I tried your suggestion with `gtol`, but the results are still the same :S
Member

### skoudoro commented May 18, 2019

@chrisfoulon, in your comparison with FSL, did you use mutual information or the correlation ratio?
Author

### chrisfoulon commented May 20, 2019

OH! Good call. I tried FSL using mutual information instead of the correlation ratio, and I also get a big distance between the same images! (I'm sorry, I'm not yet familiar with all the parameters and what they imply.) Is it possible to use the correlation ratio with the DIPY affine registration? Thank you!
Member

### skoudoro commented May 20, 2019

Not yet, we need to implement it (we only have it for SyN registration). All new contributions are welcome! Feel free to create a new PR if you want to add it! 😄
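For anyone who wants to attempt it, here is a rough numpy sketch of the correlation ratio itself. This is not a DIPY API: the function name and the binning scheme are my own, and a real contribution would have to follow DIPY's affine metric interface (gradients included):

```python
import numpy as np

def correlation_ratio(static, moving, nbins=32):
    """Correlation ratio eta^2 of `moving` given `static` binned
    into `nbins` intensity bins. 1 means `static` perfectly
    predicts `moving`; 0 means no functional dependence."""
    s = np.asarray(static, dtype=float).ravel()
    m = np.asarray(moving, dtype=float).ravel()
    bins = np.digitize(s, np.histogram_bin_edges(s, bins=nbins))
    total_var = m.var()
    if total_var == 0:
        return 1.0  # constant image is trivially predicted
    within = 0.0
    for b in np.unique(bins):
        sel = m[bins == b]
        within += sel.size * sel.var()  # within-bin variance, weighted
    return 1.0 - within / (m.size * total_var)

rng = np.random.default_rng(0)
img = rng.random((16, 16))
# An image predicts itself up to binning error: eta^2 close to 1.
print(correlation_ratio(img, img))
```

For self-registration this metric is maximal (up to binning error) at the identity, which is exactly the behavior the thread is after.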