
checking equivariance for the angles that are not 90n #61

Closed
ahyunSeo opened this issue Sep 24, 2022 · 2 comments

ahyunSeo (Contributor) commented Sep 24, 2022

Hello,

I want to check equivariance for angles like 22.5°, 45°, ... that are not multiples of 90 degrees.
To avoid interpolation issues, I only compare the values at the image center.
However, I could not find a proper way to verify the equivariance.
I created a gist for sharing.

In the code, I use the C8 group.
import torch
from e2cnn import nn as enn  # `model`, `in_type`, and N (= 8 for C8) are defined in the gist

for img_size in [181, 183, 185, 187]:
    x = torch.randn([1, 3, img_size, img_size]).cuda()
    x = enn.GeometricTensor(x, in_type)  # 3-channel trivial repr.
    xrot1 = x.transform(N - 1)  # 45 deg CW
    xrot2 = x.transform(N - 2)  # 90 deg CW

    with torch.no_grad():
        lat = model(x)
        latrot1 = model(xrot1)
        latrot2 = model(xrot2)
        w = lat.shape[-1]
        center = int((w - 1) / 2)
        latrot1 = latrot1.transform(1)  # 45 deg CCW
        latrot2 = latrot2.transform(2)  # 90 deg CCW

        print(torch.allclose(lat[0, 0, center, center].tensor,
                             latrot1[0, 0, center, center].tensor))
        print(torch.allclose(lat[0, 0, center, center].tensor,
                             latrot2[0, 0, center, center].tensor))

The printed outputs are always False, True.
I even tried a single 3x3 conv as the 'model', but the allclose check never passes for 45 degrees.

I wonder whether I'm doing something wrong.

Best,

Ahyun

Gabri95 (Collaborator) commented Oct 3, 2022

Hi @ahyunSeo

Unfortunately, it is impossible to achieve perfect equivariance to rotations smaller than 90 degrees, since the images are sampled on a square grid.
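To see why 90° is special, here is a minimal plain-PyTorch sketch (not using e2cnn): a rotation-symmetric 3x3 kernel, standing in for an equivariant layer, commutes exactly with a 90° grid rotation because `torch.rot90` is a pure pixel permutation with no interpolation, whereas a 45° rotation has no exact counterpart on a square grid.

```python
import torch
import torch.nn.functional as F

# A rotation-symmetric kernel (uniform 3x3 average) as a stand-in for an
# equivariant layer; any C4-equivariant conv behaves the same way under rot90.
kernel = torch.full((1, 1, 3, 3), 1.0 / 9.0)
x = torch.randn(1, 1, 33, 33)

out = F.conv2d(x, kernel, padding=1)
out_rot = F.conv2d(torch.rot90(x, 1, dims=(2, 3)), kernel, padding=1)

# Rotating the input by 90 deg and rotating the output back agrees with the
# unrotated output, exactly up to float summation order: the square grid maps
# onto itself, so no interpolation is involved.
print(torch.allclose(out, torch.rot90(out_rot, -1, dims=(2, 3)), atol=1e-6))
```

For 45° there is no such permutation, so some resampling error is unavoidable.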

To test equivariance to these rotations, you can use a few tricks:

  • rotate the images at a higher resolution and then downsample them; this reduces the interpolation artifacts introduced by rotating the input
  • mask the input outside the central disk (the pixels in the corners are moved outside the image during rotation)
  • use a natural image rather than random Gaussian noise (noise tends to contain higher frequencies and therefore introduces more interpolation artifacts)
  • repeat the same process on the output of the convolution, i.e. apply the rotation at a higher resolution and then downsample
  • torch.allclose has a very tight error tolerance by default, which is impossible to meet here; set a relative tolerance on the order of 1e-1 (quite large, but generally acceptable for neural networks)
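The first two tricks can be sketched in plain PyTorch (the `rotate` and `disk_mask` helpers below are illustrative, not part of e2cnn). A smooth isotropic test image, a centered Gaussian bump, should be left unchanged by rotation, so the residual after the upsample → rotate → downsample → mask pipeline directly measures the remaining interpolation error:

```python
import math
import torch
import torch.nn.functional as F

def rotate(x: torch.Tensor, angle_deg: float) -> torch.Tensor:
    """Rotate NCHW images about the center with bilinear interpolation."""
    a = math.radians(angle_deg)
    mat = torch.tensor([[math.cos(a), -math.sin(a), 0.0],
                        [math.sin(a),  math.cos(a), 0.0]])
    grid = F.affine_grid(mat.unsqueeze(0).expand(x.size(0), -1, -1),
                         list(x.shape), align_corners=False)
    return F.grid_sample(x, grid, align_corners=False)

def disk_mask(size: int) -> torch.Tensor:
    """1 inside the central disk, 0 in the corners."""
    ys, xs = torch.meshgrid(torch.arange(size, dtype=torch.float32),
                            torch.arange(size, dtype=torch.float32),
                            indexing="ij")
    r = (size - 1) / 2
    return (((ys - r) ** 2 + (xs - r) ** 2) <= r ** 2).float()

size, scale = 33, 4
# Smooth isotropic test image: a Gaussian bump centered on the grid.
ys, xs = torch.meshgrid(torch.arange(size, dtype=torch.float32),
                        torch.arange(size, dtype=torch.float32),
                        indexing="ij")
r = (size - 1) / 2
x = torch.exp(-((ys - r) ** 2 + (xs - r) ** 2) / (2 * (size / 6) ** 2))[None, None]

# Trick 1: upsample, rotate at the higher resolution, then downsample back.
x_hi = F.interpolate(x, scale_factor=scale, mode="bilinear", align_corners=False)
x_rot = F.interpolate(rotate(x_hi, 45.0), size=(size, size),
                      mode="bilinear", align_corners=False)

# Trick 2: mask everything outside the central disk before comparing.
mask = disk_mask(size)
err = ((x - x_rot) * mask).abs().max()
print(float(err))  # small interpolation residual
```

In a real equivariance test you would feed such a prepared pair through the model and compare the outputs with a relaxed tolerance, e.g. `torch.allclose(a, b, rtol=1e-1)`.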

You can take a look at the MNIST example here.
As you can see, I first upsample the MNIST digits, then rotate them, and finally downsample them before feeding them into the model.
You can also see in block 9 that the outputs can vary by a few percentage points.

If you want to improve the stability of your model, you can also make use of wider filters (e.g. 5x5 filters instead of 3x3) to reduce the discretization artifacts.

Hope this helps!
Gabriele

ahyunSeo (Author) commented Oct 9, 2022

Hi, Gabriele

Thank you for your useful tricks and comments.
I think I can try some of them.

Best,
Ahyun
