Periodic/Circular Padding in Transpose Convolution #38
Comments
@Ceyron I had the same concern. You might be able to use …
Thanks for the reply. Don't we need the transpose convolution for upsampling?
Yes, but it is always possible to reproduce a transposed convolution using a regular convolution. I think, in this case with …
I am not sure about this. Under similar settings (with stride=2), a forward conv reduces the spatial dimension (to approx. half), while a transpose conv increases it (to approx. twice).
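For concreteness, a quick shape check (the batch size, channel count, length, and kernel size below are arbitrary illustration values):

```python
import torch
import torch.nn as nn

x = torch.randn(1, 8, 64)  # (batch, channels, length), arbitrary example sizes

down = nn.Conv1d(8, 8, kernel_size=3, stride=2, padding=1)
up = nn.ConvTranspose1d(8, 8, kernel_size=3, stride=2, padding=1, output_padding=1)

print(down(x).shape)  # torch.Size([1, 8, 32])   roughly halved
print(up(x).shape)    # torch.Size([1, 8, 128])  roughly doubled
```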
Sorry. You are right, that doesn't work if you set …
I don't think that we can make the forward conv equal to the transpose conv (given …).
@Ceyron See below:
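(A minimal 1D sketch of this idea, assuming stride 2 and a weight laid out as in `nn.ConvTranspose1d`; the function name is made up for illustration, and it only aims to reproduce the periodic upsampling behaviour, not bit-exact agreement with PyTorch's zero-padded transposed conv.)

```python
import torch
import torch.nn.functional as F

def periodic_upsample_conv1d(x, weight, stride=2):
    """Emulate a stride-2 transposed convolution with periodic padding:
    zero-interleave the input ("lhs dilation"), then apply a regular
    convolution with circular padding.

    x:      (batch, in_channels, length)
    weight: (in_channels, out_channels, kernel_size), the layout used by
            nn.ConvTranspose1d
    """
    b, c, n = x.shape
    k = weight.shape[-1]

    # Insert (stride - 1) zeros between neighbouring samples: length n -> n * stride.
    dilated = torch.zeros(b, c, n * stride, dtype=x.dtype, device=x.device)
    dilated[..., ::stride] = x

    # A transposed conv corresponds to a regular conv with the flipped
    # kernel and swapped channel axes.
    w = weight.flip(-1).permute(1, 0, 2)  # (out_channels, in_channels, k)

    # Circular padding keeps the result periodic; total padding of k - 1
    # leaves the output length at n * stride.
    dilated = F.pad(dilated, (k // 2, (k - 1) // 2), mode="circular")
    return F.conv1d(dilated, w)
```

The 2D case works the same way with `F.conv2d` and four-sided circular padding.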
Yeah, that works. It's a manual "lhs dilation" 😉 It would be nice if PyTorch had this by default. But coming back to my original question: don't you think the UNet needs this transpose conv with circular/periodic padding?
Hi @Ceyron, sorry for the late reply, I just came across the issue. :) Well spotted with the transposed convolutions! We found that in practice, this small non-periodicity didn't affect the results. The skip connections reintroduce fully periodic information, and the model can correct these small differences with subsequent periodic convolutions. To make the upsampling periodic, I would recommend reducing the kernel size to 2x2, which gives no overlap, or using PyTorch's Upsample, as is done in other U-Net implementations. We haven't seen any benefit from this, but in other settings it might be more important.
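As a rough sketch of these two options (channel counts and kernel sizes are placeholders):

```python
import torch.nn as nn

channels = 64  # placeholder value

# Option 1: a 2x2 kernel with stride 2. The kernel windows do not overlap,
# so no padding is involved and periodicity is preserved.
up_no_overlap = nn.ConvTranspose2d(channels, channels, kernel_size=2, stride=2)

# Option 2: parameter-free upsampling followed by a regular convolution
# with circular padding.
up_periodic = nn.Sequential(
    nn.Upsample(scale_factor=2, mode="nearest"),
    nn.Conv2d(channels, channels, kernel_size=3, padding=1, padding_mode="circular"),
)
```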
Hi,
Thanks a lot for open-sourcing the code; it has inspired me greatly! 😊
I am currently trying to set up a UNet architecture similar to the ones used in this repo to be applied to fields with periodic boundary conditions. In the PDE-Refiner paper, you write:
As far as I understand, the KS example (which has periodic BCs) uses this architecture, which should instantiate the model of this file. For this UNet, you use the default `padding_mode`, which you overwrite to be `circular` in this file. Part of the UNet is also an upsampling with a transpose convolution. For this convolution, however, it seems that there is only the default choice of `padding_mode="zeros"`, which is also the only mode PyTorch supports. Maybe the question is stupid, but shouldn't the transpose operator also use periodic padding, as flax, for instance, supports?
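For reference, a minimal illustration of the asymmetry in question (channel sizes are arbitrary):

```python
import torch.nn as nn

# Regular convolutions accept periodic padding:
conv = nn.Conv2d(8, 8, kernel_size=3, padding=1, padding_mode="circular")

# Transposed convolutions only accept padding_mode="zeros"; the line below
# raises a ValueError:
# up = nn.ConvTranspose2d(8, 8, kernel_size=3, stride=2, padding=1,
#                         output_padding=1, padding_mode="circular")
```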