def forward(self, x: torch.Tensor) -> torch.Tensor:
    x = self.main(x)
    if self.is_res:
        x = x + self.conv(x)
        return x / 1.414  # <= here
    else:
        return self.conv(x)
If you repeatedly add arrays to arrays, as happens across stacked residual connections, the variance doubles at each step, so the magnitude grows exponentially and will eventually overflow. It is therefore a good idea to normalize the values. A common normalization strategy is to scale arrays so that their standard deviation is 1. If you add two uncorrelated Gaussian random variables that each have a standard deviation of 1, their sum has a standard deviation of $\sqrt{2}$ (Proof), so you have to divide by $\sqrt{2}$ to bring the standard deviation back to 1.
You can easily try this yourself with the following Python code:
import torch

# add two uncorrelated arrays
x = torch.randn(1000000) + torch.randn(1000000)
print(x.std())  # standard deviation increased from 1 to approximately 1.41

x /= 2 ** 0.5  # divide by sqrt(2)
print(x.std())  # standard deviation is 1 again
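To see why this matters across many residual blocks, here is a small sketch (my own, not from the repo): a randomly permuted copy of x stands in for conv(x), since it has the same variance as x but is essentially uncorrelated with it.

```python
import torch

# Sketch of the variance-doubling argument; variable names are
# illustrative, not from unet.py. Each residual step adds a tensor with
# the same variance as x, so the variance doubles per step and the
# standard deviation grows like sqrt(2)**n_steps.
torch.manual_seed(0)
n_steps = 10

x_raw = torch.randn(1_000_000)
for _ in range(n_steps):
    # stand-in for conv(x): same variance as x, roughly uncorrelated with it
    fx = x_raw[torch.randperm(x_raw.numel())]
    x_raw = x_raw + fx
print(x_raw.std())  # roughly sqrt(2)**10 = 32 without normalization

x_norm = torch.randn(1_000_000)
for _ in range(n_steps):
    fx = x_norm[torch.randperm(x_norm.numel())]
    x_norm = (x_norm + fx) / 2 ** 0.5  # renormalize, as the 1.414 in unet.py does
print(x_norm.std())  # stays close to 1
```

With the `/ 1.414` in place, the activations keep unit standard deviation no matter how many residual blocks are stacked.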
Why are you normalizing to 1.414 in unet.py?
class Conv3(nn.Module):
...