
Residual bloc #44

Open
eXpensia opened this issue Jun 5, 2024 · 0 comments
eXpensia commented Jun 5, 2024

Hello, to match the paper, a residual block should:

apply convolution, then normalization, then the activation function, and then add the residual, two times. Thus I would have guessed that this should be written as:

    def forward(self, x):
        residual = x
        y = self.conv1(x)
        y = self.act1(self.norm1(y))
        y = y + residual

        residual = y
        y = self.norm2(self.conv2(y))
        y = self.act2(y)
        y = y + residual
        return y

which is not the case in the residual block I found in the repo, where the residual is added before the activation of the second block:
    def forward(self, x):
        y = self.conv1(x)
        y = self.act1(self.norm1(y))
        y = self.norm2(self.conv2(y))
        y += x
        return self.act2(y)
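To make the difference concrete, here is a minimal sketch of the two orderings with plain callables standing in for the layers. The function names mirror the snippets above; the stand-in layers in the usage example are placeholders, not the repo's actual modules. Note that the repo's ordering (shortcut added before the final activation) matches the classic post-activation ResNet layout.

```python
def paper_style(x, conv1, norm1, act1, conv2, norm2, act2):
    """conv -> norm -> act -> add residual, applied twice (first snippet)."""
    residual = x
    y = act1(norm1(conv1(x)))
    y = y + residual          # first residual add, after the activation

    residual = y
    y = act2(norm2(conv2(y)))
    return y + residual       # second residual add, after the activation


def repo_style(x, conv1, norm1, act1, conv2, norm2, act2):
    """Residual added before the second activation (second snippet)."""
    y = act1(norm1(conv1(x)))
    y = norm2(conv2(y))
    return act2(y + x)        # shortcut joins before the final activation


# Usage with scalar stand-ins: conv doubles its input, norm and act are
# identities, so the two orderings are easy to trace by hand.
double = lambda v: 2.0 * v
ident = lambda v: v

print(paper_style(1.0, double, ident, ident, double, ident, ident))  # 9.0
print(repo_style(1.0, double, ident, ident, double, ident, ident))   # 5.0
```

With these stand-ins the paper-style block gives (1·2 + 1) = 3, then (3·2 + 3) = 9, while the repo-style block gives 2, then 2·2 + 1 = 5, showing that the two orderings really do compute different functions.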