Issue with 07_batchnorm.ipynb #122

Closed
jaideep11061982 opened this issue Apr 30, 2019 · 2 comments

Comments

@jaideep11061982

Hi, I get an error while running the class below, at the line x.var((0,2,3), keepdim=True):

var(): argument 'dim' (position 1) must be int, not tuple

import torch
from torch import nn

class BatchNorm(nn.Module):
    def __init__(self, nf, mom=0.1, eps=1e-5):
        super().__init__()
        # NB: pytorch bn mom is opposite of what you'd expect
        self.mom,self.eps = mom,eps
        self.mults = nn.Parameter(torch.ones (nf,1,1))
        self.adds  = nn.Parameter(torch.zeros(nf,1,1))
        self.register_buffer('vars',  torch.ones(1,nf,1,1))
        self.register_buffer('means', torch.zeros(1,nf,1,1))

    def update_stats(self, x):
        print(x.size())
        m = x.mean((0,2,3), keepdim=True)
        print(m.size())
        v = x.var((0,2,3), keepdim=True) # raises a TypeError here: var(): argument 'dim' (position 1) must be int, not tuple
        self.means.lerp_(m, self.mom)
        self.vars.lerp_ (v, self.mom)
        return m,v

    def forward(self, x):
        if self.training:
            with torch.no_grad():
                m,v = self.update_stats(x)
        else:
            m,v = self.means,self.vars
        x = (x-m) / (v+self.eps).sqrt()
        return x*self.mults + self.adds
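
For PyTorch builds where Tensor.var rejects a tuple of dims, one workaround sketch is to compute the (biased) variance by hand from the mean, which does accept a tuple. Note that Tensor.var defaults to the unbiased estimate, while batch norm statistics are conventionally biased, so this is arguably closer to what batch norm wants anyway. The tensor shape below is just a hypothetical (N, C, H, W) batch:

import torch

x = torch.randn(8, 4, 16, 16)                     # hypothetical activations, (N, C, H, W)
m = x.mean((0, 2, 3), keepdim=True)               # per-channel mean over batch and spatial dims
v = ((x - m) ** 2).mean((0, 2, 3), keepdim=True)  # biased per-channel variance, no tuple-dim var() needed
print(m.shape, v.shape)                           # torch.Size([1, 4, 1, 1]) both times

In update_stats this would replace the x.var((0,2,3), keepdim=True) line.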
@llk2why

llk2why commented May 3, 2019

I have the same problem.

@jaideep11061982
Author

This was an issue with the PyTorch version. Fixed it by installing PyTorch from the nightly build.
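
For anyone else hitting this, a quick check sketch (the exact version cutoff is approximate; tuple dims for var() landed around PyTorch 1.1 and the nightlies of that period):

import torch

print(torch.__version__)
x = torch.randn(2, 3, 4, 4)
try:
    v = x.var((0, 2, 3), keepdim=True)  # works on newer builds
    print('tuple dims supported:', v.shape)
except TypeError as e:
    print('upgrade needed:', e)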
