Hi, I have Ubuntu 16.04 with:
- torchvision 0.2.0
- torch 0.3.1
- Python 3.6
- installed via pip
- CUDA 8.0
- 2x K80 GPUs
I'm trying to use DataParallel on my model, but this error occurs during the forward pass. However, if I disable the upsample and sigmoid, everything works fine.
How I initialize the parallel model:
```python
net = nn.DataParallel(UNet(3, 1).cuda())
```
The model's forward:
```python
x1 = self.inc(x)      # input block
x2 = self.down1(x1)   # encoder
x3 = self.down2(x2)
x4 = self.down3(x3)
x5 = self.down4(x4)
x = self.up1(x5, x4)  # decoder with skip connections
x = self.up2(x, x3)
x = self.up3(x, x2)
x = self.up4(x, x1)
x = self.outc(x)      # output conv
x = self.upsample(x)
x = self.sigmoid(x)
return x
```
It fails here:
```python
probs = net(X)
```
However, everything works fine without the upsample and sigmoid layers. Am I doing something wrong? Thanks in advance.
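For reference, here is a minimal, self-contained sketch of the setup. `TinyNet` is a hypothetical stripped-down stand-in for my UNet (the real down/up blocks are omitted); the point is that `upsample` and `sigmoid` are registered as submodules in `__init__`, and the `DataParallel` wrapping and the failing call are the same as above:

```python
import torch
import torch.nn as nn
from torch.autograd import Variable  # torch 0.3.x API

class TinyNet(nn.Module):
    """Hypothetical stand-in for the real UNet, keeping only the head."""
    def __init__(self, n_channels, n_classes):
        super(TinyNet, self).__init__()
        self.outc = nn.Conv2d(n_channels, n_classes, kernel_size=1)
        self.upsample = nn.Upsample(scale_factor=2, mode='bilinear')
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        x = self.outc(x)      # 1x1 conv head
        x = self.upsample(x)  # one of the two layers that seem implicated
        x = self.sigmoid(x)   # the other one
        return x

net = nn.DataParallel(TinyNet(3, 1).cuda())
X = Variable(torch.randn(4, 3, 64, 64)).cuda()
probs = net(X)  # the call that fails with the full model
```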
I also came across this problem under the same conditions, except that I installed PyTorch with Anaconda. My problem occurred when forwarding an nn.LSTMCell with cuda(); CPU mode works fine. Before this, I had installed libtcmalloc-minimal4 to solve this problem: #2314
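For context, a minimal sketch of the kind of forward that triggered it for me (sizes and names here are illustrative, not my exact code):

```python
import torch
import torch.nn as nn
from torch.autograd import Variable  # torch 0.3.x API

cell = nn.LSTMCell(input_size=10, hidden_size=20).cuda()
x = Variable(torch.randn(4, 10)).cuda()   # (batch, input_size)
h = Variable(torch.zeros(4, 20)).cuda()   # initial hidden state
c = Variable(torch.zeros(4, 20)).cuda()   # initial cell state
h, c = cell(x, (h, c))  # fails on GPU for me; the same code on CPU is fine
```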