Converting deep-head-pose-lite from PyTorch to Caffe fails #107

Open

yueyihua opened this issue Mar 16, 2021 · 1 comment

@yueyihua
conv1.0
conv: blob1
conv1 was added to layers
545911448128:conv_blob1 was added to blobs
add1 was added to layers
545911448416:add_blob1 was added to blobs
WARNING: CANNOT FOUND blob 8775320
8775320:extra_blob1 was added to blobs
conv1.1
batch_norm1 was added to layers
545911448272:batch_norm_blob1 was added to blobs
bn_scale1 was added to layers
conv1.2
relu1 was added to layers
545913981760:relu_blob1 was added to blobs
maxpool
max_pool1 was added to layers
545911447624:max_pool_blob1 was added to blobs
WARNING: the output shape miss match at max_pool1: input torch.Size([1, 24, 112, 112]) output---Pytorch:torch.Size([1, 24, 56, 56])---Caffe:torch.Size([1, 24, 57, 57])
This is caused by the different implementation that ceil mode in caffe and the floor mode in pytorch.
You can add the clip layer in caffe prototxt manually if shape mismatch error is caused in caffe.
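
(For reference on the warning above: the 56-vs-57 mismatch follows directly from the pooling output-size formula, which PyTorch rounds with floor and Caffe rounds with ceil. A minimal sketch of the arithmetic, assuming the usual ShuffleNet stem pooling parameters of kernel 3, stride 2, padding 1, which are not shown in the log:)

```python
import math

# Pooling output size: (input + 2*pad - kernel) / stride + 1
# PyTorch MaxPool2d rounds down (floor) by default; Caffe's Pooling layer rounds up (ceil).
i, k, s, p = 112, 3, 2, 1  # assumed ShuffleNet stem values, not taken from the log
print(math.floor((i + 2 * p - k) / s) + 1)  # 56 -> the PyTorch shape above
print(math.ceil((i + 2 * p - k) / s) + 1)   # 57 -> the Caffe shape above
```
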
stage2.0.branch1.0
conv: max_pool_blob1
conv2 was added to layers
545911447768:conv_blob2 was added to blobs
add2 was added to layers
545911418744:add_blob2 was added to blobs
WARNING: CANNOT FOUND blob 545913982624
Traceback (most recent call last):
File "hopenet_to_caffe.py", line 15, in
pytorch_to_caffe.trans_net(net,input,name)
File "./pytorch_to_caffe.py", line 786, in trans_net
out = net.forward(input_var)
File "./stable_hopenetlite.py", line 127, in forward
x = self.stage2(x)
File "/usr/local/lib/python3.6/dist-packages/torch/nn/modules/module.py", line 722, in _call_impl
result = self.forward(*input, **kwargs)
File "/usr/local/lib/python3.6/dist-packages/torch/nn/modules/container.py", line 117, in forward
input = module(input)
File "/usr/local/lib/python3.6/dist-packages/torch/nn/modules/module.py", line 722, in _call_impl
result = self.forward(*input, **kwargs)
File "./stable_hopenetlite.py", line 76, in forward
out = torch.cat((self.branch1(x), self.branch2(x)), dim=1)
File "/usr/local/lib/python3.6/dist-packages/torch/nn/modules/module.py", line 722, in _call_impl
result = self.forward(*input, **kwargs)
File "/usr/local/lib/python3.6/dist-packages/torch/nn/modules/container.py", line 117, in forward
input = module(input)
File "/usr/local/lib/python3.6/dist-packages/torch/nn/modules/module.py", line 722, in _call_impl
result = self.forward(*input, **kwargs)
File "/usr/local/lib/python3.6/dist-packages/torch/nn/modules/batchnorm.py", line 113, in forward
self.num_batches_tracked = self.num_batches_tracked + 1
File "./pytorch_to_caffe.py", line 532, in _add
bottom=[log.blobs(input),log.blobs(args[0])], top=top_blobs)
File "./Caffe/layer_param.py", line 33, in init
self.bottom.extend(bottom)
TypeError: None has type NoneType, but expected one of: bytes, unicode
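
A likely reading of the traceback: the crash happens while tracing BatchNorm's bookkeeping line `self.num_batches_tracked = self.num_batches_tracked + 1`. The converter's `_add` hook intercepts that `+`, but the `num_batches_tracked` buffer was never registered as a blob (hence the "CANNOT FOUND blob" warning just before the crash), so `log.blobs(...)` returns None and `layer_param` rejects it. A common workaround for this class of failure is to put the network in eval mode before tracing, so the update is skipped. A minimal sketch of the relevant part of hopenet_to_caffe.py, assuming the standard pytorch_to_caffe converter; the constructor and checkpoint names below are placeholders, not taken from the issue:

```python
import torch
import pytorch_to_caffe
import stable_hopenetlite

# Placeholder constructor and checkpoint path -- adjust to the actual model code.
net = stable_hopenetlite.shufflenet_v2_x1_0()
net.load_state_dict(torch.load('shuff_epoch_120.pkl', map_location='cpu'))

# eval() keeps BatchNorm from executing num_batches_tracked += 1 during the trace,
# which is the statement the traceback points at.
net.eval()

input = torch.ones([1, 3, 224, 224])
pytorch_to_caffe.trans_net(net, input, 'hopenet_lite')
```
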

@XiaoLaoDi

@yueyihua Have you solved your problem? How did you solve it?
