Problem when loading the model #15
Comments
The documentation lists a possible cause: the model was saved incorrectly, so it cannot be loaded correctly.
But the third point requires the models to be exactly identical; even if the names match, you still have to check whether each layer's parameters are the same.
In my case these are all identical.
Then I'm not sure either. I'd suggest trying the simplest possible model first, e.g. a single BN layer, and then binary-searching for the point where it breaks.
Has the problem been solved?
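The bisection approach suggested above might look roughly like the following sketch (an assumption, not code from the thread; module names like `TinyImpl` are hypothetical, and it assumes the libtorch C++ frontend with the `TORCH_MODULE` holder macro, whose nn-module coverage varies by version):

```cpp
// Sketch: start from a trivial model (a single BatchNorm layer), verify the
// save/load round trip works, then add layers one by one (binary-searching)
// until loading breaks, to locate the offending layer.
#include <torch/torch.h>

struct TinyImpl : torch::nn::Module {
  torch::nn::BatchNorm2d bn{nullptr};
  TinyImpl() { bn = register_module("bn", torch::nn::BatchNorm2d(16)); }
};
TORCH_MODULE(Tiny);  // Tiny is a holder wrapping std::shared_ptr<TinyImpl>

int main() {
  Tiny model;
  torch::save(model, "tiny.pt");
  Tiny reloaded;
  torch::load(reloaded, "tiny.pt");  // if this succeeds, grow the model and retry
}
```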
Hi, I recently ran into a problem loading a model while following your code.
Chapter 5 of the tutorial saves the model from PyTorch using
and then loads it in C++ with libtorch using
torch::load(vgg16bn, "your path to vgg16bn.pt");
I saved the model with the same Python code, but loading it with
torch::load
fails with: In template: invalid operands to binary expression ('serialize::InputArchive' and '<ModelName>')
However, torch::jit::script::Module module = torch::jit::load(modelPath);
does load the model. How can I fix the first error? The second approach works, but it loads the model as a black box; I would still prefer to load it in a white-box way.
I'm using PyTorch and libtorch version 1.4.0.
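One possible cause of that template error (an assumption, not confirmed in this thread): `torch::load` expects a module *holder* (the type generated by `TORCH_MODULE`) or a `std::shared_ptr` to the module, not a plain `torch::nn::Module` subclass by value. A hedged sketch with a hypothetical `VGGImpl` type:

```cpp
// Sketch: torch::load/torch::save operate on module holders (or shared_ptr).
// Passing a plain Module value can produce the "invalid operands to binary
// expression ('serialize::InputArchive' and '<ModelName>')" template error.
#include <torch/torch.h>

struct VGGImpl : torch::nn::Module {
  torch::nn::Linear fc{nullptr};  // stand-in layer for illustration
  VGGImpl() { fc = register_module("fc", torch::nn::Linear(8, 2)); }
  torch::Tensor forward(torch::Tensor x) { return fc->forward(x); }
};
TORCH_MODULE(VGG);  // defines VGG as a holder around std::shared_ptr<VGGImpl>

int main() {
  VGG model;                          // holder: compatible with torch::save/load
  torch::save(model, "vgg16bn.pt");
  VGG loaded;
  torch::load(loaded, "vgg16bn.pt");  // a raw VGGImpl here would not compile
}
```

Note also that `torch::load` reads archives written by the C++ frontend's `torch::save` (or a matching serialization path), whereas `torch::jit::load` reads TorchScript modules exported from Python; mixing the two formats is another plausible source of failure worth ruling out.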