
The output is different when loading the serialized model in the C++ API #59

Closed
YirongMao opened this issue Sep 16, 2019 · 1 comment

YirongMao commented Sep 16, 2019

Hi,
I serialized the model and then loaded it with the C++ API, following #16.

But when I send a tensor of all ones into both models (the PyTorch model and the serialized model in C++), their outputs are different.
When I send a tensor of all zeros, the outputs are the same.

My TensorRT version is 5.0.2.6
PyTorch version is 1.1.0
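
For context, this is the kind of C++-side check being described: deserialize the TensorRT engine and run it on an all-ones input so the result can be compared against the PyTorch model's output. The sketch below is mine, not code from this repository or from #16; the engine file name (`model.engine`), the tensor shapes, and the assumption that binding 0 is the input and binding 1 the output are all placeholders.

```cpp
#include <NvInfer.h>
#include <cuda_runtime_api.h>
#include <fstream>
#include <iostream>
#include <vector>

// Minimal logger required by the TensorRT runtime.
class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) override {
        if (severity <= Severity::kWARNING) std::cout << msg << std::endl;
    }
} gLogger;

int main() {
    // Read the serialized engine from disk (file name is a placeholder).
    std::ifstream file("model.engine", std::ios::binary);
    std::vector<char> blob((std::istreambuf_iterator<char>(file)),
                           std::istreambuf_iterator<char>());

    nvinfer1::IRuntime* runtime = nvinfer1::createInferRuntime(gLogger);
    nvinfer1::ICudaEngine* engine =
        runtime->deserializeCudaEngine(blob.data(), blob.size(), nullptr);
    nvinfer1::IExecutionContext* context = engine->createExecutionContext();

    // Placeholder shapes; assume binding 0 is the input and binding 1 the output.
    const int inSize = 1 * 3 * 224 * 224;
    const int outSize = 1000;

    std::vector<float> input(inSize, 1.0f);   // all-ones input, as in the report
    std::vector<float> output(outSize, 0.0f);

    void* buffers[2];
    cudaMalloc(&buffers[0], inSize * sizeof(float));
    cudaMalloc(&buffers[1], outSize * sizeof(float));
    cudaMemcpy(buffers[0], input.data(), inSize * sizeof(float),
               cudaMemcpyHostToDevice);

    context->execute(1, buffers);  // batch size 1, synchronous execution

    cudaMemcpy(output.data(), buffers[1], outSize * sizeof(float),
               cudaMemcpyDeviceToHost);
    std::cout << "first output value: " << output[0] << std::endl;

    cudaFree(buffers[0]);
    cudaFree(buffers[1]);
    context->destroy();
    engine->destroy();
    runtime->destroy();
    return 0;
}
```

If the PyTorch model and the converted model already agree in Python before serialization, a mismatch in a harness like this usually points at the loading or buffer-handling code rather than at the engine itself.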

@YirongMao (Author)

So sorry, the outputs are actually the same. There was a problem in my code.
