
Feature/engine support load model #10580

Conversation

@Superjomn Superjomn (Contributor) commented May 11, 2018

fixes: #10513

@Superjomn Superjomn requested a review from luotao1 May 11, 2018 00:51
@Superjomn Superjomn self-assigned this May 12, 2018
@Superjomn Superjomn added the 预测 (Prediction; formerly named "Inference", covers C-API prediction issues, etc.) label May 12, 2018
@Superjomn Superjomn added this to To do in Inference on Engine via automation May 12, 2018
@Superjomn Superjomn added this to the infernce-enable-tensorrt-engine milestone May 12, 2018
@Superjomn Superjomn moved this from To do to In progress in Inference on Engine May 12, 2018
@Superjomn Superjomn removed this from In progress in Inference on Engine May 12, 2018
@Xreki Xreki added this to Integrate TensorRT in Inference Framework May 21, 2018
@Superjomn Superjomn closed this Jun 14, 2018
Labels
预测 (Prediction; formerly named "Inference", covers C-API prediction issues, etc.)
Projects
No open projects
Inference Framework
Integrate TensorRT
Development

Issues that merging this pull request may close: none yet

1 participant