
Pros and cons compared with projects such as trtorch #31

Closed
aaronchan90 opened this issue Oct 10, 2021 · 3 comments
@aaronchan90

After reading through the documentation and code, I understand that Forward uses TensorRT's network class directly to translate each layer of the original model one by one, which effectively provides a parser for TF and Torch (just like onnx-parser). I also see that NVIDIA itself has similar open-source projects such as trtorch and TF-TRT. Could you explain where Forward stands, in terms of pros and cons, compared with these projects?
Thanks!
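
For reference, here is a minimal sketch of the layer-by-layer translation idea described above, written against the standard TensorRT Python network-definition API (TensorRT 8.x assumed). The layer choices and weights are illustrative only and are not Forward's actual internal code:

```python
import numpy as np
import tensorrt as trt

# Sketch of "translate each original layer into a TensorRT layer",
# the way a framework parser (like onnx-parser, fwd_torch, fwd_tf) works.
logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))

# Input tensor, corresponding to the original model's input.
x = network.add_input(name="input", dtype=trt.float32, shape=(1, 3, 224, 224))

# A convolution translated from, e.g., a torch.nn.Conv2d layer.
w = np.random.rand(16, 3, 3, 3).astype(np.float32)
b = np.zeros(16, dtype=np.float32)
conv = network.add_convolution_nd(x, num_output_maps=16, kernel_shape=(3, 3),
                                  kernel=trt.Weights(w), bias=trt.Weights(b))
conv.stride_nd = (1, 1)

# An activation translated from torch.nn.ReLU.
relu = network.add_activation(conv.get_output(0), trt.ActivationType.RELU)
network.mark_output(relu.get_output(0))

# Build a serialized TensorRT engine from the assembled network.
config = builder.create_builder_config()
serialized_engine = builder.build_serialized_network(network, config)
```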

@zhaoyiluo
Collaborator

zhaoyiluo commented Oct 11, 2021

Hello @aaronchan90, thank you for your interest in the Forward project.

Forward and NVIDIA's projects such as trtorch and TF-TRT all solve the same problem: deploying models and running inference on them.

Regarding your question, I would summarize the differences as follows:

1. A single build of Forward supports inference for four deep learning frameworks at the same time (TF, Torch, Keras, ONNX); see the usage sketch at the end of this comment.

2. Forward (except for fwd_onnx) implements its own TensorRT network layers, so the range of supported models differs from that of NVIDIA's own inference projects.

3. fwd_torch and fwd_tf support a different set of operators than trtorch and tf-trt, and we have also optimized some basic operators to improve inference performance.

4. For the same framework, and assuming identical operator coverage and performance, the two corresponding projects can be regarded as "equivalent".

You are welcome to keep the discussion going as you use the project!
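
As a rough illustration of point 1, a sketch of the unified builder/engine usage pattern follows. The names used here (forward.TorchBuilder, set_mode, build, forward, and the model path) are assumptions based on the pattern described above and should be checked against the current Forward README:

```python
import torch
import forward  # Forward's Python bindings

# Assumed builder/engine API; verify names against the Forward README.
builder = forward.TorchBuilder()   # TfBuilder / KerasBuilder for other front ends
builder.set_mode("float32")        # inference precision, e.g. float32 / float16 / int8

# Build a TensorRT engine from a TorchScript model using a dummy input.
dummy_input = torch.randn(1, 3, 224, 224)
engine = builder.build("resnet50.torchscript.pt", dummy_input)

# Run inference with the built engine.
outputs = engine.forward(dummy_input)
print(outputs)
```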

@github-actions

This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

@github-actions github-actions bot added the Stale label Nov 11, 2021
@github-actions

This issue was closed because it has been stalled for 5 days with no activity.
