
About TIPC testing of the wide_deep model #754

Closed
Li-fAngyU opened this issue May 4, 2022 · 5 comments

Comments

@Li-fAngyU
Contributor

Hi, after forking the project I ran the tests following the instructions in the readme.md under test_tipc, executing the following two commands:
1.bash test_tipc/prepare.sh ./test_tipc/configs/wide_deep/train_infer_python.txt 'lite_train_lite_infer'
2.bash test_tipc/test_train_inference_python.sh ./test_tipc/configs/wide_deep/train_infer_python.txt 'lite_train_lite_infer'
With enable_tensorRT=True, the test fails and reports the following error:
W0504 20:43:30.999462 1664 analysis_predictor.cc:795] The one-time configuration of analysis predictor failed, which may be due to native predictor called first and its configurations taken effect.
I0504 20:43:31.013504 1664 analysis_predictor.cc:576] TensorRT subgraph engine is enabled
--- Running analysis [ir_graph_build_pass]
--- Running analysis [ir_graph_clean_pass]
--- Running analysis [ir_analysis_pass]
Traceback (most recent call last):
File "tools/paddle_infer.py", line 169, in <module>
main(args)
File "tools/paddle_infer.py", line 115, in main
predictor, pred_config = init_predictor(args)
File "tools/paddle_infer.py", line 93, in init_predictor
predictor = create_predictor(config)
ValueError: (InvalidArgument) Pass tensorrt_subgraph_pass has not been registered. Please use the paddle inference library compiled with tensorrt or disable the tensorrt engine in inference configuration!
[Hint: Expected Has(pass_type) == true, but received Has(pass_type):0 != true:1.] (at /paddle/paddle/fluid/framework/ir/pass.h:236)

The runtime environment is AI Studio (classic version) with a V100 32GB.
Could this be caused by the environment?
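
A quick way to check whether the installed Paddle wheel was compiled with TensorRT at all (assuming it exposes paddle.inference.get_trt_compile_version, which newer 2.x builds should; treat this call as an assumption) would be:

# Assumption: paddle.inference.get_trt_compile_version() exists in this Paddle build;
# it reports the TensorRT version the wheel was compiled against.
python -c "import paddle; print(paddle.inference.get_trt_compile_version())"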

@wangzhen38
Collaborator

wangzhen38 commented May 5, 2022

Hi, this may well be an environment problem. You need a TensorRT-enabled build of Paddle; please install one following https://paddleinference.paddlepaddle.org.cn/master/user_guides/download_lib.html#python

@Li-fAngyU
Contributor Author

OK, thanks. I'll give it a try.

@Li-fAngyU
Contributor Author

Li-fAngyU commented May 6, 2022

@wangzhen38 Hi, I installed a Paddle build compiled with TensorRT (linux-cuda10.1-cudnn7.6-trt6-gcc8.2, Python 3.7) and also installed TensorRT itself, which I can import successfully. However, it still reports that the TensorRT dynamic library cannot be found.
[screenshot of the missing-library error]
Then, when running the TIPC test, the error report is:

File "tools/paddle_infer.py", line 93, in init_predictor
    predictor = create_predictor(config)
RuntimeError: (Unavailable) Load tensorrt api getInferLibVersion failed
  [Hint: p_getInferLibVersion should not be null.] (at /paddle/paddle/fluid/platform/dynload/tensorrt.h:114)

@wangzhen38
Collaborator

wangzhen38 commented May 6, 2022

For a Python 3.7 environment, I suggest installing the latest version and trying again ( https://paddle-inference-lib.bj.bcebos.com/2.3.0-rc0/python/Linux/GPU/x86-64_gcc8.2_avx_mkl_cuda11.2_cudnn8.2.1_trt8.0.3.4/paddlepaddle_gpu-2.3.0rc0.post112-cp37-cp37m-linux_x86_64.whl ). Also, you should not need to install TensorRT separately.
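
For example, that wheel can be installed directly with pip; the URL is the one given above, so adjust it if your CUDA/cuDNN versions differ:

# Install the TensorRT-enabled Paddle Inference wheel suggested above
# (CUDA 11.2, cuDNN 8.2.1, TensorRT 8.0.3.4, Python 3.7).
pip install https://paddle-inference-lib.bj.bcebos.com/2.3.0-rc0/python/Linux/GPU/x86-64_gcc8.2_avx_mkl_cuda11.2_cudnn8.2.1_trt8.0.3.4/paddlepaddle_gpu-2.3.0rc0.post112-cp37-cp37m-linux_x86_64.whl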

@Li-fAngyU
Contributor Author

Thank you. It turns out I did need to install TensorRT separately; the problem was that the path had not been set up correctly before. After setting the environment variables as described at the URL, it worked, and the TIPC test now runs successfully.
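
For anyone hitting the same getInferLibVersion error: the environment-variable setup boils down to making the TensorRT shared libraries visible to the dynamic loader. A minimal sketch, assuming TensorRT was unpacked to a local directory (the path below is only a placeholder):

# Point the dynamic loader at the directory containing libnvinfer.so from the
# TensorRT download; replace the placeholder path with your actual install location.
export LD_LIBRARY_PATH=/path/to/TensorRT/lib:$LD_LIBRARY_PATH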
