
paddle-lite: inference error after linking the static inference library #10493

Open
renshujiajia opened this issue Apr 10, 2024 · 5 comments

Comments

@renshujiajia

To get your problem resolved quickly, before opening an issue please first search for similar problems in: the issue history, the FAQ, and the official documentation.

When opening an issue, please provide the following information so the problem can be resolved quickly:

  • Title: a concise, precise description of the problem, e.g. "ssd model conversion error"
  • Version / environment info:
       1) Paddle Lite version: V2.12 inference_lite_lib.android.armv8.clang.c++_static.with_extra.tar.gz
       2) Host environment: Ubuntu 20.04
  • Model info:
       1) Model name: ch_PP-OCRv4_det_infer converted to .nb
       2) Model link: the official detection model
  • Reproduction info: the C++ program is compiled with NDK r20b and inference is run with the official ppocr_command_line program
  • Problem description: when the C++ program links the Paddle-Lite static library, inference fails with the error 'feed' is not supported; when it links the shared library and the .so is deployed to the device, there is no error. Is there anything to watch out for when linking the static library? I link it directly with target_link_libraries(${PROJECT_NAME} PRIVATE ${CMAKE_SOURCE_DIR}/3rdparty/inference_lite_lib.android.armv8.clang.c++_static.with_extra/cxx/lib/libpaddle_api_light_bundled.a), as in the CMake sketch below.
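For reference, a minimal CMake sketch of the link setup described above. Only the library path is taken from the issue text; the target name, source file, and include directory are assumptions based on the layout of the prebuilt inference_lite_lib package:

# Hypothetical CMakeLists.txt fragment reproducing the reported link setup.
cmake_minimum_required(VERSION 3.10)
project(ppocr_command_line CXX)
# Root of the prebuilt package's cxx directory (path taken from the issue).
set(LITE_DIR ${CMAKE_SOURCE_DIR}/3rdparty/inference_lite_lib.android.armv8.clang.c++_static.with_extra/cxx)
add_executable(${PROJECT_NAME} main.cc)
# Adding the cxx directory itself lets sources use #include "include/paddle_api.h"
# (assumption, chosen to match the include style shown later in this thread).
target_include_directories(${PROJECT_NAME} PRIVATE ${LITE_DIR})
target_link_libraries(${PROJECT_NAME} PRIVATE ${LITE_DIR}/lib/libpaddle_api_light_bundled.a)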
@renshujiajia renshujiajia changed the title from "paddle-lite static inference library linking" to "paddle-lite: inference error after linking the static inference library" on Apr 10, 2024
@lishicheng1996
Contributor

We've received your issue; please wait for someone on the team to take a look.

@xiebaiyuan
Collaborator

You haven't included the paddle_use_ops / paddle_use_kernels headers.

@hong19860320
Collaborator

When linking against the static library you must include these three headers:
#include "include/paddle_api.h"
#include "include/paddle_use_kernels.h"
#include "include/paddle_use_ops.h"

@renshujiajia
Author

When linking against the static library you must include these three headers: #include "include/paddle_api.h" #include "include/paddle_use_kernels.h" #include "include/paddle_use_ops.h"

After adding them the error changed to:

(.text._ZN6paddle4lite3arm4math25conv_depthwise_5x5s2_int8IfEEvPT_PKaS7_PKfS9_biPfiiiiiiiiPNS0_7ContextILNS_8lite_api10TargetTypeE4EEE[_ZN6paddle4lite3arm4math25conv_depthwise_5x5s2_int8IfEEvPT_PKaS7_PKfS9_biPfiiiiiiiiPNS0_7ContextILNS_8lite_api10TargetTypeE4EEE]+0x2d4): undefined reference to `__kmpc_fork_call'
../../3rdparty/inference_lite_lib.android.armv8.clang.c++_static.with_extra/cxx/lib/libpaddle_api_light_bundled.a(conv5x5s2_depthwise_int8.cc.o): In function `.omp_outlined..1':
conv5x5s2_depthwise_int8.cc:(.text..omp_outlined..1+0xb0): undefined reference to `__kmpc_for_static_init_4'
conv5x5s2_depthwise_int8.cc:(.text..omp_outlined..1+0x170): undefined reference to `omp_get_thread_num'
conv5x5s2_depthwise_int8.cc:(.text..omp_outlined..1+0x6b4): undefined reference to `__kmpc_for_static_fini'
clang++: error: linker command failed with exit code 1 (use -v to see invocation)

Have you run into this before?

@hong19860320
Collaborator

It looks like the OpenMP symbols can't be resolved. You need to rebuild a Paddle-Lite library without OpenMP.

lite_option(LITE_WITH_OPENMP "Enable OpenMP in lite framework" ON)

Change ON to OFF and rebuild.
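The undefined __kmpc_* and omp_* symbols come from the OpenMP runtime, so a library built with OpenMP disabled no longer references them. A sketch of the change, assuming the lite_option line lives in Paddle-Lite's top-level CMakeLists.txt (location is an assumption):

# In Paddle-Lite's CMakeLists.txt, turn OpenMP off before rebuilding:
lite_option(LITE_WITH_OPENMP "Enable OpenMP in lite framework" OFF)

After changing it, rebuild the Android static library package and relink libpaddle_api_light_bundled.a in your project.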
