OCR works fine on CPU but gives a strange result on GPU #11579
Unanswered · GasparVardanyan asked this question in Q&A · Replies: 0 comments
I've integrated the deploy/cpp_infer example into my project.
I use the precompiled avx_mkl_cuda10.1_cudnn7.6.5_avx_mkl_no_trt library (https://paddleinference.paddlepaddle.org.cn/user_guides/download_lib.html) on Windows 11.
I've set all the needed flags and run OCR like this:
I use this image as input: https://i.imgur.com/FlTDRoq.png
When I run it on CPU, I get this:
Full output with log: http://sprunge.us/WJmhCR
But when I run it on GPU (I simply changed `DEFINE_bool(use_gpu, false, "Infering with GPU or CPU.");` to `true` in args.cpp), I get this:
Full output with log: http://sprunge.us/HwI2NU
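For reference, the edit described above amounts to flipping the default of a gflags boolean in the example's args.cpp (the exact file path and flag text are taken from the question; this is a sketch of the change, not a verified diff):

```cpp
// args.cpp in the deploy/cpp_infer example -- the GPU switch is a
// gflags boolean definition.
// Original default (CPU inference):
//   DEFINE_bool(use_gpu, false, "Infering with GPU or CPU.");
// Changed so the demo runs inference on the GPU:
DEFINE_bool(use_gpu, true, "Infering with GPU or CPU.");
```

Since the example parses its flags with gflags, the same switch can usually be flipped at run time by passing `--use_gpu=true` on the command line, with no source edit or rebuild needed.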
Please help me understand what I'm missing to get PaddleOCR working on the GPU.