Replies: 1 comment
-
First, Triton can serve PaddleOCR as a whole (det -> cls -> rec). Triton supports several backends and platforms, and you can install your Python packages inside the Triton Docker image. If you want to use PaddlePaddle to load your model, you can also use the Triton-PaddlePaddle Docker image (which is not yet an official release).
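For reference, a sketch of what "as a whole" can look like on Triton: an ensemble model whose `config.pbtxt` chains the three stages declaratively. All model and tensor names below (`det`, `crop_cls`, `rec`, `image`, `text`, ...) are placeholder assumptions, not PaddleOCR's actual names, and in practice a Python backend step is usually needed between detection and recognition to crop the detected boxes:

```
# Hypothetical ensemble config.pbtxt chaining det -> cls -> rec.
# Every model and tensor name here is a placeholder for illustration.
name: "paddleocr_pipeline"
platform: "ensemble"
max_batch_size: 0
input [
  { name: "image", data_type: TYPE_UINT8, dims: [ -1, -1, 3 ] }
]
output [
  { name: "text", data_type: TYPE_STRING, dims: [ -1 ] }
]
ensemble_scheduling {
  step [
    {
      model_name: "det"        # text detection model
      model_version: -1
      input_map  { key: "x",     value: "image" }
      output_map { key: "boxes", value: "det_boxes" }
    },
    {
      model_name: "crop_cls"   # Python backend: crop boxes + angle classification
      model_version: -1
      input_map  { key: "image", value: "image" }
      input_map  { key: "boxes", value: "det_boxes" }
      output_map { key: "crops", value: "rec_input" }
    },
    {
      model_name: "rec"        # text recognition model
      model_version: -1
      input_map  { key: "x",    value: "rec_input" }
      output_map { key: "text", value: "text" }
    }
  ]
}
```

With an ensemble like this, a client sends one request to `paddleocr_pipeline` and Triton routes the intermediate tensors between the three models server-side, instead of the client stitching three calls together.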
-
Hey,
I'm struggling with how to deploy PaddleOCR with Triton serving.
I understand I could use the Triton Server Paddle backend. However, if I understand it correctly, it cannot serve PaddleOCR as a whole (det -> cls -> rec). I could certainly upload a single model, say the detection model, to a Triton server with the Paddle backend, but that would mean serving the three models (det, cls, rec) separately and connecting them together manually. I also thought about using the Triton Server Python backend and packaging a conda environment with PaddleOCR into the server; however, the Triton server fails to unpack the environment as long as the PaddlePaddle package is in it.
Is there any other way to deploy PaddleOCR as a whole using Triton?
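On the conda-environment point above: the Triton Python backend can load a `conda-pack` archive declared via the `EXECUTION_ENV_PATH` model parameter, so the environment is unpacked by the backend rather than by hand. A minimal sketch, assuming a packed environment named `paddleocr_env.tar.gz` placed in the model directory (both the model name and the archive name are assumptions):

```
# Hypothetical config.pbtxt fragment for a Python backend model that
# bundles PaddleOCR in a conda-pack archive; names are placeholders.
name: "paddleocr_python"
backend: "python"
parameters: {
  key: "EXECUTION_ENV_PATH",
  value: { string_value: "$$TRITON_MODEL_DIRECTORY/paddleocr_env.tar.gz" }
}
```

If unpacking still fails with PaddlePaddle in the environment, the usual alternative is to skip `conda-pack` entirely and `pip install` PaddleOCR directly into a custom Triton image, as the reply above suggests.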