System Info
Used the Docker image nvcr.io/nvidia/tritonserver:24.10-trtllm-python-py3 to build the engines and start the server (tensorrtllm_backend v0.15.0).
Who can help?
@byshiue @schetlur-nv
Information
Tasks
Reproduction
/
Expected behavior
All models load and the server reports them as READY.
actual behavior
After starting the server with launch_triton_server.py, I encountered the following error:
UNAVAILABLE: Not found: unable to load shared library: /opt/tritonserver/backends/tensorrtllm/libtriton_tensorrtllm.so: undefined symbol: _ZNK12tensorrt_llm8executor8Response11getErrorMsgB5cxx11Ev
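An undefined-symbol error like this usually means the backend library was built against a different version of the TensorRT-LLM libraries than the one present in the container at runtime. A minimal diagnostic sketch (the library path is taken from the error message above; whether `c++filt` and `nm` are available inside the container is an assumption):

```shell
# Demangle the missing symbol to see which C++ method the loader could not resolve.
SYM=_ZNK12tensorrt_llm8executor8Response11getErrorMsgB5cxx11Ev
c++filt "$SYM"
# expected: tensorrt_llm::executor::Response::getErrorMsg[abi:cxx11]() const

# Check whether the TensorRT-LLM runtime library in the container exports it.
LIB=/opt/tritonserver/backends/tensorrtllm/libtriton_tensorrtllm.so
if [ -f "$LIB" ]; then
  # List the shared libraries the backend links against, then search their
  # dynamic symbol tables for the method.
  ldd "$LIB"
  nm -D "$LIB" | grep getErrorMsg || echo "symbol not found in $LIB"
fi
```

If the symbol is absent from the runtime libraries, the fix is typically to make the TensorRT-LLM version used to build the engines match the version bundled with the Triton container.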
additional notes
/