See the title. Currently, the inference API relies on the native library's tensor-size check. For better error handling, the Python runtime should validate input shapes itself and report a descriptive error.
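A minimal sketch of the kind of Python-side check proposed here. The function name and shape representation are assumptions for illustration, not the actual Nux/runtime API; `None` marks a dynamic dimension.

```python
# Hypothetical input-shape validation on the Python side, so shape
# mismatches surface as clear Python errors before reaching the
# native library. Names here are illustrative, not the real API.

def validate_input_shapes(expected_shapes, actual_shapes):
    """Raise ValueError with a descriptive message on any mismatch.

    expected_shapes: list of tuples from the model's metadata,
        where None means a dynamic dimension.
    actual_shapes: list of tuples describing the submitted tensors.
    """
    if len(actual_shapes) != len(expected_shapes):
        raise ValueError(
            f"expected {len(expected_shapes)} input tensors, "
            f"got {len(actual_shapes)}"
        )
    for i, (expected, actual) in enumerate(zip(expected_shapes, actual_shapes)):
        rank_mismatch = len(actual) != len(expected)
        dim_mismatch = any(
            e is not None and e != a for e, a in zip(expected, actual)
        )
        if rank_mismatch or dim_mismatch:
            raise ValueError(
                f"input #{i}: expected shape {expected}, got {actual}"
            )
```

With a check like this, a caller submitting a `(1, 3, 224, 225)` tensor against an expected `(1, 3, 224, 224)` gets a `ValueError` naming the offending input, instead of an opaque size error from the native layer.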
31: Introduce furiosa-serving r=hyunsik a=ileixe
Introduce furiosa-serving to provide better APIs for building an
inference server running on the Nux runtime. This package is a lightweight
wrapper around FastAPI with a small set of functionalities for interacting
with furiosa-model-server.
With this package, I expect customers can implement API servers in the
FastAPI way with minimal knowledge of Nux implementation details.
depends: https://github.com/furiosa-ai/furiosa-sdk-private/pull/69
Co-authored-by: Youseok Yang <yan@furiosa.ai>
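To illustrate the wrapper pattern described above, here is a minimal sketch of a serving layer that owns a runtime session and exposes an inference call with shape checking. All class and method names (`FakeSession`, `ServingModel`, `infer`) are hypothetical stand-ins, not furiosa-serving's actual API.

```python
# Sketch of a thin serving wrapper over a runtime session.
# FakeSession stands in for a Nux session; in the real package the
# session would come from the Furiosa SDK.

class FakeSession:
    """Stand-in for a runtime session (illustrative only)."""
    input_shape = (1, 4)  # assumed: one row of four values

    def run(self, tensor):
        # Pretend inference: sum each row of the input.
        return [sum(row) for row in tensor]


class ServingModel:
    """Wraps a session and validates input shape before running."""

    def __init__(self, session):
        self.session = session

    def infer(self, tensor):
        shape = (len(tensor), len(tensor[0]) if tensor else 0)
        if shape != self.session.input_shape:
            raise ValueError(
                f"expected shape {self.session.input_shape}, got {shape}"
            )
        return self.session.run(tensor)
```

A FastAPI handler would then call `ServingModel.infer` inside a path operation, translating the `ValueError` into an HTTP 4xx response; that HTTP layer is omitted here to keep the sketch dependency-free.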