Session::run, AsyncSession::submit should check input shapes #31

Closed
hyunsik opened this issue Jun 18, 2021 · 2 comments
hyunsik commented Jun 18, 2021

See the title. Currently, the inference API relies on the native library's tensor size check. For better error handling, the Python runtime should check input shapes itself before submitting tensors to the native runtime.
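
For reference, the kind of check requested here could look roughly like the sketch below. This is a minimal illustration, assuming a hypothetical list of expected shapes exposed by the session; the attribute and function names are placeholders, not the actual furiosa-sdk API.

```python
# Minimal sketch of a Python-side input shape check, run before handing
# tensors to the native runtime. `expected_shapes` is a hypothetical list of
# shape tuples; the real session object may expose this differently.
import numpy as np


def validate_inputs(expected_shapes, inputs):
    """Raise a descriptive error instead of deferring to the native library."""
    if len(inputs) != len(expected_shapes):
        raise ValueError(
            f"expected {len(expected_shapes)} input tensors, got {len(inputs)}"
        )
    for idx, (expected, tensor) in enumerate(zip(expected_shapes, inputs)):
        actual = np.asarray(tensor).shape
        if tuple(actual) != tuple(expected):
            raise ValueError(
                f"input #{idx} has shape {actual}, expected {tuple(expected)}"
            )


# Hypothetical usage before Session::run / AsyncSession::submit:
# validate_inputs(session.input_shapes, inputs)
# session.run(inputs)
```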

stale bot commented Nov 30, 2021

This issue has been automatically marked as stale because it has not had recent activity.

The stale bot added the stale label on Nov 30, 2021.

hyunsik commented Dec 14, 2021

This was fixed.

hyunsik closed this as completed Dec 14, 2021
hyunsik pushed a commit that referenced this issue Mar 14, 2022
31: Introduce furiosa-serving r=hyunsik a=ileixe

Introduce furiosa-serving to provide better APIs for creating an
inference server running on the Nux runtime. This package is a lightweight
wrapper around FastAPI with a small set of functionalities for interacting
with furiosa-model-server.

I expect that, with this package, customers can implement API servers in the
FastAPI way with minimal knowledge of Nux implementation details.

depends: https://github.com/furiosa-ai/furiosa-sdk-private/pull/69


Co-authored-by: Youseok Yang <yan@furiosa.ai>
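
To illustrate the usage pattern the commit message describes (a FastAPI app wrapping an inference session), here is a hedged sketch. The `Session` class, endpoint path, and payload handling are assumptions made for illustration; they are not furiosa-serving's actual API.

```python
# Illustrative only: a plain FastAPI endpoint wrapping a generic inference
# session, in the spirit of what furiosa-serving packages up. The `Session`
# placeholder stands in for a real runtime binding.
import numpy as np
from fastapi import FastAPI, UploadFile

app = FastAPI()


class Session:
    """Placeholder session; a real app would bind to the runtime here."""

    def run(self, tensor: np.ndarray) -> np.ndarray:
        return tensor  # echo back, just for the sketch


session = Session()


@app.post("/infer")
async def infer(file: UploadFile):
    # Decode the uploaded payload into a tensor and run inference on it.
    data = np.frombuffer(await file.read(), dtype=np.uint8)
    output = session.run(data)
    return {"output_shape": list(output.shape)}
```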