OpenCSGs/llm-inference

llm-inference is a platform for publishing and managing LLM inference, providing a wide range of out-of-the-box features for model deployment, such as a UI, a RESTful API, auto-scaling, compute resource management, monitoring, and more.

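The platform exposes a RESTful API for interacting with deployed models. As a rough illustration of what a client call against such an API could look like, here is a minimal Python sketch; the base URL, route, and request fields are assumptions for illustration only, not the project's documented interface.

```python
# Minimal sketch of querying a model served behind a RESTful inference API.
# NOTE: the address, route, and JSON fields below are assumptions for
# illustration; consult the llm-inference documentation for the actual
# endpoint and request schema.
import requests

BASE_URL = "http://localhost:8000"                        # assumed serving address
ENDPOINT = f"{BASE_URL}/api/v1/models/my-model/generate"  # hypothetical route

payload = {
    "prompt": "Summarize what auto-scaling means for model serving.",
    "max_tokens": 128,     # hypothetical generation parameters
    "temperature": 0.7,
}

response = requests.post(ENDPOINT, json=payload, timeout=60)
response.raise_for_status()
print(response.json())
```

In a setup like this, auto-scaling and resource management happen behind the endpoint, so the client-side call stays the same whether one replica or many are serving the model.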