
Support BentoML 1.2 in Yatai #505

Open
parano opened this issue Feb 26, 2024 · 1 comment
parano commented Feb 26, 2024

You may have seen our latest announcement on BentoML 1.2 release here: https://bentoml.slack.com/archives/CK8PQU2JY/p1708444803664399

We are excited to share that we have released BentoML v1.2, the biggest release since the launch of v1.0. This release incorporates a year of learning and feedback from our community. We invite you to read our release blog post for a comprehensive overview of the new features and the motivations behind their development.

Here are a few key points to note before we delve into the new features:

  • v1.2 ensures complete backward compatibility, meaning that Bentos built with v1.1 will continue to function seamlessly with this release.
  • We remain committed to supporting v1.1. Critical bug fixes and security updates will be backported to the v1.1 branch.
  • BentoML documentation has been updated with examples and guides for v1.2. More guides are being added every week.
  • BentoCloud is fully equipped to handle deployments from both v1.1 and v1.2 releases of BentoML.

Introduced a simplified service SDK to empower developers with greater control and flexibility.

  • Simplified the service and API interfaces into Python classes, allowing developers to add custom logic and use third-party libraries with ease.
  • Introduced @bentoml.service and @bentoml.api decorators to customize the behaviors of services and APIs.
  • Moved configuration from YAML files to the service decorator @bentoml.service next to the class definition.
  • See the vLLM example demonstrating the flexibility of the service API: it initializes a vLLM AsyncEngine in the service constructor and runs inference with continuous batching in the service API.
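The decorator-driven configuration style above can be sketched in plain Python. This is a toy stand-in that illustrates the pattern, not the real bentoml package: the decorator names and config keys mirror the 1.2 SDK, but the implementations here are hypothetical.

```python
def service(**config):
    """Toy stand-in for @bentoml.service: attaches configuration to the class."""
    def wrap(cls):
        cls.bento_config = config
        return cls
    return wrap

def api(fn):
    """Toy stand-in for @bentoml.api: marks a method as an HTTP endpoint."""
    fn.is_api = True
    return fn

@service(resources={"cpu": "2"}, traffic={"timeout": 30})
class Summarizer:
    def __init__(self):
        # Custom setup logic (e.g. loading a model) lives in the constructor.
        self.prefix = "summary: "

    @api
    def summarize(self, text: str) -> str:
        return self.prefix + text[:20]
```

The key design point is that configuration sits next to the class definition instead of in a separate YAML file, so `Summarizer.bento_config["traffic"]["timeout"]` is readable directly from the code.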

Revamped IO descriptors with more familiar input and output types.

  • Enable use of Pythonic types directly, without the need for additional IO descriptor definitions or decorations.
  • Integrated with Pydantic to leverage its robust validation capabilities and wide array of supported types.
  • Expanded support to ML and Generative AI specific IO types.

Updated the model saving and loading API to be more generic, enabling integration with more ML frameworks.

  • Allow flexible saving and loading of models using the bentoml.models.create API instead of framework-specific APIs, e.g. bentoml.pytorch.save_model and bentoml.tensorflow.save_model.
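The Pydantic integration means request payloads are validated against ordinary Python type annotations. A minimal sketch of the validation layer this builds on, using pydantic directly rather than any BentoML API (the `GenerateInput` schema is a hypothetical example):

```python
from pydantic import BaseModel, ValidationError

# Hypothetical request schema: BentoML 1.2 API signatures can use Pydantic
# models (or plain annotated types) and inputs are validated against them.
class GenerateInput(BaseModel):
    prompt: str
    max_tokens: int = 64

ok = GenerateInput(prompt="hello")  # missing fields fall back to defaults
print(ok.max_tokens)                # 64

try:
    GenerateInput(prompt="hi", max_tokens="oops")  # wrong type is rejected
    rejected = False
except ValidationError:
    rejected = True
```

Because validation happens at the type level, a malformed request is rejected before the service body runs.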

Streamlined the deployment workflow to allow more rapid development iterations and a faster time to production.

  • Enabled direct deployment to production through CLI and Python API from Git projects.
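Assuming the 1.2 CLI, deploying from a project directory looks roughly like the following; exact command names and flags may differ by version, and the deployment name is an illustrative placeholder:

```shell
# Build the bento from the project directory (reads the service definition)
bentoml build

# Deploy directly to BentoCloud from the project
bentoml deploy . -n my-deployment
```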

Improved API development experience with a generated web UI and a rich Python client.

  • All bentos are now accompanied by a custom-generated UI in the BentoCloud Playground, tailored to their API definitions.
  • BentoClient offers a Pythonic way to invoke the service endpoint, allowing parameters to be supplied in native Python format; the client efficiently handles the necessary serialization while ensuring compatibility and performance.
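Calling a running service with the Python client looks roughly like this. SyncHTTPClient is the 1.2 client class, while the `summarize` endpoint name and the URL are assumptions for illustration; the import happens lazily because the call requires both the bentoml package and a live service:

```python
def call_summarize(url: str, text: str) -> str:
    """Invoke a (hypothetical) `summarize` endpoint on a running BentoML 1.2 service."""
    import bentoml  # lazy import: needs the bentoml package installed at call time

    client = bentoml.SyncHTTPClient(url)
    try:
        # Arguments are passed as native Python values; the client serializes them.
        return client.summarize(text=text)
    finally:
        client.close()

if __name__ == "__main__":
    print(call_summarize("http://localhost:3000", "a long article ..."))
```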

We've learned that the best way to showcase what BentoML can do is not through dry, conceptual documentation but through real-world examples. Check out our current list of examples, and we'll continue to publish new ones to the gallery as exciting new models are released.

Thank you for your continued support!

One of the remaining items for the BentoML 1.2 release is supporting BentoML 1.2 in the Yatai BentoDeployment operator. We plan to have full support for 1.2 in Yatai 2.0. This issue tracks BentoML 1.2 support in the current version of Yatai.


hutm commented Mar 15, 2024

@parano, what is the recommended way to deploy models to Yatai, given that BentoML 1.2 is not supported yet? Should we stick with BentoML 1.1?
