
BentoML roadmap overview #178

Open
parano opened this issue Jun 21, 2019 · 6 comments

Comments

@parano
Member

commented Jun 21, 2019

This is a living thread giving an overview of planned BentoML features on our roadmap. We would love to hear your feedback. Join the discussion in our Slack channel here: http://bit.ly/2N5IpbB

@parano parano added the roadmap label Jun 21, 2019

@parano parano self-assigned this Jun 21, 2019

@parano parano pinned this issue Jun 21, 2019


@parano

Member Author

commented Jul 23, 2019

Heroku deployment

Similar to the SageMaker and Serverless deployments BentoML currently provides, add support for the Heroku platform.

@parano

Member Author

commented Jul 23, 2019

Multi-model deployment workflow

Add support for easily creating and configuring ML services that serve multiple machine learning models.
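
To illustrate the idea, here is a minimal stdlib-only sketch of one service object routing requests to several bundled models. All names here (`SentimentModel`, `TopicModel`, `MultiModelService`) are hypothetical and do not reflect the eventual BentoML API.

```python
# Sketch: one service bundles several models and routes by model name.
# These classes are illustrative placeholders, not BentoML APIs.

class SentimentModel:
    def predict(self, text: str) -> str:
        return "positive" if "good" in text else "negative"

class TopicModel:
    def predict(self, text: str) -> str:
        return "sports" if "game" in text else "general"

class MultiModelService:
    """Holds multiple named models behind a single predict entry point."""
    def __init__(self, **models):
        self._models = models

    def predict(self, model_name: str, text: str) -> str:
        return self._models[model_name].predict(text)

service = MultiModelService(sentiment=SentimentModel(), topic=TopicModel())
print(service.predict("sentiment", "a good game"))  # positive
print(service.predict("topic", "a good game"))      # sports
```

The key design question for this roadmap item is exactly this routing layer: how one packaged service exposes and configures several models.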

@parano

Member Author

commented Jul 23, 2019

Kubeflow integration

Add support for deploying models produced by the Kubeflow project's training workflows.

@parano

Member Author

commented Jul 23, 2019

Deployment Manager

A stateful server that tracks all your desired deployment states, deployment history, and event logs. It lets users interact via CLI, API, and web UI, and talks to cloud platforms or a Kubernetes cluster to schedule deployments.

  • Metaflow-style decorator for specifying k8s/cloud resources
  • Corresponding Kubernetes CRD/controller and cloud resource manager implementation
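
The Metaflow-style decorator mentioned above could look roughly like the following sketch. The decorator name and attributes (`cpu`, `memory`, `gpu`) are assumptions for illustration, not a committed API.

```python
# Hypothetical Metaflow-style decorator that attaches k8s/cloud resource
# requests to a deployment step, for a controller to read later.
import functools

def resources(cpu: int = 1, memory: str = "512Mi", gpu: int = 0):
    """Record resource requests as metadata on the decorated function."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            return func(*args, **kwargs)
        wrapper.resources = {"cpu": cpu, "memory": memory, "gpu": gpu}
        return wrapper
    return decorator

@resources(cpu=2, memory="4Gi", gpu=1)
def serve_model():
    return "serving"

print(serve_model.resources)  # {'cpu': 2, 'memory': '4Gi', 'gpu': 1}
```

A Kubernetes controller (via the CRD mentioned above) or a cloud resource manager would then translate this metadata into actual pod specs or instance types.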
@parano

Member Author

commented Jul 23, 2019

GPU support

Currently the BentoML-generated Docker images are not compatible with GPU environments; we are adding support for generating images that can utilize GPUs when serving a model.
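
Besides the base image itself, a GPU-enabled image needs the serving code to pick a device at runtime. A minimal sketch, assuming a PyTorch-backed model and falling back to CPU when `torch` is absent:

```python
# Runtime device selection a GPU-enabled serving image could use.
# The torch dependency is an assumption; degrade gracefully without it.
def pick_device() -> str:
    try:
        import torch
        return "cuda" if torch.cuda.is_available() else "cpu"
    except ImportError:
        return "cpu"

print(pick_device())
```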

@parano

Member Author

commented Jul 23, 2019

TF-serving integration

Use TF Serving as the TensorFlow model backend: the BentoML API server will handle the REST API, request parsing, and preprocessing, then send a gRPC request to TF Serving for inference, with or without GPU.
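
The request flow described above can be sketched with the stdlib only; `infer_via_grpc` is a stub standing in for the real gRPC `Predict` call to TF Serving, and the payload shape is an assumption:

```python
# Sketch of the flow: REST body -> parse -> preprocess -> TF Serving backend.
import json

def preprocess(text: str) -> list:
    # Trivial stand-in for real feature preprocessing.
    return text.lower().split()

def infer_via_grpc(tokens: list) -> dict:
    # Placeholder for a gRPC PredictionService.Predict call to TF Serving.
    return {"predictions": [len(tokens)]}

def handle_request(body: bytes) -> dict:
    payload = json.loads(body)            # request parsing
    tokens = preprocess(payload["text"])  # preprocessing
    return infer_via_grpc(tokens)         # inference backend

print(handle_request(b'{"text": "Hello TF Serving"}'))  # {'predictions': [3]}
```

The split keeps HTTP concerns and preprocessing in the BentoML API server while the model backend stays a swappable gRPC target.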
