
Update readme.md and docs (#468)

parano committed Jan 9, 2020
1 parent f7aed79 commit 513a88c196dd91c7f94bfffe0791c44f583ea5f3
Showing with 71 additions and 91 deletions.
  1. +18 −4 DEVELOPMENT.md
  2. +53 −87 README.md
@@ -6,7 +6,7 @@ $ git clone https://github.com/bentoml/BentoML.git
$ cd BentoML
```

Ensure you have Python and pip installed; BentoML supports Python _3.6_ and _3.7_:
```bash
$ python --version
```
@@ -54,7 +54,7 @@ $ tox

If you want to run tests under conda for specific version, use `-e` option:
```bash
$ tox -e py37
# or
$ tox -e py36
```
@@ -106,18 +106,32 @@ $ pip install -e .[dev]

To build the documentation locally:
```bash
$ ./docs/build.sh
```

Modify the `*.rst` files inside the `docs` folder to update content. To
view your changes, run the following command:

```bash
$ python -m http.server --directory ./docs/build/html
```

And go to your browser at `http://localhost:8000`

If you are developing under `macOS`, there is also a script that watches for docs
file changes, automatically rebuilds the docs HTML files, and refreshes the browser
tab to show the change:

Make sure you have the `fswatch` command installed:
```
brew install fswatch
```

Run the `watch.sh` script to start watching docs changes:
```
$ ./docs/watch.sh
```
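The `watch.sh` script relies on `fswatch`, which is easiest to install on macOS. On other platforms, the same rebuild-on-change behavior can be approximated with a small polling loop. This is an illustrative stdlib-only sketch, not part of the BentoML repo; it assumes the `./docs/build.sh` build command from the step above:

```python
import os
import subprocess
import time

def snapshot(root):
    """Map every .rst file under root to its last-modified time."""
    mtimes = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if name.endswith(".rst"):
                path = os.path.join(dirpath, name)
                mtimes[path] = os.path.getmtime(path)
    return mtimes

def watch(root="docs", build_cmd=("./docs/build.sh",), interval=1.0):
    """Rebuild the docs whenever any .rst file under `root` changes."""
    last = snapshot(root)
    while True:
        time.sleep(interval)
        current = snapshot(root)
        if current != last:
            last = current
            subprocess.run(build_cmd, check=False)
```

Unlike `fswatch`, this polls rather than subscribing to filesystem events, so there is up to `interval` seconds of latency, and it does not refresh the browser tab.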


## Creating Pull Request on Github

@@ -7,27 +7,31 @@

> From ML model to production API endpoint with a few lines of code

[![BentoML](https://raw.githubusercontent.com/bentoml/BentoML/master/docs/source/_static/img/bentoml.png)](https://github.com/bentoml/BentoML)



BentoML makes it easy to __serve and deploy machine learning models__ in the cloud.

It is an open source framework for building cloud-native model serving services.
BentoML supports most popular ML training frameworks and deployment platforms, including
major cloud providers and Docker/Kubernetes.

👉 [Join BentoML Slack community](https://join.slack.com/t/bentoml/shared_invite/enQtNjcyMTY3MjE4NTgzLTU3ZDc1MWM5MzQxMWQxMzJiNTc1MTJmMzYzMTYwMjQ0OGEwNDFmZDkzYWQxNzgxYWNhNjAxZjk4MzI4OGY1Yjg)
to hear about the latest development updates.

---

- [Getting Started](https://github.com/bentoml/BentoML#getting-started)
- [Documentation](http://bentoml.readthedocs.io)
- [Gallery](https://github.com/bentoml/gallery)
- [Contributing](https://github.com/bentoml/BentoML#contributing)
- [Releases](https://github.com/bentoml/BentoML#releases)
- [License](https://github.com/bentoml/BentoML/blob/master/LICENSE)
- [Blog](https://medium.com/bentoml)


## Getting Started

Installing BentoML with `pip`:
```bash
pip install bentoml
```
@@ -99,32 +103,18 @@ used to build a docker image for deployment:
docker build -t my_api_server {saved_path}
```

The saved BentoService bundle can also be loaded directly from the command line:
```bash
bentoml predict {saved_path} --input='[[5.1, 3.5, 1.4, 0.2]]'
# alternatively:
bentoml predict {saved_path} --input='./iris_test_data.csv'
```

You can also deploy your BentoService directly to cloud services such as AWS Lambda with `bentoml`, and
get back an API endpoint hosting your model that is ready for production use:
```bash
bentoml deployment create my-iris-classifier --bento IrisClassifier:{VERSION} --platform=aws-lambda
```

The saved bundle is pip-installable and can be directly distributed as a PyPI package:
```bash
pip install {saved_path}
```
```python
# Your BentoService class name becomes the package name
import IrisClassifier

installed_svc = IrisClassifier.load()
installed_svc.predict([[5.1, 3.5, 1.4, 0.2]])
```
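The `--input` flag in the predict commands above accepts either an inline JSON array or a path to a CSV file. A stdlib-only sketch of how such a flag could be dispatched (illustrative only; the actual handler logic lives inside BentoML, and `parse_predict_input` is a hypothetical helper name):

```python
import csv
import json
import os

def parse_predict_input(value):
    """Return input rows as a list of lists of floats.

    Accepts either an inline JSON array like '[[5.1, 3.5, 1.4, 0.2]]'
    or a path to a CSV file of numeric rows.
    """
    if os.path.isfile(value):
        # Treat the value as a CSV file path
        with open(value, newline="") as f:
            return [[float(cell) for cell in row] for row in csv.reader(f) if row]
    # Otherwise assume inline JSON
    return json.loads(value)
```

For example, `parse_predict_input('[[5.1, 3.5, 1.4, 0.2]]')` returns `[[5.1, 3.5, 1.4, 0.2]]`.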
Try out the full quickstart notebook: [Source](https://github.com/bentoml/BentoML/blob/master/guides/quick-start/bentoml-quick-start-guide.ipynb), [Google Colab](https://colab.research.google.com/github/bentoml/BentoML/blob/master/guides/quick-start/bentoml-quick-start-guide.ipynb), [nbviewer](https://nbviewer.jupyter.org/github/bentoml/bentoml/blob/master/guides/quick-start/bentoml-quick-start-guide.ipynb)


To learn more, try out our 5-minute Quick Start notebook using BentoML to turn a trained sklearn model into a containerized REST API server, and then deploy it to AWS Lambda: [Download](https://github.com/bentoml/BentoML/blob/master/guides/quick-start/bentoml-quick-start-guide.ipynb), [Google Colab](https://colab.research.google.com/github/bentoml/BentoML/blob/master/guides/quick-start/bentoml-quick-start-guide.ipynb), [nbviewer](https://nbviewer.jupyter.org/github/bentoml/bentoml/blob/master/guides/quick-start/bentoml-quick-start-guide.ipynb)
## Documentation

Full documentation and API references can be found at [bentoml.readthedocs.io](http://bentoml.readthedocs.io)


## Examples
@@ -152,7 +142,7 @@ To learn more, try out our 5-mins Quick Start notebook using BentoML to turn a t
* Text Classification - [Google Colab](https://colab.research.google.com/github/bentoml/gallery/blob/master/keras/text-classification/keras-text-classification.ipynb) | [nbviewer](https://nbviewer.jupyter.org/github/bentoml/gallery/blob/master/keras/text-classification/keras-text-classification.ipynb) | [source](https://github.com/bentoml/gallery/blob/master/keras/text-classification/keras-text-classification.ipynb)
* Toxic Comment Classifier - [Google Colab](https://colab.research.google.com/github/bentoml/gallery/blob/master/keras/toxic-comment-classification/keras-toxic-comment-classification.ipynb) | [nbviewer](https://nbviewer.jupyter.org/github/bentoml/gallery/blob/master/keras/toxic-comment-classification/keras-toxic-comment-classification.ipynb) | [source](https://github.com/bentoml/gallery/blob/master/keras/toxic-comment-classification/keras-toxic-comment-classification.ipynb)

#### Tensorflow 2.0

* tf.Function model - [Google Colab](https://colab.research.google.com/github/bentoml/gallery/blob/master/tensorflow/echo/tensorflow-echo.ipynb) | [nbviewer](https://nbviewer.jupyter.org/github/bentoml/gallery/blob/master/tensorflow/echo/tensorflow-echo.ipynb) | [source](https://github.com/bentoml/gallery/blob/master/tensorflow/echo/tensorflow-echo.ipynb)

@@ -178,83 +168,59 @@ To learn more, try out our 5-mins Quick Start notebook using BentoML to turn a t

### Deployment guides:

* Automated end-to-end deployment workflow with BentoML
  - [BentoML AWS Lambda Deployment Guide](https://github.com/bentoml/BentoML/blob/master/guides/deployment/deploy-with-serverless)
  - [BentoML AWS SageMaker Deployment Guide](https://github.com/bentoml/BentoML/blob/master/guides/deployment/deploy-with-sagemaker)

* Clipper Deployment
  - [BentoML Clipper.ai Deployment Guide](https://github.com/bentoml/BentoML/blob/master/guides/deployment/deploy-with-clipper/bentoml-clipper-deployment-guide.ipynb)

* Manual Deployment
  - [BentoML AWS ECS Deployment Guide](https://github.com/bentoml/BentoML/tree/master/guides/deployment/deploy-with-aws-ecs)
  - [BentoML Google Cloud Run Deployment Guide](https://github.com/bentoml/BentoML/blob/master/guides/deployment/deploy-with-google-cloud-run/deploy-with-google-cloud-run.ipynb)
  - [BentoML Kubernetes Deployment Guide](https://github.com/bentoml/BentoML/tree/master/guides/deployment/deploy-with-kubernetes)


## Feature Highlights


* __Multiple Distribution Format__ - Easily package your Machine Learning models
and preprocessing code into a format that works best with your inference scenario:
* Docker Image - deploy as containers running REST API Server
* PyPI Package - integrate into your python applications seamlessly
* CLI tool - put your model into Airflow DAG or CI/CD pipeline
* Spark UDF - run batch serving on a large dataset with Spark
* Serverless Function - host your model on serverless platforms such as AWS Lambda

* __Multiple Framework Support__ - BentoML supports a wide range of ML frameworks
out-of-the-box including [Tensorflow](https://github.com/tensorflow/tensorflow/),
[PyTorch](https://github.com/pytorch/pytorch),
[Keras](https://keras.io/),
[Scikit-Learn](https://github.com/scikit-learn/scikit-learn),
[xgboost](https://github.com/dmlc/xgboost),
[H2O](https://github.com/h2oai/h2o-3),
[FastAI](https://github.com/fastai/fastai) and can be easily extended to work
with new or custom frameworks

* __Deploy Anywhere__ - BentoService bundle can be easily deployed with
platforms such as [Docker](https://www.docker.com/),
[Kubernetes](https://kubernetes.io/),
[Serverless](https://github.com/serverless/serverless),
[Airflow](https://airflow.apache.org) and [Clipper](http://clipper.ai),
on cloud platforms including AWS, Google Cloud, and Azure

* __Custom Runtime Backend__ - Easily integrate your python pre-processing code with
high-performance deep learning runtime backend, such as
[tensorflow-serving](https://github.com/tensorflow/serving)

* __Workflow Designed For Teams__ - The YataiService component in BentoML provides
Web UI and APIs for managing and deploying all the models and prediction services
your team has created or deployed, in a centralized service.
## Releases

BentoML is under active development and is evolving rapidly.
Currently it is a Beta release, __we may change APIs in future releases__.

Read more about the latest features and changes in BentoML on the [releases page](https://github.com/bentoml/BentoML/releases).


## Usage Tracking

By default, BentoML collects anonymous usage data using [Amplitude](https://amplitude.com).
It only collects BentoML's own actions and parameters; no user or model data is collected.
[Here is the code that does it](https://github.com/bentoml/BentoML/blob/master/bentoml/utils/usage_stats.py).

This helps the BentoML team understand how the community is using the tool and
what to build next. You can easily opt out of usage tracking by running the following
command:

```bash
# From terminal:
bentoml config set usage_tracking=false
```
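The `bentoml config set` command persists the setting locally. As an illustration of the idea (not BentoML's actual implementation), toggling a flag in an INI-style config file with a `[core]` section, both assumptions for this sketch, can be done with `configparser`:

```python
import configparser

def set_config_flag(path, section, key, value):
    """Write `key = value` under [section] in an INI-style config file."""
    config = configparser.ConfigParser()
    config.read(path)  # a missing file simply yields an empty config
    if not config.has_section(section):
        config.add_section(section)
    config.set(section, key, value)
    with open(path, "w") as f:
        config.write(f)

def get_config_flag(path, section, key, fallback=None):
    """Read a flag back, returning `fallback` if it is not set."""
    config = configparser.ConfigParser()
    config.read(path)
    return config.get(section, key, fallback=fallback)
```

For example, `set_config_flag("bentoml.cfg", "core", "usage_tracking", "false")` would leave a `usage_tracking = false` entry under `[core]` in the file.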

## Contributing

Have questions or feedback? Post a [new github issue](https://github.com/bentoml/BentoML/issues/new/choose)
or discuss in our Slack channel: [![join BentoML Slack](https://badgen.net/badge/Join/BentoML%20Slack/cyan?icon=slack)](https://join.slack.com/t/bentoml/shared_invite/enQtNjcyMTY3MjE4NTgzLTU3ZDc1MWM5MzQxMWQxMzJiNTc1MTJmMzYzMTYwMjQ0OGEwNDFmZDkzYWQxNzgxYWNhNjAxZjk4MzI4OGY1Yjg)

Want to help build BentoML? Check out our
[contributing guide](https://github.com/bentoml/BentoML/blob/master/CONTRIBUTING.md) and the
[development guide](https://github.com/bentoml/BentoML/blob/master/DEVELOPMENT.md).


Usage tracking can also be turned off from Python:

```python
import bentoml
bentoml.config().set('core', 'usage_tracking', 'False')
```

## License
