Updated README.md for #364 #365

Closed
56 changes: 39 additions & 17 deletions README.md
@@ -18,14 +18,21 @@ Conda instructions are provided in more detail, but you may also use `pip` and `

### Install with pip

To use `pip` to install TorchServe and the model archiver:
1. Install Java 11

``` bash
pip install torch torchtext torchvision sentencepiece psutil future
pip install torchserve torch-model-archiver
```
```bash
sudo apt-get install openjdk-11-jdk
```

1. Use `pip` to install TorchServe and the model archiver:

``` bash
pip install torch torchtext torchvision sentencepiece psutil future
pip install torchserve torch-model-archiver
```

### Install with Conda
**Note:** For Conda, Python 3.8 is required to run TorchServe

#### Ubuntu

@@ -40,13 +47,13 @@ pip install torchserve torch-model-archiver
For CPU

```bash
conda create --name torchserve torchserve torch-model-archiver psutil future pytorch sentencepiece torchtext torchvision -c pytorch -c powerai
conda create --name torchserve torchserve torch-model-archiver psutil future pytorch torchtext torchvision -c pytorch -c powerai
```

For GPU

```bash
conda create --name torchserve torchserve torch-model-archiver psutil future pytorch sentencepiece torchtext torchvision cudatoolkit=10.1 -c pytorch -c powerai
conda create --name torchserve torchserve torch-model-archiver psutil future pytorch torchtext torchvision cudatoolkit=10.1 -c pytorch -c powerai
```

1. Activate the environment
@@ -55,6 +62,11 @@ pip install torchserve torch-model-archiver
source activate torchserve
```

2. Optional: if you plan to serve torchtext models, install `sentencepiece`
```bash
pip install sentencepiece
```

#### macOS

1. Install Java 11
@@ -68,7 +80,7 @@ pip install torchserve torch-model-archiver
1. Create an environment and install torchserve and torch-model-archiver

```bash
conda create --name torchserve torchserve torch-model-archiver psutil future pytorch sentencepiece torchtext torchvision -c pytorch -c powerai
conda create --name torchserve torchserve torch-model-archiver psutil future pytorch torchtext torchvision -c pytorch -c powerai
```

1. Activate the environment
@@ -77,16 +89,27 @@ pip install torchserve torch-model-archiver
source activate torchserve
```

2. Optional: if you plan to serve torchtext models, install `sentencepiece`
```bash
pip install sentencepiece
```

Now you are ready to [package and serve models with TorchServe](#serve-a-model).

### Install TorchServe for development

If you plan to develop with TorchServe and change some of the source code, you must install it from source code.

1. Install Java 11

```bash
sudo apt-get install openjdk-11-jdk
```

1. Install dependencies

```bash
pip install psutil future -y
pip install psutil future
```

1. Clone the repo
@@ -121,14 +144,14 @@ For information about the model archiver, see [detailed documentation](model-arc

This section shows a simple example of serving a model with TorchServe. To complete this example, you must have already [installed TorchServe and the model archiver](#install-with-pip).

To run this example, clone the TorchServe repository and navigate to the root of the repository:
To run this example, clone the TorchServe repository:

```bash
git clone https://github.com/pytorch/serve.git
cd serve
```

Then run the following steps from the root of the repository.
Then run the following steps from the parent directory of the root of the repository.
For example, if you cloned the repository into `/home/my_path/serve`, run the steps from `/home/my_path`.

### Store a Model

@@ -138,8 +161,7 @@ You can also create model stores to store your archived models.
1. Create a directory to store your models.

```bash
mkdir ./model_store
cd ./model_store
mkdir model_store
```

1. Download a trained model.
@@ -151,7 +173,7 @@ You can also create model stores to store your archived models.
1. Archive the model by using the model archiver. The `extra-files` param uses a file from the `TorchServe` repo, so update the path if necessary.

```bash
torch-model-archiver --model-name densenet161 --version 1.0 --model-file ./serve/examples/image_classifier/densenet_161/model.py --serialized-file ./model_store/densenet161-8d451a50.pth --extra-files ./serve/examples/image_classifier/index_to_name.json --handler image_classifier
torch-model-archiver --model-name densenet161 --version 1.0 --model-file ./serve/examples/image_classifier/densenet_161/model.py --serialized-file densenet161-8d451a50.pth --export-path model_store --extra-files ./serve/examples/image_classifier/index_to_name.json --handler image_classifier
```

For more information about the model archiver, see [Torch Model archiver for TorchServe](model-archiver/README.md)
@@ -161,7 +183,7 @@ For more information about the model archiver, see [Torch Model archiver for Tor
After you archive and store the model, use the `torchserve` command to serve the model.

```bash
torchserve --start --model-store ./model_store --models ./model_store/densenet161.mar
torchserve --start --ncs --model-store model_store --models densenet161.mar
```

After you execute the `torchserve` command above, TorchServe runs on your host, listening for inference requests.
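Once the server is up, you can exercise it from Python as well as from `curl`. The sketch below is a minimal client, assuming TorchServe's default inference port (8080) and the `densenet161` model registered above; the `predict` helper is a hypothetical name introduced here for illustration.

```python
import urllib.request


def prediction_url(model_name, host="127.0.0.1", port=8080):
    # TorchServe exposes its inference API at /predictions/<model_name>,
    # on port 8080 unless configured otherwise.
    return f"http://{host}:{port}/predictions/{model_name}"


def predict(model_name, image_path):
    # POST the raw image bytes; the image_classifier handler responds with
    # JSON mapping the top predicted classes to their probabilities.
    with open(image_path, "rb") as f:
        data = f.read()
    req = urllib.request.Request(
        prediction_url(model_name), data=data, method="POST"
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")


print(prediction_url("densenet161"))
```

With the server from the previous step running, `predict("densenet161", "kitten.jpg")` would return the handler's JSON response as a string.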
@@ -282,6 +304,6 @@ To run your TorchServe Docker image and start TorchServe inside the container wi

We welcome all contributions!

To learn more about how to contribute, see the contributor guide [here](https://github.com/pytorch/serve/blob/master/CONTRIBUTING.md).
To learn more about how to contribute, see the contributor guide [here](https://github.com/pytorch/serve/blob/master/CONTRIBUTING.md).

To file a bug or request a feature, please file a GitHub issue. For filing pull requests, please use the template [here](https://github.com/pytorch/serve/blob/master/pull_request_template.md). Cheers!
5 changes: 2 additions & 3 deletions docker/Dockerfile.gpu
@@ -21,8 +21,7 @@ RUN update-alternatives --install /usr/bin/python python /usr/bin/python3 1
RUN update-alternatives --install /usr/local/bin/pip pip /usr/local/bin/pip3 1

RUN export USE_CUDA=1
RUN pip install psutil future torch torchvision torchtext
RUN pip install --no-cache-dir psutil
RUN pip install --no-cache-dir psutil future torch torchvision torchtext
ADD serve serve
RUN pip install ../serve/

@@ -45,4 +44,4 @@ ENV TEMP=/home/model-server/tmp
ENTRYPOINT ["/usr/local/bin/dockerd-entrypoint.sh"]
CMD ["serve"]

LABEL maintainer="wongale@amazon.com"
LABEL maintainer="wongale@amazon.com"
1 change: 1 addition & 0 deletions examples/image_classifier/README.md
@@ -45,6 +45,7 @@
from torchvision import models
import torch
model = models.densenet161(pretrained=True)
model.eval()
example_input = torch.rand(1, 3, 224, 224)
traced_script_module = torch.jit.trace(model, example_input)
traced_script_module.save("dense161.pt")