Pip installable from pypi

andylolu2 committed Mar 25, 2022
1 parent 1765ecf commit 7654b42
Showing 2 changed files with 18 additions and 11 deletions.
26 changes: 16 additions & 10 deletions README.md
@@ -24,6 +24,21 @@ that speeds up inference by 2-4x by using [ONNX runtime](https://github.com/micr

The evaluation script can be found in [test_speed.py](tests/test_speed.py).

> Evaluation was done on my laptop with an AMD 4900HS and an Nvidia 2060 Max-Q.

## :gear: Installation

```terminal
pip install speedtoxify
```

For inference on GPUs, additionally install `onnxruntime-gpu`.
This requires the machine to have CUDA installed.

```terminal
pip install onnxruntime-gpu
```

## :star2: Quick start

```python
# (quick-start code collapsed in the diff view)
```
@@ -52,15 +67,6 @@ exported and stored at `~/.cache/detoxify_onnx`.
This directory can be customized in the `cache_dir` argument to
`Speedtoxify()`.

## Documentation
## :page_with_curl: Documentation

Please refer to [docs](docs).

## GPU inference

Please install `onnxruntime-gpu` for inference on gpus. Requires the
machine have CUDA installed.

```terminal
pip install onnxruntime-gpu
```
3 changes: 2 additions & 1 deletion setup.py
@@ -8,9 +8,10 @@

setuptools.setup(
name="speedtoxify",
version="0.0.1",
version="0.0.2",
author="Andy Lo",
author_email="andylolu24@gmail.com",
url="https://github.com/andylolu2/speedtoxify",
description="Wrapper around detoxify package for faster inference using ONNX runtime.",
long_description=long_description,
long_description_content_type="text/markdown",
