How To (Updates and Fixes) (#1456)
* Update easy-setup-embeddings.md

* Update easy-setup-docker-cpu.md

* Update easy-setup-docker-gpu.md

* Update and rename easy-setup-docker-cpu.md to easy-setup-docker.md

* Update easy-setup-docker.md

* Update easy-setup-docker.md

* Update _index.md

* Update easy-setup-docker.md

* Update easy-setup-docker.md

* Delete docs/content/howtos/easy-setup-docker-gpu.md

* Update _index.md

* Update easy-setup-sd.md

* Update easy-setup-sd.md

---------

Signed-off-by: lunamidori5 <118759930+lunamidori5@users.noreply.github.com>
lunamidori5 committed Dec 18, 2023
1 parent 1fc3a37 commit 17dde75
Showing 5 changed files with 45 additions and 181 deletions.
3 changes: 1 addition & 2 deletions docs/content/howtos/_index.md
@@ -8,8 +8,7 @@ weight = 9

This section includes LocalAI end-to-end examples, tutorials, and how-tos curated by the community and maintained by [lunamidori5](https://github.com/lunamidori5).

- - [Setup LocalAI with Docker on CPU]({{%relref "howtos/easy-setup-docker-cpu" %}})
- - [Setup LocalAI with Docker With CUDA]({{%relref "howtos/easy-setup-docker-gpu" %}})
+ - [Setup LocalAI with Docker]({{%relref "howtos/easy-setup-docker" %}})
- [Setting up a Model]({{%relref "howtos/easy-model" %}})
- [Making Text / LLM requests to LocalAI]({{%relref "howtos/easy-request" %}})
- [Making Photo / SD requests to LocalAI]({{%relref "howtos/easy-setup-sd" %}})
137 changes: 0 additions & 137 deletions docs/content/howtos/easy-setup-docker-cpu.md

This file was deleted.

@@ -1,7 +1,7 @@

+++
disableToc = false
- title = "Easy Setup - GPU Docker"
+ title = "Easy Setup - Docker"
weight = 2
+++

@@ -12,26 +12,13 @@ weight = 2

We are going to run `LocalAI` with `docker compose` for this setup.

Let's set up our folders for ``LocalAI``
{{< tabs >}}
{{% tab name="Windows (Batch)" %}}
Let's set up our folders for ``LocalAI`` (run these commands to create the folders for you, if you wish)
```batch
mkdir "LocalAI"
cd LocalAI
mkdir "models"
mkdir "images"
```
{{% /tab %}}

{{% tab name="Linux (Bash / WSL)" %}}
```bash
mkdir -p "LocalAI"
cd LocalAI
mkdir -p "models"
mkdir -p "images"
```
{{% /tab %}}
{{< /tabs >}}
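If you prefer a single command, the same folder layout can be created in one step (works in any POSIX shell):

```bash
# Create the LocalAI folder tree in one command
mkdir -p LocalAI/models LocalAI/images
```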

At this point we want to set up our `.env` file; here is a copy for you to use if you wish. Make sure this file is in the ``LocalAI`` folder.

@@ -51,7 +38,7 @@ GALLERIES=[{"name":"model-gallery", "url":"github:go-skynet/model-gallery/index.
MODELS_PATH=/models

## Enable debug mode
- # DEBUG=true
+ DEBUG=true

## Disables COMPEL (lets Stable Diffusion work; uncomment if you plan on using it)
# COMPEL=0
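Pulling those fragments together, a minimal `.env` might look like this (a sketch assembled from the variables shown in this guide; values are illustrative, keep your own):

```
## Illustrative .env sketch
GALLERIES=[{"name":"model-gallery", "url":"github:go-skynet/model-gallery/index.yaml"}]
MODELS_PATH=/models

## Enable debug mode
DEBUG=true

## Disables COMPEL (uncomment if you plan on using Stable Diffusion)
# COMPEL=0

## Uncomment for CUDA builds
# BUILD_TYPE=cublas
```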
@@ -84,28 +71,54 @@ BUILD_TYPE=cublas

Now that we have the `.env` set, let's set up our `docker-compose` file.
It will use a container image from [quay.io](https://quay.io/repository/go-skynet/local-ai?tab=tags).


{{< tabs >}}
{{% tab name="CPU Only" %}}
Also note this `docker-compose` file is for `CPU` only.

```docker
version: '3.6'

services:
  api:
    image: quay.io/go-skynet/local-ai:{{< version >}}
    tty: true # enable colorized logs
    restart: always # should this be on-failure ?
    ports:
      - 8080:8080
    env_file:
      - .env
    volumes:
      - ./models:/models
      - ./images/:/tmp/generated/images/
    command: ["/usr/bin/local-ai" ]
```
{{% /tab %}}

{{% tab name="GPU and CPU" %}}
Also note this `docker-compose` file is for `CUDA` only.

Please change the image to what you need.
{{< tabs >}}
{{% tab name="GPU Images CUDA 11" %}}
- `master-cublas-cuda11`
- `master-cublas-cuda11-core`
- `v2.0.0-cublas-cuda11`
- `v2.0.0-cublas-cuda11-core`
- `v2.0.0-cublas-cuda11-ffmpeg`
- `v2.0.0-cublas-cuda11-ffmpeg-core`
- `{{< version >}}-cublas-cuda11`
- `{{< version >}}-cublas-cuda11-core`
- `{{< version >}}-cublas-cuda11-ffmpeg`
- `{{< version >}}-cublas-cuda11-ffmpeg-core`

Core Images - smaller images without pre-downloaded Python dependencies
{{% /tab %}}

{{% tab name="GPU Images CUDA 12" %}}
- `master-cublas-cuda12`
- `master-cublas-cuda12-core`
- `v2.0.0-cublas-cuda12`
- `v2.0.0-cublas-cuda12-core`
- `v2.0.0-cublas-cuda12-ffmpeg`
- `v2.0.0-cublas-cuda12-ffmpeg-core`
- `{{< version >}}-cublas-cuda12`
- `{{< version >}}-cublas-cuda12-core`
- `{{< version >}}-cublas-cuda12-ffmpeg`
- `{{< version >}}-cublas-cuda12-ffmpeg-core`

Core Images - smaller images without pre-downloaded Python dependencies
{{% /tab %}}
@@ -135,6 +148,8 @@ services:
      - ./images/:/tmp/generated/images/
    command: ["/usr/bin/local-ai" ]
```
{{% /tab %}}
{{< /tabs >}}


Make sure to save that in the root of the `LocalAI` folder. Then let's spin up the Docker container; run this in a `CMD` or `BASH` prompt:
13 changes: 1 addition & 12 deletions docs/content/howtos/easy-setup-embeddings.md
@@ -12,25 +12,14 @@ curl http://localhost:8080/models/apply -H "Content-Type: application/json" -d '
}'
```

Now we need to make a ``bert.yaml`` file in the models folder:
```yaml
backend: bert-embeddings
embeddings: true
name: text-embedding-ada-002
parameters:
  model: bert
```

**Restart LocalAI after you change a yaml file**

When you would like to request the model from the CLI, you can do:

```bash
curl http://localhost:8080/v1/embeddings \
  -H "Content-Type: application/json" \
  -d '{
    "input": "The food was delicious and the waiter...",
-   "model": "text-embedding-ada-002"
+   "model": "bert-embeddings"
  }'
```
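The endpoint replies with the standard ``OpenAI`` embeddings schema. As a quick sketch (the response body below is made up for illustration), the vector can be pulled out with `python3`:

```bash
# Hypothetical response, shaped like the OpenAI embeddings schema
response='{"object":"list","model":"bert-embeddings","data":[{"object":"embedding","index":0,"embedding":[0.1,0.2,0.3]}]}'

# Print the embedding's dimensionality
echo "$response" | python3 -c 'import json,sys; print(len(json.load(sys.stdin)["data"][0]["embedding"]))'
# → 3
```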

10 changes: 4 additions & 6 deletions docs/content/howtos/easy-setup-sd.md
@@ -5,7 +5,7 @@ weight = 2
+++

Setting up a Stable Diffusion model is super easy.
- In your models folder make a file called ``stablediffusion.yaml``, then edit that file with the following. (You can change ``Linaqruf/animagine-xl`` with what ever ``sd-lx`` model you would like.
+ In your ``models`` folder make a file called ``stablediffusion.yaml``, then edit that file with the following. (You can swap ``Linaqruf/animagine-xl`` for whatever ``sd-xl`` model you would like.)
```yaml
name: animagine-xl
parameters:
@@ -21,8 +21,7 @@ diffusers:

If you are using Docker, you will need to run this in the `LocalAI` folder with the ``docker-compose.yaml`` file in it:
```bash
- docker-compose down #windows
- docker compose down #linux/mac
+ docker compose down
```

Then in your ``.env`` file uncomment this line.
@@ -32,14 +31,13 @@ COMPEL=0

After that we can recreate the LocalAI Docker container by running this in the `LocalAI` folder with the ``docker-compose.yaml`` file in it:
```bash
- docker-compose up #windows
- docker compose up #linux/mac
+ docker compose up -d
```

Then, to download and set up the model, just send in a normal ``OpenAI`` request! LocalAI will do the rest!
```bash
curl http://localhost:8080/v1/images/generations -H "Content-Type: application/json" -d '{
  "prompt": "Two Boxes, 1blue, 1red",
- "size": "256x256"
+ "size": "1024x1024"
}'
```
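Based on the volume mapping in the `docker-compose` file above (`./images/:/tmp/generated/images/`), the generated pictures should land in your local `images` folder:

```bash
# The container writes to /tmp/generated/images/, which is mapped to ./images/
mkdir -p LocalAI/images   # same layout as earlier in this guide
ls LocalAI/images         # generated image files appear here after a request
```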
