Merged
4 changes: 2 additions & 2 deletions content/manuals/ai/compose/model-runner.md
@@ -40,7 +40,7 @@ services:
 
 models:
   smollm2:
-    image: ai/smollm2
+    model: ai/smollm2
 ```

### How it works
@@ -70,7 +70,7 @@ services:
 
 models:
   smollm2:
-    image: ai/smollm2
+    model: ai/smollm2
 ```
 
 With this configuration, your `my-chat-app` service will receive:
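For context, the change in `model-runner.md` amounts to renaming the model-reference key in the top-level `models` block. A complete file along these lines might look as follows (a sketch: the `my-chat-app` service definition is assumed, since the diff shows only the `models` block; the service-level binding uses the Compose short syntax):

```yaml
services:
  my-chat-app:
    image: my-chat-app:latest   # assumed service image, not shown in the diff
    models:
      - smollm2                 # short syntax: bind the model by its key

models:
  smollm2:
    model: ai/smollm2           # `model:` references the model artifact (formerly `image:`)
```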
18 changes: 9 additions & 9 deletions content/manuals/ai/compose/models-and-compose.md
@@ -42,7 +42,7 @@ services:
 
 models:
   llm:
-    image: ai/smollm2
+    model: ai/smollm2
 ```
 
 This example defines:
@@ -56,7 +56,7 @@ Models support various configuration options:
 ```yaml
 models:
   llm:
-    image: ai/smollm2
+    model: ai/smollm2
     context_size: 1024
     runtime_flags:
       - "--a-flag"
@@ -87,9 +87,9 @@ services:
 
 models:
   llm:
-    image: ai/smollm2
+    model: ai/smollm2
   embedding-model:
-    image: ai/all-minilm
+    model: ai/all-minilm
 ```
 
 With short syntax, the platform automatically generates environment variables based on the model name:
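The generated variable names are elided in this view of the diff. As a sketch of the documented convention (an assumption to verify against the rendered page: the model key is upper-cased, hyphens become underscores, and `_URL`/`_MODEL` suffixes are appended), the two models above would yield:

```shell
# Sketch of the short-syntax naming convention (assumed, not shown in this diff):
# each model key is upper-cased, hyphens are replaced by underscores,
# then suffixed with _URL (endpoint) and _MODEL (model identifier).
for model in llm embedding-model; do
  prefix=$(printf '%s' "$model" | tr 'a-z-' 'A-Z_')
  echo "${prefix}_URL ${prefix}_MODEL"
done
```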
@@ -116,9 +116,9 @@ services:
 
 models:
   llm:
-    image: ai/smollm2
+    model: ai/smollm2
   embedding-model:
-    image: ai/all-minilm
+    model: ai/all-minilm
 ```
 
 With this configuration, your service receives:
@@ -142,7 +142,7 @@ services:
 
 models:
   llm:
-    image: ai/smollm2
+    model: ai/smollm2
 ```
 
 Docker Model Runner will:
@@ -163,9 +163,9 @@ services:
 
 models:
   llm:
-    image: ai/smollm2
+    model: ai/smollm2
     # Cloud-specific configurations
     labels:
       x-cloud-options:
         - "cloud.instance-type=gpu-small"
         - "cloud.region=us-west-2"
 ```
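Putting the pieces together, a service that consumes one of these renamed model definitions can bind it with either the short syntax shown earlier or the long syntax. The following is a sketch assuming the Compose long-syntax keys `endpoint_var` and `model_var` (not shown in this diff) for controlling the injected environment variable names:

```yaml
services:
  app:
    image: my-app:latest              # assumed service image, not part of this diff
    models:
      llm:
        endpoint_var: AI_MODEL_URL    # env var that receives the model endpoint
        model_var: AI_MODEL_NAME      # env var that receives the model identifier

models:
  llm:
    model: ai/smollm2
```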