The following sample is an `admin-settings.json` file with common enterprise settings:

```json
{
"desktopTerminalEnabled": {
"locked": false,
"value": false
},
"enableInference": {
"locked": false,
"value": true
},
"enableInferenceTCP": {
"locked": false,
"value": true
},
"enableInferenceTCPPort": {
"locked": true,
"value": 12434
},
"enableInferenceCORS": {
"locked": true,
"value": ""
},
"enableInferenceGPUVariant": {
"locked": true,
"value": true
}
}
```

For more information, see [Networking](/manuals/desktop/features/networking.md).

### AI settings

| Parameter                   | OS           | Description | Version |
|:----------------------------|:-------------|:------------|---------|
| `enableInference`           |              | Setting `enableInference` to `true` enables [Docker Model Runner](/manuals/ai/model-runner/_index.md). | |
| `enableInferenceTCP`        |              | Enables host-side TCP support. This setting requires the Docker Model Runner setting to be enabled first. | |
| `enableInferenceTCPPort`    |              | Specifies the exposed TCP port. This setting requires the Docker Model Runner and host-side TCP support settings to be enabled first. | |
| `enableInferenceCORS`       |              | Specifies the allowed CORS origins: an empty string to deny all, `*` to accept all, or a comma-separated list of origins. This setting requires the Docker Model Runner and host-side TCP support settings to be enabled first. | |
| `enableInferenceGPUVariant` | Windows only | Setting `enableInferenceGPUVariant` to `true` enables GPU-backed inference. The required components don't ship with Docker Desktop by default, so they are downloaded to `~/.docker/bin/inference`. | |
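
The `locked`/`value` shape of these keys and the dependency chain between them can be made concrete with a short validation sketch. This is illustrative only (the helper `check_ai_settings` is hypothetical, not part of Docker Desktop); the key names follow the sample file above.

```python
import json

# Hypothetical helper: validate the AI-related keys of an admin-settings.json
# fragment and the dependency chain between them (illustrative only).
AI_KEYS = (
    "enableInference",
    "enableInferenceTCP",
    "enableInferenceTCPPort",
    "enableInferenceCORS",
    "enableInferenceGPUVariant",
)

def check_ai_settings(settings):
    problems = []
    for key in AI_KEYS:
        entry = settings.get(key)
        if entry is None:
            continue  # keys are optional; absent means Docker Desktop defaults apply
        if not isinstance(entry.get("locked"), bool):
            problems.append(f"{key}: 'locked' must be a boolean")
    # TCP support depends on Model Runner; the port depends on both.
    inference_on = settings.get("enableInference", {}).get("value") is True
    tcp_on = settings.get("enableInferenceTCP", {}).get("value") is True
    if tcp_on and not inference_on:
        problems.append("enableInferenceTCP requires enableInference to be true")
    if "enableInferenceTCPPort" in settings and not (inference_on and tcp_on):
        problems.append("enableInferenceTCPPort requires enableInference and enableInferenceTCP")
    return problems

sample = json.loads("""
{
  "enableInference": {"locked": false, "value": true},
  "enableInferenceTCP": {"locked": false, "value": true},
  "enableInferenceTCPPort": {"locked": true, "value": 12434}
}
""")
print(check_ai_settings(sample))  # → []
```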

### Beta features

#### Enable Docker Model Runner

| Default value | Accepted values | Format |
|---------------|-----------------|----------|
| `true` | `true`, `false` | Boolean |

- **Description:** Docker Model Runner functionality for running AI models in containers.
- **OS:** {{< badge color=blue text="All" >}}
- **Use case:** Run and manage AI/ML models using Docker infrastructure.
- **Configure this setting with:**
  - **AI** settings in [Docker Desktop GUI](/manuals/desktop/settings-and-maintenance/settings.md)
  - Settings Management: `enableInference` setting in the [`admin-settings.json` file](/manuals/enterprise/security/hardened-desktop/settings-management/configure-json-file.md)
  - Settings Management: **Enable Docker Model Runner** setting in the [Admin Console](/manuals/enterprise/security/hardened-desktop/settings-management/configure-admin-console.md)

#### Enable host-side TCP support

- **OS:** {{< badge color=blue text="All" >}}
- **Use case:** Allow external applications to connect to Model Runner via TCP.
- **Configure this setting with:**
  - **AI** settings in [Docker Desktop GUI](/manuals/desktop/settings-and-maintenance/settings.md)
  - Settings Management: `enableInferenceTCP` setting in the [`admin-settings.json` file](/manuals/enterprise/security/hardened-desktop/settings-management/configure-json-file.md)
  - Settings Management: **Host-side TCP support** setting in the [Admin Console](/manuals/enterprise/security/hardened-desktop/settings-management/configure-admin-console.md)

> [!NOTE]
>
> This setting requires the Docker Model Runner setting to be enabled first.

##### Host-side TCP port
- **OS:** {{< badge color=blue text="All" >}}
- **Use case:** Customize the port for Model Runner TCP connectivity.
- **Configure this setting with:**
  - **AI** settings in [Docker Desktop GUI](/manuals/desktop/settings-and-maintenance/settings.md)
  - Settings Management: `enableInferenceTCPPort` setting in the [`admin-settings.json` file](/manuals/enterprise/security/hardened-desktop/settings-management/configure-json-file.md)
  - Settings Management: **Host-side TCP port** setting in the [Admin Console](/manuals/enterprise/security/hardened-desktop/settings-management/configure-admin-console.md)

> [!NOTE]
>
> This setting requires the Docker Model Runner and host-side TCP support settings to be enabled first.
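
With TCP support enabled, host-side clients reach Model Runner at `localhost` on the configured port. The sketch below (the helper `model_runner_base_url` is hypothetical, shown only to make the settings concrete) derives the base URL from an `admin-settings.json` fragment, falling back to the default port `12434`:

```python
import json

settings = json.loads("""
{
  "enableInferenceTCP": {"locked": false, "value": true},
  "enableInferenceTCPPort": {"locked": true, "value": 12434}
}
""")

def model_runner_base_url(settings):
    # Returns the host-side base URL when TCP support is enabled, else None.
    if settings.get("enableInferenceTCP", {}).get("value") is not True:
        return None
    port = settings.get("enableInferenceTCPPort", {}).get("value", 12434)
    return f"http://localhost:{port}"

print(model_runner_base_url(settings))  # → http://localhost:12434
```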

##### CORS Allowed Origins

- **OS:** {{< badge color=blue text="All" >}}
- **Use case:** Allow web applications to connect to Model Runner services.
- **Configure this setting with:**
  - **AI** settings in [Docker Desktop GUI](/manuals/desktop/settings-and-maintenance/settings.md)
  - Settings Management: `enableInferenceCORS` setting in the [`admin-settings.json` file](/manuals/enterprise/security/hardened-desktop/settings-management/configure-json-file.md)
  - Settings Management: **CORS Allowed Origins** setting in the [Admin Console](/manuals/enterprise/security/hardened-desktop/settings-management/configure-admin-console.md)

> [!NOTE]
>
> This setting requires the Docker Model Runner and host-side TCP support settings to be enabled first.
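
The three accepted forms of the CORS value (empty string, `*`, comma-separated list) can be captured in a few lines. A hypothetical helper, not part of any Docker tooling, just to make the semantics concrete:

```python
def origin_allowed(cors_value, origin):
    # "" denies all, "*" accepts all, otherwise a comma-separated allow-list.
    if cors_value == "":
        return False
    if cors_value == "*":
        return True
    allowed = {o.strip() for o in cors_value.split(",")}
    return origin in allowed

print(origin_allowed("", "http://localhost:3000"))                      # → False
print(origin_allowed("*", "http://localhost:3000"))                     # → True
print(origin_allowed("http://a.test, http://b.test", "http://b.test"))  # → True
```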

#### Enable GPU-backed inference

| Default value | Accepted values | Format |
|---------------|-----------------|----------|
| `false` | `true`, `false` | Boolean |

- **Description:** GPU-backed inference for Docker Model Runner.
- **OS:** {{< badge color=blue text="Windows only" >}}
- **Use case:** Enable GPU-backed inference. The required components don't ship with Docker Desktop by default, so they are downloaded to `~/.docker/bin/inference`.
- **Configure this setting with:**
  - **AI** settings in [Docker Desktop GUI](/manuals/desktop/settings-and-maintenance/settings.md)
  - Settings Management: `enableInferenceGPUVariant` setting in the [`admin-settings.json` file](/manuals/enterprise/security/hardened-desktop/settings-management/configure-json-file.md)
  - Settings Management: **Enable GPU-backed inference** setting in the [Admin Console](/manuals/enterprise/security/hardened-desktop/settings-management/configure-admin-console.md)

## Kubernetes settings
