diff --git a/docs/AI-for-security/connect-to-byo.asciidoc b/docs/AI-for-security/connect-to-byo.asciidoc
index 6dc6a88648..f385c084aa 100644
--- a/docs/AI-for-security/connect-to-byo.asciidoc
+++ b/docs/AI-for-security/connect-to-byo.asciidoc
@@ -180,10 +180,11 @@ Finally, configure the connector:
 1. Log in to your Elastic deployment.
 2. Navigate to **Stack Management → Connectors → Create Connector → OpenAI**. The OpenAI connector enables this use case because LM Studio uses the OpenAI SDK.
 3. Name your connector to help keep track of the model version you are using.
-4. Under **URL**, enter the domain name specified in your Nginx configuration file, followed by `/v1/chat/completions`.
-5. Under **Default model**, enter `local-model`.
-6. Under **API key**, enter the secret token specified in your Nginx configuration file.
-7. Click **Save**.
+4. Under **Select an OpenAI provider**, select **Other (OpenAI Compatible Service)**.
+5. Under **URL**, enter the domain name specified in your Nginx configuration file, followed by `/v1/chat/completions`.
+6. Under **Default model**, enter `local-model`.
+7. Under **API key**, enter the secret token specified in your Nginx configuration file.
+8. Click **Save**.
 
 image::images/lms-edit-connector.png[The Edit connector page in the {security-app}, with appropriate values populated]
 
diff --git a/docs/AI-for-security/images/lms-edit-connector.png b/docs/AI-for-security/images/lms-edit-connector.png
index 0359253eb1..6f23209e5d 100644
Binary files a/docs/AI-for-security/images/lms-edit-connector.png and b/docs/AI-for-security/images/lms-edit-connector.png differ