diff --git a/1_developer/_2_rest/index.md b/1_developer/_2_rest/index.md
index 5103bb6..cddd4c5 100644
--- a/1_developer/_2_rest/index.md
+++ b/1_developer/_2_rest/index.md
@@ -57,14 +57,14 @@ The following endpoints are available in LM Studio's v1 REST API.

 ## Inference endpoint comparison

-The table below compares the features of LM Studio's `api/v1/chat` endpoint with the OpenAI-compatible `v1/responses` and `v1/chat/completions` endpoints.
+The table below compares the features of LM Studio's `/api/v1/chat` endpoint with the OpenAI-compatible `/v1/responses` and `/v1/chat/completions` endpoints.

       <th>Feature</th>
-      <th>api/v1/chat</th>
-      <th>v1/responses</th>
-      <th>v1/chat/completions</th>
+      <th>/api/v1/chat</th>
+      <th>/v1/responses</th>
+      <th>/v1/chat/completions</th>
diff --git a/1_developer/_2_rest/quickstart.md b/1_developer/_2_rest/quickstart.md
index dc0c2e3..9280ecc 100644
--- a/1_developer/_2_rest/quickstart.md
+++ b/1_developer/_2_rest/quickstart.md
@@ -91,7 +91,7 @@ See the full [chat](/docs/developer/rest/chat) docs for more details.

 ## Use MCP servers via API

-Enable the model interact with ephemeral Model Context Protocol (MCP) servers in `api/v1/chat` by specifying servers in the `integrations` field.
+Enable the model to interact with ephemeral Model Context Protocol (MCP) servers in `/api/v1/chat` by specifying servers in the `integrations` field.

 ```lms_code_snippet
 variants:
diff --git a/1_developer/_2_rest/streaming-events.md b/1_developer/_2_rest/streaming-events.md
index 3233020..0b61b5d 100644
--- a/1_developer/_2_rest/streaming-events.md
+++ b/1_developer/_2_rest/streaming-events.md
@@ -6,7 +6,7 @@ index: 4

 Streaming events let you render chat responses incrementally over Server‑Sent Events (SSE). When you call `POST /api/v1/chat` with `stream: true`, the server emits a series of named events that you can consume. These events arrive in order and may include multiple deltas (for reasoning and message content), tool call boundaries and payloads, and any errors encountered. The stream always begins with `chat.start` and concludes with `chat.end`, which contains the aggregated result equivalent to a non‑streaming response.

-List of event types that can be sent in an `api/v1/chat` response stream:
+List of event types that can be sent in a `/api/v1/chat` response stream:
 - `chat.start`
 - `model_load.start`
 - `model_load.progress`
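For reference, the streaming behavior the streaming-events.md hunk describes (named SSE events from `POST /api/v1/chat` with `stream: true`) can be consumed with a minimal parser sketch. This is an illustration, not LM Studio code: it assumes standard `event:`/`data:` SSE framing with blank-line frame separators, and the sample payloads below are hypothetical.

```python
def parse_sse_events(raw: str) -> list[tuple[str, str]]:
    """Parse a Server-Sent Events stream into (event, data) pairs.

    Each SSE frame is a block of "field: value" lines terminated by a
    blank line; only the "event" and "data" fields are handled here.
    """
    events: list[tuple[str, str]] = []
    event_type, data_lines = None, []
    for line in raw.splitlines():
        if line == "":  # a blank line terminates the current frame
            if event_type is not None:
                events.append((event_type, "\n".join(data_lines)))
            event_type, data_lines = None, []
        elif line.startswith("event:"):
            event_type = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
    return events

# Hypothetical fragment of a streamed response: per the docs, the stream
# always begins with `chat.start` and concludes with `chat.end`.
sample = (
    "event: chat.start\n"
    "data: {}\n"
    "\n"
    "event: chat.end\n"
    "data: {}\n"
    "\n"
)
print(parse_sse_events(sample))
# → [('chat.start', '{}'), ('chat.end', '{}')]
```

In a real client you would feed the response body into this incrementally rather than parsing a complete string, and dispatch on the event name (`model_load.progress` to drive a progress bar, delta events to append text, `chat.end` for the aggregated result).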