diff --git a/1_developer/_2_rest/endpoints.md b/1_developer/_2_rest/endpoints.md
index 021a9a5..d7377ed 100644
--- a/1_developer/_2_rest/endpoints.md
+++ b/1_developer/_2_rest/endpoints.md
@@ -4,10 +4,10 @@ description: "The REST API includes enhanced stats such as Token / Second and Ti
 ---
 
 ```lms_warning
-LM Studio now has a v1 REST API! Please migrate to the new API.
+LM Studio now has a [v1 REST API](/docs/developer/rest)! Please migrate to the new API.
 ```
 
-##### Requires [LM Studio 0.3.6](/download) or newer. Still WIP, endpoints may change.
+##### Requires [LM Studio 0.3.6](/download) or newer.
 
 LM Studio now has its own REST API, in addition to OpenAI compatibility mode ([learn more](/docs/developer/openai-compat)).
 
@@ -21,8 +21,6 @@ The REST API includes enhanced stats such as Token / Second and Time To First To
 - [`POST /api/v0/completions`](#post-apiv0completions) - Text Completions (prompt -> completion)
 - [`POST /api/v0/embeddings`](#post-apiv0embeddings) - Text Embeddings (text -> embedding)
 
-###### 🚧 We are in the process of developing this interface. Let us know what's important to you on [Github](https://github.com/lmstudio-ai/lmstudio-js/issues) or by [email](mailto:bugs@lmstudio.ai).
-
 ---
 
 ### Start the REST API server
diff --git a/1_developer/_2_rest/index.md b/1_developer/_2_rest/index.md
index a4f00c1..5103bb6 100644
--- a/1_developer/_2_rest/index.md
+++ b/1_developer/_2_rest/index.md
@@ -1,16 +1,24 @@
 ---
 title: LM Studio API
 sidebar_title: Overview
-description: Get started with LM Studio's REST API for local model management and inference.
-fullPage: true
+description: LM Studio's REST API for local inference and model management
+fullPage: false
 index: 1
 ---
 
-LM Studio offers a powerful REST API with first-class support for local model management and inference. In addition to our native API, we provide full OpenAI compatibility mode ([learn more](/docs/developer/openai-compat)).
+LM Studio offers a powerful REST API with first-class support for local inference and model management. In addition to our native API, we provide full OpenAI compatibility mode ([learn more](/docs/developer/openai-compat)).
 
-Our REST API handles local LLM workflows with model downloading, loading, configuration, and inference. Get performance stats like tokens per second, model status, context length, quantization info, and more. Configure loading parameters to customize how models initialize.
+## What's new
+Previously, there was a [v0 REST API](/docs/developer/rest/endpoints). That API has since been deprecated in favor of the v1 REST API.
 
-### Supported endpoints
+The v1 REST API includes enhanced features such as:
+- [MCP via API](/docs/developer/core/mcp)
+- [Stateful chats](/docs/developer/rest/stateful-chats)
+- [Authentication](/docs/developer/core/authentication) configuration with API tokens
+- Model [download](/docs/developer/rest/download) and [load](/docs/developer/rest/load) endpoints
+
+## Supported endpoints
+The following endpoints are available in LM Studio's v1 REST API.
 
@@ -48,6 +56,63 @@ Our REST API handles local LLM workflows with model downloading, loading, config
+## Inference endpoint comparison
+The table below compares the features of LM Studio's `api/v1/chat` endpoint with the OpenAI-compatible `v1/responses` and `v1/chat/completions` endpoints.
+
+<table>
+  <thead>
+    <tr>
+      <th>Feature</th>
+      <th><code>api/v1/chat</code></th>
+      <th><code>v1/responses</code></th>
+      <th><code>v1/chat/completions</code></th>
+    </tr>
+  </thead>
+  <tbody>
+    <tr>
+      <td>Stateful chat</td>
+      <td></td>
+      <td></td>
+      <td></td>
+    </tr>
+    <tr>
+      <td>Remote MCPs</td>
+      <td></td>
+      <td></td>
+      <td></td>
+    </tr>
+    <tr>
+      <td>MCPs you have in LM Studio</td>
+      <td></td>
+      <td></td>
+      <td></td>
+    </tr>
+    <tr>
+      <td>Custom tools</td>
+      <td></td>
+      <td></td>
+      <td></td>
+    </tr>
+    <tr>
+      <td>Model load streaming events</td>
+      <td></td>
+      <td></td>
+      <td></td>
+    </tr>
+    <tr>
+      <td>Prompt processing streaming events</td>
+      <td></td>
+      <td></td>
+      <td></td>
+    </tr>
+    <tr>
+      <td>Specify context length in the request</td>
+      <td></td>
+      <td></td>
+      <td></td>
+    </tr>
+  </tbody>
+</table>
+
 ---
 
 Please report bugs by opening an issue on [Github](https://github.com/lmstudio-ai/lmstudio-bug-tracker/issues).
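For reference, a minimal sketch of a request against the OpenAI-compatible `v1/chat/completions` endpoint referenced in the comparison table above. This is not the new `api/v1/chat` endpoint (whose request shape is documented elsewhere, not in this diff); the base URL `http://localhost:1234/v1` is assumed to be LM Studio's default local server address, and `your-model` is a placeholder identifier.

```ts
// Sketch: chat completion request against LM Studio's OpenAI-compatible endpoint.
// The base URL and model name are assumptions/placeholders, not taken from this diff.
const BASE_URL = "http://localhost:1234/v1"; // assumed default local server address

async function main(): Promise<void> {
  const response = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "your-model", // placeholder: use a model identifier you have loaded
      messages: [{ role: "user", content: "Say hello in one sentence." }],
      temperature: 0.7,
    }),
  });

  if (!response.ok) {
    throw new Error(`Request failed: ${response.status} ${response.statusText}`);
  }

  const data = await response.json();
  // OpenAI-compatible responses carry the reply in choices[0].message.content.
  console.log(data.choices[0].message.content);
}

main().catch(console.error);
```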