

ymcui edited this page Jan 28, 2024 · 1 revision

Chat with LM Studio

LM Studio is a multi-platform (macOS, Windows, Linux) chat application for large language models that supports GGUF models (llama.cpp compatible). The following briefly introduces basic usage; please check the official website for more information.

Step 1: Download the application


Step 2: Organize the model folder

Create a new folder named models and organize the model files inside it (you can also use the ln command to symlink existing files instead of copying them). For example, the hfl/chinese-mixtral-instruct model is laid out as shown below, with the GGUF model file at the innermost level.

- hfl
  - chinese-mixtral-instruct
    - ggml-model-q4_0.gguf
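The layout above can be created from the command line; a minimal sketch follows. The models location and the source path of the downloaded GGUF file are assumptions here, so adjust them to your setup.

```shell
# Assumed location of the LM Studio models folder; change as needed.
MODELS_DIR="$HOME/models"

# Create the vendor/model subfolders expected by LM Studio.
mkdir -p "$MODELS_DIR/hfl/chinese-mixtral-instruct"

# Symlink an already-downloaded GGUF file instead of copying it
# (saves disk space); the source path is an assumption -- point it
# at wherever your ggml-model-q4_0.gguf actually lives.
ln -sf "$HOME/Downloads/ggml-model-q4_0.gguf" \
  "$MODELS_DIR/hfl/chinese-mixtral-instruct/ggml-model-q4_0.gguf"
```

After running this, the models folder matches the tree shown above and can be selected in LM Studio in the next step.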

Step 3: Configure the model folder

Click the last button in the sidebar. Under Local models folder, click Change and navigate to the models folder created in the previous step. After this, all supported models will be listed. Select Mistral Instruct as the chat template.


Note: the Mixtral model is currently recognized as LLaMA; this does not affect model inference.

Step 4: Begin chatting

Click the third button to start chatting. You can choose the model at the top of the interface. The right sidebar holds the system prompt, GPU acceleration toggle, context size, and other settings; the left sidebar lists past conversations.
