
How to set up with LM Studio

1. Go to https://lmstudio.ai/ and download the application for your operating system.


2. Search for a model to download.


3. After your model has finished downloading, go to 'Local Server.'


4. Go to 'Local Inference Server', adjust the 'Server Port' if needed, and start the server.


Make sure CORS is turned on; it is required for streaming responses.
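
Before wiring the server into Obsidian, you can check that it is reachable. The sketch below (TypeScript, assuming Node 18+ for the built-in `fetch` and the default port 1234) asks LM Studio's OpenAI-compatible `/v1/models` endpoint for the models it can serve:

```ts
// Sanity check: list the models the LM Studio local server exposes.
// Assumes the server is running on the default port 1234.
async function listModels(): Promise<void> {
  const response = await fetch("http://localhost:1234/v1/models");
  if (!response.ok) {
    throw new Error(`Server responded with ${response.status}`);
  }
  const body = await response.json();
  // Each entry's `id` is a model identifier the server can serve.
  console.log(body.data.map((model: { id: string }) => model.id));
}

listModels().catch(console.error);
```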

5. Go to Obsidian > BMO Chatbot > REST API Connection > REST API URL and insert the server URL (e.g. http://localhost:1234/v1). NOTE: The REST API URL uses the /chat/completions endpoint.

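For reference, a request against that URL looks roughly like the sketch below. The payload is illustrative and the `local-model` id is a placeholder (use an id returned by `/v1/models`); BMO Chatbot builds its own requests internally, so this is only to show how the base URL and the `/chat/completions` endpoint fit together:

```ts
// Illustrative chat completion request against the LM Studio local server.
// "local-model" is a placeholder; use an id returned by /v1/models.
async function chat(prompt: string): Promise<string> {
  const response = await fetch("http://localhost:1234/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "local-model",
      messages: [{ role: "user", content: prompt }],
      temperature: 0.7,
    }),
  });
  const data = await response.json();
  return data.choices[0].message.content;
}

chat("Hello from BMO!").then(console.log).catch(console.error);
```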

6. Select your model and start chatting.

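If you are curious what streaming looks like (and why CORS needs to be on), the same endpoint can be called with `stream: true`, in which case the reply arrives as server-sent events. A hypothetical sketch, again assuming Node 18+ and the default port:

```ts
// Streaming variant: tokens arrive incrementally as "data: {...}" SSE lines,
// terminated by "data: [DONE]". "local-model" is a placeholder id.
async function streamChat(prompt: string): Promise<void> {
  const response = await fetch("http://localhost:1234/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "local-model",
      messages: [{ role: "user", content: prompt }],
      stream: true,
    }),
  });
  const reader = response.body!.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // Print raw SSE chunks as they arrive.
    process.stdout.write(decoder.decode(value));
  }
}

streamChat("Hello again!").catch(console.error);
```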