
Feature Request - LM Studio Streaming #37

Closed
ckep1 opened this issue Jan 15, 2024 · 2 comments

ckep1 commented Jan 15, 2024

Love this plugin so far! Streaming works well with Ollama, but I'd love to use it with LM Studio to get streaming with local LLMs on Windows too. I've found LM Studio's server setup to be much easier than Ollama's: it offers more model choices and runs on Windows.

Since LM Studio exposes an OpenAI-style API, I've been able to get it working with both the LocalAI and OpenAI base URL settings under the Advanced tab. This works, but I have to wait for inference to finish before I get any output from LM Studio.

Streaming does work otherwise; I believe all the API request needs is `"stream": true` added on the LM Studio side.
(screenshot attached)
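For reference, here's roughly what the streaming call looks like against LM Studio's OpenAI-compatible endpoint. This is just a minimal sketch, assuming the default local server at `http://localhost:1234/v1`; the `streamChat` function, `onToken` callback, and model name are illustrative, not the plugin's actual code. The key difference from a plain completion call is setting `"stream": true` and reading the body as server-sent-event chunks instead of one JSON object:

```ts
// Sketch: streaming chat completion from LM Studio's OpenAI-compatible server.
// Assumes the default local endpoint; adjust the URL to your server settings.
async function streamChat(prompt: string, onToken: (token: string) => void): Promise<void> {
  const response = await fetch("http://localhost:1234/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "local-model", // illustrative; LM Studio serves whichever model is loaded
      messages: [{ role: "user", content: prompt }],
      stream: true, // the flag that switches the server to chunked SSE output
    }),
  });
  if (!response.ok || !response.body) {
    throw new Error(`Request failed: ${response.status}`);
  }

  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // Each SSE event arrives as a line of the form `data: {...json...}`.
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? ""; // keep any incomplete trailing line for the next chunk
    for (const line of lines) {
      const data = line.replace(/^data: /, "").trim();
      if (!data || data === "[DONE]") continue;
      const delta = JSON.parse(data).choices?.[0]?.delta?.content;
      if (delta) onToken(delta); // emit tokens as they arrive
    }
  }
}
```

Usage would be something like `streamChat("Hello!", (token) => appendToChatView(token))`, appending each token to the chat view as it streams in rather than waiting for the full completion.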

I'm not sure how much work is needed to handle the streaming within the plugin itself. I tried making this change quickly to the API calls in a fork, but completions and streaming seem to be configured differently, so more work would be required.

For reference, Obsidian Copilot has this implemented.

Thanks for the plugin - great work!

longy2k (Owner) commented Jan 16, 2024

Let me know if you can get LM Studio to stream on v1.8.2, thanks!

https://github.com/longy2k/obsidian-bmo-chatbot/wiki/How-to-setup-with-LM-Studio

ckep1 (Author) commented Jan 16, 2024

Amazing! Got it working with streaming!

Thank you for the super fast response here; great new update overall.

Had no expectation this would be working anytime soon, let alone same day. Cheers!

ckep1 closed this as completed Jan 16, 2024