
Look into local LLMs support #98

Open
minimaxir opened this issue Dec 2, 2023 · 2 comments
Comments

@minimaxir
Owner

It appears local LLMs are now at a state where they're easy enough for the average user to run, and I just got an M3 Pro MacBook Pro for testing.

The servers packaged with apps like LM Studio follow OpenAI's API spec, which means they should work with simpleaichat, aside from a few warnings about unsupported config parameters.
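As a rough sketch of why this should work: an OpenAI-compatible server accepts the same chat-completions request body the OpenAI API does, so a client mostly needs a different base URL. The helper below just assembles that payload shape; the `build_payload` function, the default model name, and the base-URL comment are illustrative assumptions, not simpleaichat's actual API.

```python
# Sketch: the JSON body an OpenAI-compatible chat-completions endpoint
# (POST {base_url}/chat/completions, e.g. a local server's /v1 route)
# expects. A local server can accept the same shape; unsupported
# sampling parameters may simply be ignored or warned about.

def build_payload(system: str, user: str, model: str = "local-model",
                  temperature: float = 0.7) -> dict:
    """Assemble a chat-completions request body.

    Illustrative helper only -- not part of simpleaichat.
    """
    return {
        "model": model,  # local servers often ignore or override this
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": user},
        ],
        "temperature": temperature,
    }

payload = build_payload("You are a helpful assistant.", "Hello!")
```

The point is that only the transport target changes; the request and response shapes stay the same, which is why a single base-URL override could cover most local servers.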

@hugokoopmans

Can you let me know if I can use this tool to talk to GPT4all.io in server mode? (It provides an OpenAI-compatible API.)

@minimaxir
Owner Author

Not currently, but that functionality would theoretically be covered by this change.
