
[feature] LocalAI endpoint #92

Open
Tracked by #206
nalbion opened this issue Sep 13, 2023 · 5 comments
Labels: cost · enhancement (New feature or request) · privacy

Comments

@nalbion
Contributor

nalbion commented Sep 13, 2023

I don't have any experience with it, but LocalAI might be more attractive than a hosted API for people working in environments where sending source code out over the internet is frowned upon.

LocalAI is a drop-in-replacement REST API, compatible with the OpenAI API specification, for local inferencing. It lets you run LLMs (and more) locally or on-prem on consumer-grade hardware, supporting multiple model families compatible with the ggml format. It does not require a GPU.

  • Text generation with llama.cpp, gpt4all.cpp and more
  • OpenAI functions
  • Embeddings generation for vector databases
  • Download models directly from Hugging Face
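As a rough sketch of what "drop-in replacement" means in practice (the Docker image tag, port, and model name below are assumptions, not taken from this thread), starting LocalAI and hitting its OpenAI-compatible endpoint might look like:

```shell
# Start LocalAI on port 8080 (assumed image name; CPU-only is fine, no GPU needed)
docker run -p 8080:8080 --name local-ai -ti localai/localai:latest

# In another terminal: issue a chat completion against the OpenAI-compatible API.
# The model name must match whatever your LocalAI instance has configured.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "ggml-gpt4all-j",
        "messages": [{"role": "user", "content": "Hello"}]
      }'
```

Because the request and response shapes mirror OpenAI's, existing OpenAI client code should work unchanged once it is pointed at this base URL.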

See also #69 by @mrgoonie

@nalbion nalbion changed the title LocalAI endpoint [feature] LocalAI endpoint Sep 15, 2023
@nalbion nalbion added the enhancement New feature or request label Sep 28, 2023
@pryh4ck

pryh4ck commented Oct 1, 2023

How do I utilize local.ai?

@jmikedupont2

I have a fake OpenAI endpoint and some basic lollms running: https://twitter.com/introsp3ctor/status/1708301615143256165?t=fuwFrgIAd7uZixTVZKfK6A&s=19

@nalbion
Contributor Author

nalbion commented Oct 11, 2023

This may already be supported. Apparently you just need to set OPENAI_API_BASE=http://localhost:8080 and run LocalAI.
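The suggestion above can be sketched in Python. The helper below only assembles the request URL and JSON body against whatever base URL `OPENAI_API_BASE` points at, defaulting to the `http://localhost:8080` from the comment; actually sending it would require a running LocalAI instance, and the default model name is a placeholder assumption.

```python
import json
import os


def build_chat_request(prompt: str, model: str = "ggml-gpt4all-j"):
    """Build an OpenAI-style chat-completion request aimed at a LocalAI server.

    The base URL comes from OPENAI_API_BASE, the same variable the comment
    above says to set; the default model name is just a stand-in for whatever
    model your LocalAI instance actually serves.
    """
    base = os.environ.get("OPENAI_API_BASE", "http://localhost:8080").rstrip("/")
    url = f"{base}/v1/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body


# With OPENAI_API_BASE unset, the request targets http://localhost:8080.
url, body = build_chat_request("Hello")
```

Sending `body` to `url` with any HTTP client (or pointing the official OpenAI SDK at the same base URL) should then route the request to LocalAI instead of api.openai.com.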

(screenshot attached in original issue)

@pryh4ck

pryh4ck commented Oct 12, 2023 via email

@TomLucidor

@jmikedupont2 sorry, just found you through search. What are the feature differences between PrivateGPT, LocalAI, and LOLLMS? Also, are Oobabooga and Koboldcpp/KoboldAI/SillyTavern just plain LLM access points without many features beyond that?


4 participants