
Using a local LLM with LMStudio but it requires embeddings, any embedding model or a specific one? #395

Closed
Mayorc1978 opened this issue Mar 15, 2024 · 5 comments

Mayorc1978 commented Mar 15, 2024

Using Docker and setting the BASE_URL to my host LLM running inside LMStudio, I was able to make it work, at least partially.

[screenshot: the model responding correctly in LMStudio]

But it stops working the moment it requests embeddings:
[2024-03-15 22:08:09.682] [ERROR] Unexpected endpoint or method. (POST /v1/embeddings). Returning 200 anyway

So, does it look for a specific embedding model, or any available model? Because in the latter case it could work with Ollama, since Ollama serves embedding models; nomic-embed-text, for instance, is fast and outperforms text-embedding-ada-002 and text-embedding-3-small on short- and long-context tasks.

The Ollama embeddings endpoint is usually accessed like this:

curl http://localhost:11434/api/embeddings -d '{
  "model": "nomic-embed-text",
  "prompt": "The sky is blue because of Rayleigh scattering"
}'
Mayorc1978 (Author) commented
Tested with nomic-embed-text and Ollama and works as intended.


hirowa commented Apr 15, 2024

Hey @Mayorc1978, where can I find the file or configuration to use it with Ollama?
Hope you have a fantastic day!


arsaboo commented May 5, 2024

@Mayorc1978, can you please share the changes you made to make it work with Ollama? Thanks!

zx9597446 commented
Changing embeddings.py like this works for me:

    # Get the Ollama host and embedding model from environment variables,
    # with defaults suitable for running inside Docker
    ollama_host = os.getenv("OLLAMA_HOST", "http://host.docker.internal:11434")
    ollama_embedding_model = os.getenv("OLLAMA_EMBEDING_MODEL", "nomic-embed-text")

    _embeddings = OllamaEmbeddings(model=ollama_embedding_model, base_url=ollama_host)
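For anyone wiring this up, the env-var fallback logic from the snippet above can be checked in isolation, without a running Ollama server. This is a minimal sketch; the variable names OLLAMA_HOST and OLLAMA_EMBEDING_MODEL (spelling as in the snippet) and the defaults are taken from that snippet, and the helper function name is just for illustration:

```python
import os

def resolve_ollama_config(env=None):
    """Resolve the Ollama endpoint and embedding model, falling back to
    the same defaults as the embeddings.py change above."""
    if env is None:
        env = os.environ
    host = env.get("OLLAMA_HOST", "http://host.docker.internal:11434")
    model = env.get("OLLAMA_EMBEDING_MODEL", "nomic-embed-text")
    return host, model

# With no overrides set, the defaults come back:
host, model = resolve_ollama_config(env={})
print(host, model)
```

When running the app in Docker, host.docker.internal lets the container reach an Ollama server listening on the host machine; override OLLAMA_HOST if Ollama runs elsewhere.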


arsaboo commented May 22, 2024

@zx9597446 that will only change the embedding model, if it changes anything at all.
