Local Embedding Issues #351
Can you elaborate? Are you trying to use embedding models from Huggingface, or via Ollama (if so, I wasn't aware this is possible)?
I am just finding out that Ollama used to support this but now doesn't (ollama/ollama#834); this is where my confusion came from.
I think I was wrong entirely... I see you have the same sentence-transformer model integrated with langroid (cool); I think that actually worked. What happens for me is that chat-multi-extract seems to extract the doc, create the vector store, etc., but in the end it only reads the example data. I am trying this all locally, so it could be my model setup, environment variables, etc.
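For reference, the embedding model itself seems fine when I call it directly with the sentence-transformers library (the model name here is just the one I happen to be testing with, not necessarily what langroid uses by default):

```python
from sentence_transformers import SentenceTransformer

# Load a local embedding model; sentence-transformers downloads/caches it.
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# Embed a couple of test sentences and check the output shape.
vectors = model.encode(["hello world", "testing local embeddings"])
print(vectors.shape)  # (2, 384) for this particular model
```

So the model itself embeds fine locally; my issue seems to be further along the pipeline.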
Didn't understand what you meant by this.
@TheBitmonkey closing this, let me know if we need to revisit. |
Hi, super module, good sir.
I am having some difficulties getting embeddings working on my end (could be user error). Earlier I was using LM Studio, but it does not have an embedding endpoint. I now have things running with Ollama through litellm, as before, but I believe embedding is still not working; a sketch of roughly what I'm attempting is below.
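Roughly what I'm attempting, in litellm terms (the model name and api_base are just my local setup, so treat them as placeholders; I'm not certain this is even the right way to reach Ollama embeddings through litellm):

```python
import litellm

# Ask litellm to route an embedding request to a local Ollama server.
# The model name and api_base are placeholders for my local setup.
response = litellm.embedding(
    model="ollama/nomic-embed-text",
    input=["hello world"],
    api_base="http://localhost:11434",
)

# litellm mirrors the OpenAI embeddings response format.
print(len(response.data[0]["embedding"]))
```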
When I look at the OpenAIEmbeddingsConfig class in the embedding models folder, I can't see any reference to an api_base variable; it seems to assume direct use of the OpenAI API. I think that is where my issue is, and I can't get a workaround to function on my end. Any help would be appreciated. Cheers.
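What I'd like is to be able to point the embeddings call at a local OpenAI-compatible endpoint, the way a recent openai client allows with a custom base URL. Something along these lines (the URL is just my local litellm/LM Studio proxy and the model name is whatever the local server exposes, so both are placeholders):

```python
from openai import OpenAI

# Point the OpenAI-compatible client at a local server instead of api.openai.com.
# base_url and model name are placeholders for a local proxy setup.
client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="not-needed-locally",
)

response = client.embeddings.create(
    model="local-embedding-model",
    input=["hello world"],
)
print(len(response.data[0].embedding))
```

If OpenAIEmbeddingsConfig exposed an equivalent setting (or honored an environment variable for the base URL), I think my setup would work.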