Bug after the openai update in Embedding #13051
🤖 Hello @SGA-Yuze-Li, I'm Dosu, a bot designed to assist you with the LangChain repository: navigating bugs, answering questions, and guiding you if you'd like to contribute. While we wait for a human maintainer, I'm looking at your issue right now and will provide a more comprehensive response shortly.
I've had this exact issue as well. #13050
🤖 The error you're encountering is due to a recent change in the […]. To resolve this issue, you need to update the way you're calling the […]. Here's how you can do it:

```python
class SagemakerEndpointEmbeddings(Embeddings):
    # ...
    def embed_query(self, query: str) -> List[float]:
        input = self.prepare_input(query)
        return self.embedding_function(input)

    def embed_documents(self, documents: List[str]) -> List[List[float]]:
        input = self.prepare_input(documents)
        return self.embedding_function(input)
```

In the […]. Also, you might need to adjust the […]. Please refer to the following sources for more information:
I hope this helps! If you have any further questions or if something is unclear, please let me know. Sources: […]
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
Same error on Colab with openai pinned to […]
Yes. I pinned to 0.28.1 as well; it didn't help.
The problem seems to appear with the chroma update to 0.4.16. Pinning to the previous version seems to help.
LangChain still implements the old interface. See: https://docs.trychroma.com/migration#migration-to-0416---november-7-2023
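For anyone adapting their own embedding code, here is a minimal sketch of what the migration linked above asks for: chromadb >= 0.4.16 requires the embedding function's `__call__` to be declared with exactly `(self, input)`. The class and variable names below are hypothetical, not from chromadb or LangChain:

```python
# Hedged sketch: adapting a plain texts -> vectors callable to the
# post-0.4.16 chromadb embedding-function interface, which requires
# __call__ to be declared as (self, input).
from typing import Callable, List

Vector = List[float]

class NewStyleEmbeddingFunction:
    """Wraps any List[str] -> List[Vector] callable so that its
    __call__ signature matches what chromadb >= 0.4.16 inspects."""

    def __init__(self, embed: Callable[[List[str]], List[Vector]]):
        self._embed = embed

    def __call__(self, input: List[str]) -> List[Vector]:
        # The parameter must literally be named 'input'; chromadb
        # checks the declared signature, not just the behavior.
        return self._embed(input)
```

An instance of such a wrapper can then be handed to chromadb wherever an embedding function is expected.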
Worked perfectly. Thank you |
Thanks a lot!!! |
Thanks a lot! Worked! |
My hero! |
LangChain 0.0.332 was released with the Chroma team's fix for compatibility with chromadb>=0.4.16. You can now use the latest versions of both.
I still hit the problem even though all the libraries are at the latest version. Here is my code: […]
And here is the error info: […]
I'm getting this bug with: […]
@ScottXiao233 - I tried reproducing, but your code works for me. If you're using a custom openai embeddings endpoint, maybe make sure it works with the new SDK on its own? @TimCoates - This also might be something with your local embeddings setup. Maybe submit a new issue specific to GPT4AllEmbeddings? It's unclear to me why you'd see this using something other than OpenAI.
@efriis, it also doesn't work for me with this code example: […]
@afedotov-align - looks like you might be mixing LangChain […]
Thanks. Rather than using .from_documents, I did it like so: […]
Thank you :') |
This issue should still be open.
@varayush007 Try defining a custom class that sub-classes […].
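A minimal sketch of what such a custom class could look like. This assumes the wrapped object exposes LangChain-style `embed_documents`/`embed_query` methods; the class name and attribute are hypothetical, not part of either library:

```python
# Hedged sketch: an adapter that keeps LangChain's Embeddings API
# (embed_documents / embed_query) while also satisfying chromadb
# >= 0.4.16's requirement that __call__ take a single 'input' argument.
from typing import List

class ChromaCompatibleEmbeddings:
    def __init__(self, embeddings):
        # Assumed: any object with embed_documents / embed_query,
        # e.g. a LangChain Embeddings implementation.
        self._embeddings = embeddings

    def embed_documents(self, texts: List[str]) -> List[List[float]]:
        return self._embeddings.embed_documents(texts)

    def embed_query(self, text: str) -> List[float]:
        return self._embeddings.embed_query(text)

    def __call__(self, input: List[str]) -> List[List[float]]:
        # chromadb calls the wrapper directly with 'input'.
        return self.embed_documents(input)
```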
Downgrading my chromadb version to chromadb==0.4.15 worked for me.
System Info
AWS Sagemaker DataScience3.0 Image.
Reproduction
Here is my code; it worked before Nov 7th:
Chroma.from_documents(documents=document, embedding=embeddings)
Then I get this error:
ValueError: Expected EmbeddingFunction.__call__ to have the following signature: odict_keys(['self', 'input']), got odict_keys(['self', 'args', 'kwargs']) Please see https://docs.trychroma.com/embeddings for details of the EmbeddingFunction interface. Please note the recent change to the EmbeddingFunction interface: https://docs.trychroma.com/migration#migration-to-0416---november-7-2023
Expected behavior
Does anyone know how to fix this?
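The ValueError above comes from a signature check chromadb runs on the embedding function before using it. The following is an approximation of that check (not chromadb's actual source) showing why a wrapper whose `__call__` takes `*args, **kwargs` is rejected while one declared with `(self, input)` passes:

```python
# Approximation of the validation chromadb >= 0.4.16 performs:
# the declared parameters of __call__ must be exactly (self, input).
import inspect

def check_embedding_function(ef) -> None:
    params = list(inspect.signature(type(ef).__call__).parameters)
    if params != ["self", "input"]:
        raise ValueError(
            "Expected EmbeddingFunction.__call__ to have the following "
            f"signature: odict_keys(['self', 'input']), got {params}"
        )

class OldStyle:
    def __call__(self, *args, **kwargs):  # pre-0.4.16 style: rejected
        return [[0.0]]

class NewStyle:
    def __call__(self, input):  # post-0.4.16 style: accepted
        return [[0.0] for _ in input]
```

This is why pinning chromadb below 0.4.16, or upgrading LangChain to a release that declares `__call__(self, input)`, both make the error go away.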