langchain-python-rag-privategpt "Cannot submit more than 5,461 embeddings at once" #4476
Comments
It may be yet another subcomponent issue. With v0.1.38, the langchain version is 0.0.274; it does not use e.g. langchain_community.
As a workaround, I've updated all components. This is not recommended, because it usually creates more side effects and makes issues harder to reproduce.
Afterwards, the following langchain packages are installed:
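For reproducibility, one way to list the locally installed langchain- and chromadb-related versions is a short snippet like this (the package-name prefixes are only an assumption about which distributions matter here):

```python
from importlib import metadata

# List installed distributions whose names look langchain- or chromadb-related.
# The prefix filter is an assumption; extend it if other packages are relevant.
for dist in metadata.distributions():
    name = (dist.metadata["Name"] or "").lower()
    if name.startswith(("langchain", "chromadb")):
        print(f"{dist.metadata['Name']}=={dist.version}")
```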
Same issue with v0.1.39. Luckily the workaround works, with Nvidia drivers 552 (see #4563). Edited June 5th: same with v0.1.41.
With chromadb==0.4.7, ingest.py still fails with `Cannot submit more than 5,461 embeddings at once. Please submit your embeddings in batches of size 5,461 or less.` See:
- ollama#4476
- ollama#2572
- ollama#533
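The error message itself points at the usual workaround: submit the records in chunks no larger than the reported limit. A minimal sketch at the raw chromadb layer, where the collection name, the hard-coded batch size, and the data variables are assumptions rather than code from `ingest.py`:

```python
import chromadb

# 5461 is lifted from the error message above; treat it as an assumed upper bound.
BATCH_SIZE = 5461

client = chromadb.PersistentClient(path="db")
collection = client.get_or_create_collection("privategpt")  # name is illustrative

def add_in_batches(ids, documents, metadatas, embeddings):
    """Submit records in chunks so no single call exceeds the batch limit."""
    for start in range(0, len(ids), BATCH_SIZE):
        end = start + BATCH_SIZE
        collection.add(
            ids=ids[start:end],
            documents=documents[start:end],
            metadatas=metadatas[start:end],
            embeddings=embeddings[start:end],
        )
```

The real limit depends on the installed chromadb build; the constant here just mirrors the figure the error reports.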
This should be fixed in #5139.
What is the issue?
In langchain-python-rag-privategpt, there is a bug, 'Cannot submit more than x embeddings at once', which has already been reported in various constellations; most recently, see #2572.
Now with Ollama version 0.1.38, the chromadb version has already been updated to 0.4.7, but the `max_batch_size` calculation still seems to cause problems; see the upstream issue chroma-core/chroma#2181. Meanwhile, is there a workaround for Ollama?
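As a first check, it may help to confirm which batch limit the installed chromadb actually reports, since chroma-core/chroma#2181 is about that value being computed too low. A small diagnostic, assuming the client exposes `max_batch_size` the way recent chromadb releases do (the `getattr` guard covers versions that do not):

```python
import chromadb

print("chromadb version:", chromadb.__version__)

client = chromadb.PersistentClient(path="db")  # path is illustrative
# Recent chromadb clients expose max_batch_size; fall back gracefully otherwise.
print("reported max_batch_size:", getattr(client, "max_batch_size", "not exposed"))
```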
OS
WSL2
GPU
Nvidia
CPU
Intel
Ollama version
0.1.38
Research findings
In `ingest.py`, in `def main()`, I've modified the `else` condition as follows, but it didn't help (same issue).
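For reference, a sketch of the kind of batched `else` branch that gets suggested for privateGPT's `ingest.py`; the names (`texts`, `embeddings`, `persist_directory`, `BATCH_SIZE`) mirror the usual `ingest.py` structure, but this is an assumption-laden illustration, not the modification referenced above:

```python
# Illustrative only: create the Chroma store empty, then add documents in
# chunks that stay below the assumed batch limit instead of one bulk insert.
from langchain.vectorstores import Chroma

BATCH_SIZE = 5461  # assumed limit, taken from the error message

def create_vectorstore_in_batches(texts, embeddings, persist_directory):
    """Create an empty Chroma store, then add documents in sub-limit chunks."""
    db = Chroma(embedding_function=embeddings, persist_directory=persist_directory)
    for start in range(0, len(texts), BATCH_SIZE):
        db.add_documents(texts[start:start + BATCH_SIZE])
    return db
```

Even with batching, whether the insert succeeds depends on the `max_batch_size` value the installed chromadb computes, which is exactly what chroma-core/chroma#2181 tracks.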