
joanfm / jina-embeddings-v2-base-en and -de fail with error code 500 #4425

Open
qsdhj opened this issue May 14, 2024 · 4 comments
Labels
bug Something isn't working

qsdhj commented May 14, 2024

What is the issue?

I tried to integrate the German embedding model joanfm/jina-embeddings-v2-base-de into my LlamaIndex RAG application. During the creation of the embeddings, the Ollama process fails with error 500: llama runner process has terminated: exit status 0xc0000409.

When calling:

pass_embedding = Settings.embed_model.get_text_embedding_batch(
    ["This is a passage!", "This is another passage"], show_progress=True
)
ValueError                                Traceback (most recent call last)
Cell In[16], line 2
      1 # Test the embedding model
----> 2 pass_embedding = Settings.embed_model.get_text_embedding_batch(
      3     ["This is a passage!", "This is another passage"], show_progress=True
      4 )
      5 print(pass_embedding)
      7 query_embedding = Settings.embed_model.get_query_embedding("Where is blue?")

File c:\Users\Stefan.Mueller\AppData\Local\miniconda3\envs\llamaindex\Lib\site-packages\llama_index\core\instrumentation\dispatcher.py:274, in Dispatcher.span.<locals>.wrapper(func, instance, args, kwargs)
    270 self.span_enter(
    271     id_=id_, bound_args=bound_args, instance=instance, parent_id=parent_id
    272 )
    273 try:
--> 274     result = func(*args, **kwargs)
    275 except BaseException as e:
    276     self.event(SpanDropEvent(span_id=id_, err_str=str(e)))

File c:\Users\Stefan.Mueller\AppData\Local\miniconda3\envs\llamaindex\Lib\site-packages\llama_index\core\base\embeddings\base.py:331, in BaseEmbedding.get_text_embedding_batch(self, texts, show_progress, **kwargs)
    322 dispatch_event(
    323     EmbeddingStartEvent(
    324         model_dict=self.to_dict(),
    325     )
    326 )
...
    100     )
    102 try:
    103     return response.json()["embedding"]
With mxbai-embed-large:latest this works without an error.
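
For reference, the embedding model is wired into LlamaIndex roughly like this (a minimal sketch; the base_url and exact wiring are assumptions based on the default llama-index-embeddings-ollama setup, not copied verbatim from my application):

# Sketch of the assumed setup: register the Ollama-served Jina model as the
# global LlamaIndex embedding model. base_url is the assumed default endpoint.
from llama_index.core import Settings
from llama_index.embeddings.ollama import OllamaEmbedding

Settings.embed_model = OllamaEmbedding(
    model_name="joanfm/jina-embeddings-v2-base-de",  # the model that fails
    base_url="http://localhost:11434",               # assumed default Ollama endpoint
)

# The call from the traceback above then fails with the 500 error.
pass_embedding = Settings.embed_model.get_text_embedding_batch(
    ["This is a passage!", "This is another passage"], show_progress=True
)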

OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.1.37

qsdhj added the bug label on May 14, 2024
@thinkverse

Ollama doesn't currently support Jina Embeddings v2. It should be supported once #4414 is merged, so you'd likely have to wait for the next Ollama release or build from source after the PR has been merged.


JoanFM commented May 14, 2024

hey @qsdhj,

Indeed, Ollama needs to update its llama.cpp dependency and release a new version before Jina Embeddings V2 becomes available.

I created and tested those models by building Ollama manually.


qsdhj commented May 15, 2024

hey @JoanFM,

Thanks for your reply.
Do you, or anyone else here, know what the status of batch processing of embeddings with Ollama is?
Without it, the feature is useless for my intended use.
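
For now, the only fallback I can think of is sending one text per request. A minimal sketch, assuming Ollama's /api/embeddings endpoint accepts a single prompt per call and returns a single embedding (which matches the response.json()["embedding"] line in the traceback above); the endpoint URL, model tag, and helper name are placeholders:

# Workaround sketch: embed texts one request at a time instead of in one batch.
# Assumes a local Ollama server; URL and model tag are placeholders.
import requests

OLLAMA_URL = "http://localhost:11434/api/embeddings"
MODEL = "joanfm/jina-embeddings-v2-base-de"

def embed_one_by_one(texts):
    embeddings = []
    for text in texts:
        resp = requests.post(OLLAMA_URL, json={"model": MODEL, "prompt": text})
        resp.raise_for_status()  # surfaces the 500 as an HTTPError
        embeddings.append(resp.json()["embedding"])
    return embeddings

pass_embedding = embed_one_by_one(["This is a passage!", "This is another passage"])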


JoanFM commented May 15, 2024

Hey @qsdhj,

I am not sure about this.
