
Ollama: embedding_only option has been removed from the Ollama API #503

Closed
ThomasVitale opened this issue Mar 24, 2024 · 2 comments · Fixed by #504

Comments

ThomasVitale (Contributor) commented Mar 24, 2024

Bug description
The embedding_only option is no longer part of the Ollama API. It was removed because it was never actually used: ollama/ollama#2848. When this option is included in a request, Ollama returns an error. Current API spec: https://github.com/ollama/ollama/blob/main/api/types.go#L87.
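To illustrate the failure mode, here is a minimal sketch (a hypothetical helper, not part of Spring AI or the Ollama client) of stripping options that the Ollama API no longer accepts before building the JSON request body:

```python
# Hypothetical sketch: drop option keys that the Ollama API has removed
# (e.g. "embedding_only", removed in ollama/ollama#2848) so the request
# body no longer triggers an "invalid option" error on newer servers.

REMOVED_OPTIONS = {"embedding_only"}  # assumption: the only removed key relevant here

def sanitize_options(options: dict) -> dict:
    """Return a copy of `options` without keys Ollama has removed."""
    return {k: v for k, v in options.items() if k not in REMOVED_OPTIONS}

request_options = {"temperature": 0.7, "embedding_only": True}
print(sanitize_options(request_options))  # → {'temperature': 0.7}
```

The fix in #504 takes the cleaner route of deleting the field from the options class altogether, so no runtime filtering is needed.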

Environment
Spring AI Snapshot
Ollama 0.1.29

Expected behavior
The embedding_only option should be removed from https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-ollama/src/main/java/org/springframework/ai/ollama/api/OllamaOptions.java

PR: #504

tzolov (Collaborator) commented Mar 24, 2024

Thanks for the fix @ThomasVitale ,

Unfortunately, Ollama's options management and documentation look like a complete mess.
It is not clear which properties apply only to chat, only to embedding, or to both. For example: ollama/ollama#2349
Also, half of the defined options have no documentation: https://docs.spring.io/spring-ai/reference/1.0-SNAPSHOT/api/chat/ollama-chat.html#_chat_properties

So, issues like this will continue to pop up until we figure out how to fix our Ollama ITs caching. @eddumelendez, we need to give #322 another try.

ThomasVitale (Contributor, Author) commented

@tzolov you're welcome! I caught this error in another project where I run integration tests with the Testcontainers Ollama module, using images I maintain that ship with pre-built models for convenience and to keep the pipelines simple. They are rebuilt weekly with the latest Ollama and model versions, and they are multi-arch: https://github.com/ThomasVitale/llm-images
