Is your feature request related to a problem? Please describe.
Is it possible to use an Ollama embedding model for plugin selection while using an OpenAI model for the agents?
See my config file below:
{
"llm.api_base": "https://xxx.openai.azure.com/",
"llm.api_key": "xxx",
"llm.api_type": "azure",
"llm.api_version": "2023-07-01-preview",
"llm.model": "gpt-4",
"llm.response_format": null,
"llm.embedding_api_type": "ollama",
"llm.embedding_model": "nomic-embed-text:latest",
"code_generator.enable_auto_plugin_selection": true,
"code_generator.auto_plugin_selection_topk": 2,
"execution_service.kernel_mode": "local",
"planner.prompt_compression": true,
"code_generator.prompt_compression": true
}
I thought I would also need to provide the Ollama API endpoint in the config, but I can't find the relevant setting.
Describe the solution you'd like
Be able to use an Ollama model for automatic plugin selection while at the same time using OpenAI models for the agents.
I didn't try it myself, but I think the embedding model is configured separately from the model used by the agent roles, so you should be able to do what you want.
You can take a look at the file taskweaver/llm/ollama.py and the config class OllamaServiceConfig. Given that you have already configured the other settings in your example above, I think you need to add at least the following one:
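A minimal sketch of what the extra entry might look like, assuming the Ollama endpoint key follows the llm.ollama.* naming read by OllamaServiceConfig and that Ollama runs locally on its default port (both assumptions, adjust to your setup):
{
"llm.embedding_api_type": "ollama",
"llm.embedding_model": "nomic-embed-text:latest",
"llm.ollama.api_base": "http://localhost:11434"
}
The exact key name may differ, so please check the fields that OllamaServiceConfig actually reads in taskweaver/llm/ollama.py.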