Configuration for Cognitive Services Azure OpenAI #137
Conversation
Currently blocked by langchain-ai/langchain#1560
Force-pushed from 69bb8f7 to 12c2f71.
core/cat/factory/embedder.py (outdated)

```diff
@@ -38,6 +38,22 @@ class Config:
         "description": "Configuration for OpenAI embeddings",
     }


+class EmbedderAzureOpenAIConfig(EmbedderSettings):
+    openai_api_key: str
+    model_name: str
```
@soferreira in the future we will not use `langchain.embeddings.FakeEmbeddings` but the correct embeddings. Could you check this list? I am not sure it is correct to have both `model_name` and `deployment_name`. Thanks
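For context on the `model_name` vs `deployment_name` question: on Azure OpenAI, a deployment wraps a specific model, so the two names identify different things and can legitimately coexist. A minimal standalone sketch (the field comments and example values are illustrative assumptions, not part of this PR):

```python
from pydantic import BaseModel


class EmbedderAzureOpenAIConfig(BaseModel):
    openai_api_key: str
    model_name: str       # the underlying model, e.g. "text-embedding-ada-002"
    deployment_name: str  # the Azure deployment that serves that model
```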
```python
        # model_name: '....'  # TODO: allow optional kwargs
    }
)
if using_openai_llm in [OpenAI, OpenAIChat]:
```
@pieroit should I reverse the logic here? Check for `AzureOpenAI` and use the `EmbedderAzureOpenAIConfig` embedder, and then have a catch-all `else`?
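A sketch of the reversed control flow being asked about (the class names follow the diff above, while `get_embedder_from_config` and `get_default_embedder` are hypothetical placeholders mirroring the factory's `get_llm_from_config`):

```python
from langchain.llms import AzureOpenAI, OpenAI, OpenAIChat

# Hypothetical sketch: check for the Azure LLM first, then plain OpenAI,
# then fall through to a catch-all default embedder.
if type(cat.llm) is AzureOpenAI:
    embedder = EmbedderAzureOpenAIConfig.get_embedder_from_config(config)
elif type(cat.llm) in [OpenAI, OpenAIChat]:
    embedder = EmbedderOpenAIConfig.get_embedder_from_config(config)
else:
    embedder = get_default_embedder()  # hypothetical catch-all
```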
@zioproto @soferreira I made some refactoring to avoid confusion between the default Cat behaviour and plugin extensions. If you look into `cat/mad_hatter/core_plugin` you will find some hooks that define how the Cat loads the LLM and embedder. You can override the `get_language_model` and `get_language_embedder` hooks to always have the Cat load a specific LLM/embedder. In `cat/plugins/myplugin/myfile.py`:
```python
import os

from cat.factory.llm import LLMAzureOpenAIConfig  # import paths assumed
from cat.mad_hatter.decorators import hook


@hook
def get_language_model(cat):
    # look here in the DB, or totally skip it
    config = {
        "openai_api_key": os.environ["SOME_ENV_VAR"],
        "api_base": "something",
        "api_type": "azure",
        "api_version": "2022-12-01",
    }
    return LLMAzureOpenAIConfig.get_llm_from_config(config)
```
From now on your hook will be executed instead of the default one.
Hope this is intuitive enough; feedback is welcome.
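For symmetry, a sketch of the matching embedder hook mentioned above (the config keys and `get_embedder_from_config` mirror the LLM example and are assumptions, not confirmed API):

```python
import os

from cat.factory.embedder import EmbedderAzureOpenAIConfig  # import path assumed
from cat.mad_hatter.decorators import hook


@hook
def get_language_embedder(cat):
    # hypothetical mirror of the LLM hook: return an embedder instead
    config = {
        "openai_api_key": os.environ["SOME_ENV_VAR"],
        "api_base": "something",
        "api_type": "azure",
        "api_version": "2022-12-01",
    }
    return EmbedderAzureOpenAIConfig.get_embedder_from_config(config)
```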
Force-pushed from 8fcf0a1 to e00a8fb.
@pieroit LGTM, feel free to merge
@pieroit I pushed a new commit to avoid using the fake embeddings. I had to upgrade LangChain to 0.0.155 to include this change: langchain-ai/langchain#3711
This uses the fake embedder until the fix for langchain-ai/langchain#1560 lands.
Do not use the fake embeddings anymore. Requires LangChain 0.0.155 because it contains langchain-ai/langchain#3711.
Force-pushed from 4ef1a79 to b9c660b.
```diff
 # Embedding LLM
-using_openai_llm = type(cat.llm) in [OpenAI, OpenAIChat]
+using_openai_llm = type(cat.llm) in [OpenAI, OpenAIChat, AzureOpenAI]
 if ("OPENAI_KEY" in os.environ) or using_openai_llm:
```
let's totally ignore environment variables ;)
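A sketch of what dropping the environment-variable branch could look like (illustrative only; names follow the diff above):

```python
# Decide purely from the configured LLM type, ignoring OPENAI_KEY
# in the environment.
using_openai_llm = type(cat.llm) in [OpenAI, OpenAIChat, AzureOpenAI]
if using_openai_llm:
    ...
```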
```python
        # model_name: '....'  # TODO: allow optional kwargs
    }
)
if using_openai_llm in [OpenAI, OpenAIChat]:
```
`using_openai_llm` is a boolean, not a type! Because `using_openai_llm` is a boolean, the `if` condition was never satisfied.
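To make the bug concrete, a minimal sketch (variable names from the diff above; the fix shown is one illustrative option, not necessarily the one merged):

```python
# Bug: `using_openai_llm` is a bool, so membership in a list of classes
# is always False and the branch below never runs.
using_openai_llm = type(cat.llm) in [OpenAI, OpenAIChat, AzureOpenAI]

if using_openai_llm in [OpenAI, OpenAIChat]:  # always False: bool vs. types
    ...

# One possible fix: compare the LLM's type directly.
if type(cat.llm) in [OpenAI, OpenAIChat]:
    ...
```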
Related issue: langchain-ai/langchain#3992
Fixes #128