
Configuration for Cognitive Services Azure OpenAI #137

Merged: 3 commits, May 2, 2023

Conversation

zioproto
Contributor

@zioproto zioproto commented Apr 18, 2023

Fixes #128

@zioproto
Contributor Author

Currently blocked by langchain-ai/langchain#1560

@zioproto zioproto force-pushed the feature/azure-openai-config branch from 69bb8f7 to 12c2f71 on April 19, 2023, 08:55
@@ -38,6 +38,22 @@ class Config:
"description": "Configuration for OpenAI embeddings",
}

class EmbedderAzureOpenAIConfig(EmbedderSettings):
    openai_api_key: str
    model_name: str
Contributor Author


@soferreira in the future we will not use langchain.embeddings.FakeEmbeddings but the correct embeddings. Could you check this list? I am not sure it is correct to have both model_name and deployment_name. Thanks
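As a hedged sketch of the open question above: an Azure embedder config might carry both fields plus the Azure-specific connection parameters discussed later in this thread. A plain dataclass stands in for the project's pydantic-based EmbedderSettings, and every field beyond openai_api_key and model_name is an assumption, not the merged API.

```python
from dataclasses import dataclass

# Hypothetical sketch only: field names beyond openai_api_key/model_name
# are assumptions drawn from the Azure parameters discussed in this PR.
@dataclass
class EmbedderAzureOpenAIConfig:
    openai_api_key: str
    model_name: str        # underlying model, e.g. "text-embedding-ada-002"
    deployment_name: str   # Azure deployment name; possibly redundant with model_name
    api_base: str          # e.g. "https://<resource>.openai.azure.com"
    api_type: str = "azure"
    api_version: str = "2022-12-01"

cfg = EmbedderAzureOpenAIConfig(
    openai_api_key="sk-...",
    model_name="text-embedding-ada-002",
    deployment_name="my-embeddings",
    api_base="https://example.openai.azure.com",
)
print(cfg.api_type)  # azure
```

On Azure, the deployment name (chosen by the user) and the model name (fixed by the service) genuinely differ, which is why keeping both may be justified.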

core/cat/factory/llm.py (review thread resolved)
# model_name: '....' # TODO: allow optional kwargs
}
)
if using_openai_llm in [OpenAI, OpenAIChat]:
Contributor Author


@pieroit should I reverse the logic here? Check for AzureOpenAI and use the EmbedderAzureOpenAIConfig embedder, and then have a catch-all else?

Member


@zioproto @soferreira I made some refactoring to avoid confusion between default cat behaviour and plugin extensions.

If you look into cat/mad_hatter/core_plugin you will find some hooks that define how the cat loads LLM and embedder.

You can override the get_language_model and get_language_embedder hooks to always have the cat loading a specific LLM/embedder. In cat/plugins/myplugin/myfile.py:

@hook
def get_language_model(cat):

    # look up the config here in the DB, or skip it entirely

    config = {
        "openai_api_key": os.environ["SOME_ENV_VAR"],
        "api_base": "something",
        "api_type": "azure",
        "api_version": "2022-12-01",
    }

    return LLMAzureOpenAIConfig.get_llm_from_config(config)

From now on your hook will be executed and not the default one.
Hope this is intuitive enough, feedback is welcome
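By analogy with the LLM hook above, the embedder side could presumably be overridden the same way. A self-contained sketch follows; the @hook decorator and the config class are stand-ins so the example runs on its own (the real ones live under cat/mad_hatter and cat/factory), and the get_embedder_from_config name and field set are assumptions, not the confirmed API.

```python
import os

def hook(fn):  # stand-in for the cat's @hook decorator
    return fn

class EmbedderAzureOpenAIConfig:  # stand-in for the factory config class
    @classmethod
    def get_embedder_from_config(cls, config):
        # the real method would build a LangChain embedder from the config
        return config

@hook
def get_language_embedder(cat):
    config = {
        # AZURE_OPENAI_KEY is a hypothetical env var name for this sketch
        "openai_api_key": os.environ.get("AZURE_OPENAI_KEY", "sk-test"),
        "model_name": "text-embedding-ada-002",
    }
    return EmbedderAzureOpenAIConfig.get_embedder_from_config(config)

print(get_language_embedder(None)["model_name"])  # text-embedding-ada-002
```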

@zioproto zioproto force-pushed the feature/azure-openai-config branch 2 times, most recently from 8fcf0a1 to e00a8fb on April 27, 2023, 14:38
@zioproto zioproto changed the title from "WiP: Configuration for Cognitive Services Azure OpenAI" to "Configuration for Cognitive Services Azure OpenAI" on Apr 27, 2023
@zioproto
Contributor Author

@pieroit LGTM, feel free to merge

@zioproto
Contributor Author

zioproto commented May 2, 2023

@pieroit I pushed a new commit to avoid using the Fake Embeddings. I had to upgrade LangChain to 0.0.155 to include this change: langchain-ai/langchain#3711

Commit messages:
- Use the fake embedder until the fix for langchain-ai/langchain#1560 lands.
- Do not use the Fake Embeddings anymore.
- Require LangChain 0.0.155 because it contains langchain-ai/langchain#3711.
@zioproto zioproto force-pushed the feature/azure-openai-config branch from 4ef1a79 to b9c660b on May 2, 2023, 09:12

  # Embedding LLM
- using_openai_llm = type(cat.llm) in [OpenAI, OpenAIChat]
+ using_openai_llm = type(cat.llm) in [OpenAI, OpenAIChat, AzureOpenAI]
  if ("OPENAI_KEY" in os.environ) or using_openai_llm:
Member


let's totally ignore environment variables ;)

# model_name: '....' # TODO: allow optional kwargs
}
)
if using_openai_llm in [OpenAI, OpenAIChat]:
Member


using_openai_llm is a boolean, not a type!

Because using_openai_llm is a Boolean, the if condition was never satisfied.
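The boolean-vs-type bug called out above can be reduced to a minimal sketch; the three empty classes below are stand-ins for the LangChain LLM classes.

```python
# Minimal reproduction of the bug fixed in this PR's last commit.
class OpenAI: ...
class OpenAIChat: ...
class AzureOpenAI: ...

llm = AzureOpenAI()

# This membership test yields a bool, not a type.
using_openai_llm = type(llm) in [OpenAI, OpenAIChat, AzureOpenAI]  # True

# Buggy check: a bool is never equal to a class, so this membership test
# is always False and the branch was dead code for every LLM.
assert (using_openai_llm in [OpenAI, OpenAIChat]) is False

# Fixed check: branch on the boolean itself (or on type(llm) directly).
if using_openai_llm:
    pass  # configure the OpenAI-family embedder here
```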
@pieroit pieroit merged commit 590c300 into cheshire-cat-ai:main May 2, 2023
@aakashrshah

Related issue langchain-ai/langchain#3992


Successfully merging this pull request may close these issues.

Support for Azure Open AI Language Model provider
4 participants