The OpenAI/AzureOpenAI `TextGeneration` classes use the legacy Completions API (see OpenAI's deprecation announcement and the Azure model compatibility notes).

This poses a challenge when working with Azure OpenAI Service in particular: you are required to deploy a `gpt-35-turbo` model of version `0301`, which is deprecated, and due to quota limits on standard accounts you are unlikely to be able to deploy that alongside the embeddings model and a `0613` deployment of `gpt-35-turbo` for the application to use.

Moving Semantic Memory off the Completions API to ChatCompletions would unblock usage in AOAI applications and ensure that applications aren't caught by the upcoming deprecations.
Ah, I just realised that I need to be using `AzureOpenAIConfig.APITypes.ChatCompletion`, not `AzureOpenAIConfig.APITypes.TextCompletion`, to go through the right code path and hit the new ChatCompletions endpoint, so this is probably not valid.
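For anyone hitting the same thing, the configuration difference looks roughly like this. This is a hedged sketch: the endpoint, key, and deployment values are placeholders, and property names such as `Auth` and `Deployment` reflect my reading of `AzureOpenAIConfig` and may differ slightly between Semantic Memory versions.

```csharp
// Sketch: point Semantic Memory at the ChatCompletions endpoint instead of
// the legacy Completions API by selecting the ChatCompletion API type.
var textConfig = new AzureOpenAIConfig
{
    // This is the key setting: TextCompletion routes to the legacy
    // Completions API; ChatCompletion routes to the newer endpoint.
    APIType = AzureOpenAIConfig.APITypes.ChatCompletion,
    Auth = AzureOpenAIConfig.AuthTypes.APIKey,
    APIKey = "<your-azure-openai-key>",          // placeholder
    Endpoint = "https://<resource>.openai.azure.com/", // placeholder
    Deployment = "gpt-35-turbo",                 // a 0613+ chat deployment
};
```

With `ChatCompletion` selected, a `0613` (or later) `gpt-35-turbo` deployment works, so the deprecated `0301` model is no longer required.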