Why doesn't the sample plugin (LightPlugin) return the value as it shows in the example? #6377
-
Hi everyone, I am pretty new to Semantic Kernel and am trying to run a small example (LightPlugin) from this page: https://learn.microsoft.com/en-us/semantic-kernel/overview/?tabs=Csharp. I am stuck on the second answer from the AI: in the example it answers "[Light is now on]", but mine does not. The only difference from the example is my service registration; in the example it is "var builder = Kernel.CreateBuilder().AddAzureOpenAIChatCompletion(modelId, endpoint, apiKey);". Does it have something to do with my Azure OpenAI instance/endpoint? Could you please explain what may be going on? Thank you for your help in advance!
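For context, the LightPlugin from that Learn page looks roughly like the following. This is a sketch reconstructed from the tutorial, so attribute placement and member names may differ slightly from the current version of the page:

```csharp
using System;
using System.ComponentModel;
using Microsoft.SemanticKernel;

public class LightPlugin
{
    public bool IsOn { get; set; } = false;

    [KernelFunction]
    [Description("Gets the state of the light.")]
    public string GetState() => IsOn ? "on" : "off";

    [KernelFunction]
    [Description("Changes the state of the light.")]
    public string ChangeState(bool newState)
    {
        IsOn = newState;
        var state = GetState();
        // This is the line that produces "[Light is now on]" in the sample output,
        // so it only appears if the model actually calls this function.
        Console.WriteLine($"[Light is now {state}]");
        return state;
    }
}
```

The `[KernelFunction]` attributes are what expose these methods to the model as callable tools once the plugin is registered with the kernel.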
Replies: 3 comments 2 replies
-
Thanks @dmytrostruk! That should be the reason.
-
Hello everyone, I have the same problem. Instead of the expected output, I get this response: "I'm afraid I can't do that as I'm just a text-based AI language model. However, if you have a smart lighting system or a smart bulb connected to a smart hub, you can use its compatible app or voice assistant to turn on the light." I am using the model gpt-35-turbo (0301). Can anyone help me? Thank you for your help in advance!
-
Hi @bryantlin !
I don't think the problem is the difference in the Azure OpenAI chat completion service registration. It looks like the model is not aware of your plugin for changing the light state. If you are using exactly the same code as on the Microsoft Learn page, could you please share the model id you are using in your testing? As far as I know, not all models support function calling.
Here is a list of available models, their features and available regions:
https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/models
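For the plugin to be called at all, it has to be registered with the kernel and automatic tool invocation has to be enabled on the request; otherwise the model can only answer with plain text, which matches the responses described above. Here is a minimal sketch of that wiring, assuming a Semantic Kernel 1.x style C# API and a deployment of a model that supports function calling; the `modelId`, `endpoint`, and `apiKey` values are placeholders for your own Azure OpenAI resource:

```csharp
using System;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// Placeholders: point these at your own Azure OpenAI deployment.
var modelId = "<deployment-name>";
var endpoint = "https://<your-resource>.openai.azure.com/";
var apiKey = "<api-key>";

var builder = Kernel.CreateBuilder();
builder.AddAzureOpenAIChatCompletion(modelId, endpoint, apiKey);

// Without this registration the model never learns about the
// LightPlugin functions, so it falls back to a text-only answer.
builder.Plugins.AddFromType<LightPlugin>();

var kernel = builder.Build();

// Opt in to automatic function calling. Models that predate tool
// calls, such as gpt-35-turbo (0301), cannot honor this setting.
OpenAIPromptExecutionSettings settings = new()
{
    ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
};

var result = await kernel.InvokePromptAsync(
    "Turn on the light", new(settings));
Console.WriteLine(result);
```

With a model version that supports function calling (if I remember correctly, that arrived with the 0613 versions of gpt-35-turbo), the kernel should invoke ChangeState automatically and the "[Light is now on]" line should appear before the model's final answer.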