.Net: [Agents] Support FunctionCallContent #5153
Labels: `ai connector` (anything related to AI connectors), `kernel.core`, `kernel` (issues or pull requests impacting the core kernel), `.NET` (issues or pull requests regarding .NET code), `sk team issue` (issues created by the Semantic Kernel team, i.e., not the community)
Comments
matthewbolanos changed the title from ".Net: [Agents] Support FunctionContent" to ".Net: [Agents] Support FunctionCallContent" on Mar 12, 2024
evchaki added the `sk team issue` label (issues created by the Semantic Kernel team, i.e., not the community) and removed the `triage` label on Mar 12, 2024
crickman added the `kernel` (issues or pull requests impacting the core kernel), `ai connector` (anything related to AI connectors), and `kernel.core` labels and removed the `agents` label on Apr 1, 2024
Not agent specific... as described, this has to do with the core connector framework / abstractions.
@matthewbolanos, FYI, the
github-merge-queue bot pushed a commit that referenced this issue on Apr 17, 2024:
Today, in SK, LLM function calling is supported exclusively by the OpenAI connector, and the function-calling model is specific to that connector. The new AI connectors being added to SK that support function calling introduce their own specific models for function calling. This design, where each new connector introduces its own model class for function calling, does not scale well from the connector-development perspective and does not allow for polymorphic use of connectors by SK consumer code. This ADR describes the high-level details of the service-agnostic function-calling model classes, while leaving the low-level details to the implementation phase. Additionally, this ADR outlines the identified options for various aspects of the design.

Requirements - #5153

### Description
ADR PR: #5696

### Contribution Checklist
- [x] The code builds clean without any errors or warnings
- [x] The PR follows the [SK Contribution Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md) and the [pre-submission formatting script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts) raises no violations
- [x] All unit tests pass, and I have added new tests where possible
- [x] I didn't break anyone 😄

Co-authored-by: Stephen Toub <stoub@microsoft.com>
Co-authored-by: Chris <66376200+crickman@users.noreply.github.com>
Co-authored-by: Dmytro Struk <13853051+dmytrostruk@users.noreply.github.com>
github-merge-queue bot pushed a commit that referenced this issue on Apr 19, 2024:
Today, in SK, LLM function calling is supported exclusively by the OpenAI connector, and the function-calling model is specific to that connector. The new AI connectors being added to SK that support function calling introduce their own specific models for function calling. This design, where each new connector introduces its own model class for function calling, does not scale well from the connector-development perspective and does not allow for polymorphic use of connectors by SK consumer code. This ADR describes the high-level details of the service-agnostic function-calling model classes, while leaving the low-level details to the implementation phase. Additionally, this ADR outlines the identified options for various aspects of the design.

Requirements - #5153

Co-authored-by: Eduard van Valkenburg <eavanvalkenburg@users.noreply.github.com>
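In practice, the service-agnostic model these commits describe enables consumer code along the lines of the following sketch, which inspects a chat message for function-call items and invokes them without touching any OpenAI-specific types. This is illustrative only: member names such as `InvokeAsync` and `ToChatMessage` reflect the direction of the ADR and are assumptions here, not a guaranteed API.

```csharp
// Sketch only -- assumes service-agnostic FunctionCallContent / FunctionResultContent
// types per the ADR; not guaranteed to match the shipped API.
using System.Linq;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

// Ask any function-calling-capable connector for a completion.
ChatMessageContent response = await chatService.GetChatMessageContentAsync(
    chatHistory, settings, kernel);
chatHistory.Add(response);

// Handle function calls polymorphically -- no connector-specific types involved.
foreach (FunctionCallContent call in response.Items.OfType<FunctionCallContent>())
{
    FunctionResultContent result = await call.InvokeAsync(kernel);
    chatHistory.Add(result.ToChatMessage());
}
```

Because the loop only depends on the abstract content types, the same handling code would work against any connector that emits `FunctionCallContent`, which is exactly the polymorphic use the ADR calls out.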
There are scenarios that require function calls across multiple models and agents. Today, our function calls are tied closely to the OpenAI connector, so it's difficult to reuse the same code for the Gemini connector and to allow agents to pass function calls to one another (the AutoGen scenario).

This issue is to capture the work necessary to make the current implementation generic. This should include introducing a new `FunctionCallContent` type that includes (at a minimum) a reference to the `KernelFunction` to be called and the arguments to call it with. This content type should then appear in the item collection of `ChatMessageContent`.

As part of this change, all existing function calling behavior should still "work".
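The shape such a type could take is sketched below. Property names like `PluginName`, `Id`, and `Arguments` are assumptions extrapolated from the issue's wording (a reference to the function plus its arguments, carried in the `ChatMessageContent` item collection), not the final Semantic Kernel API.

```csharp
// Illustrative sketch only -- not the final Semantic Kernel API.
// A service-agnostic content item representing an LLM's request to call a function.
public sealed class FunctionCallContent : KernelContent
{
    public FunctionCallContent(string functionName, string? pluginName = null,
        string? id = null, KernelArguments? arguments = null)
    {
        this.FunctionName = functionName;
        this.PluginName = pluginName;
        this.Id = id;
        this.Arguments = arguments;
    }

    // Connector-assigned tool-call id (e.g., OpenAI's tool_call_id), if any.
    public string? Id { get; }
    public string? PluginName { get; }
    public string FunctionName { get; }
    public KernelArguments? Arguments { get; }
}

// The item then travels in the ChatMessageContent item collection:
// chatMessage.Items.Add(new FunctionCallContent("GetWeather", "WeatherPlugin",
//     "call_1", new KernelArguments { ["city"] = "Dublin" }));
```

Keeping the type free of any connector-specific fields is what lets the same chat history round-trip through different AI connectors.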
Scenarios

As part of this issue, the following scenarios should be supported:

- Developers can create `ChatHistory` and `ChatMessageContent` objects without relying on the OpenAI package.
- Developers can serialize a `ChatHistory` object with function calls and tool results so they can rehydrate it in the future (and potentially run the `ChatHistory` with a different AI model).
- Developers can add a `ChatMessageContent` with a tool call they created to a `ChatHistory` object and then run it with any AI model (this may require simulating tool call IDs in the case of OpenAI). This is helpful when a developer wants to templatize context and system prompts aren't working.
- YAML support for function calls – a function call and tool result can be written in a YAML prompt template using the intermediate prompt template language. This will allow a developer to implement few-shot examples for function calling within prompts. A new tag (e.g., `<FunctionCall>`) is likely necessary to achieve this.
- YAML support for function results – a function result can be written in a YAML prompt template as a tool message.
- YAML support for defining available functions – as a prompt engineer, you should be able to state which functions are available for the LLM to call. If possible, we should try to align the syntax with `.prompt` files, which use `<functions>` and `<function>` tags.

The YAML scenarios/requirements have been moved to issue #5632 to be prioritized separately.