
.Net: [Agents] Support FunctionCallContent #5153

Open
SergeyMenshykh opened this issue Feb 26, 2024 · 2 comments
Assignees
Labels
ai connector · kernel · kernel.core · .NET · sk team issue

Comments

@SergeyMenshykh
Member

SergeyMenshykh commented Feb 26, 2024

There are scenarios that require function calls across multiple models and agents. Today, our function calls are tied closely to the OpenAI connector, so it's difficult to reuse the same code for the Gemini connector or to allow agents to pass function calls to one another (the AutoGen scenario).

This issue captures the work necessary to make the current implementation generic. This should include introducing a new 'FunctionCallContent' type that includes (at a minimum) a reference to the 'KernelFunction' to be called and the arguments to call it with. This content type should then appear in the items collection of 'ChatMessageContent'.

As part of this change, all existing function calling behavior should still "work".
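One possible shape for such a type is sketched below. This is a minimal illustration only; the class and property names here are assumptions for discussion, not the final SK API.

```csharp
using System.Collections.Generic;

// Hypothetical sketch of a service-agnostic content model; names are
// assumptions, not the final Semantic Kernel API.
public abstract class KernelContent { }

public sealed class FunctionCallContent : KernelContent
{
    public string? Id { get; }            // connector-assigned tool-call id, if any
    public string? PluginName { get; }
    public string FunctionName { get; }
    public IDictionary<string, object?>? Arguments { get; }

    public FunctionCallContent(string functionName, string? pluginName = null,
        string? id = null, IDictionary<string, object?>? arguments = null)
    {
        FunctionName = functionName;
        PluginName = pluginName;
        Id = id;
        Arguments = arguments;
    }
}

public sealed class FunctionResultContent : KernelContent
{
    public string? CallId { get; }        // correlates the result to its call
    public object? Result { get; }

    public FunctionResultContent(string? callId, object? result)
    {
        CallId = callId;
        Result = result;
    }
}

// A chat message holds a collection of content items, so function calls and
// results can travel alongside text without any connector-specific types.
public sealed class ChatMessageContent
{
    public string Role { get; }
    public List<KernelContent> Items { get; } = new();
    public ChatMessageContent(string role) => Role = role;
}
```

With this shape, a connector would translate its wire-level tool calls into `FunctionCallContent` items on the message, and consumer code could inspect them without referencing the OpenAI package.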

Scenarios

As part of this issue, the following scenarios should be supported:

  • Core abstraction support – A developer should be able to use function calls and tool results in ChatHistory and ChatMessageContent objects without relying on the OpenAI package.
  • Serialization/Deserialization – A developer should be able to serialize a ChatHistory object with function calls and tool results so they can rehydrate it in the future (and potentially run the ChatHistory with a different AI model).
  • Works with any model – Any AI service (that supports tools) can consume tool calls and tool results without service-specific classes. This includes Gemini, Mistral, Anthropic, and Llama.
  • Passing function calls – In multi-agent scenarios, one agent can create a tool call for another agent to complete.
  • Simulated function calls – A function call can be simulated by a developer; in other words, a developer should be able to add a ChatMessageContent with a tool call they created to a ChatHistory object and then run it with any AI model (this may require simulating tool call IDs in the case of OpenAI). This is helpful when a developer wants to templatize context and system prompts alone aren't working.
  • YAML support for function calls – A function call and tool result can be written in a YAML prompt template using the intermediate prompt template language. This will allow a developer to implement few-shot examples for function calling within prompts. A new tag (e.g., <FunctionCall>) is likely necessary to achieve this.
  • YAML support for function results – A function result can be written in a YAML prompt template as a tool message.
  • YAML support for defining available functions – As a prompt engineer, you should be able to state which functions are available for the LLM to call.
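The serialization/deserialization scenario above can be exercised today with plain System.Text.Json and a hypothetical wire record. The record shape below is an assumption for illustration; defining SK's actual persistence format is part of the work this issue tracks.

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

// Hypothetical wire shape for a persisted function call or result; the actual
// Semantic Kernel serialization format was to be decided under this issue.
public sealed record FunctionCallRecord(
    string Type,          // "functionCall" or "functionResult"
    string? Id,           // tool-call id, if the service assigned one
    string? PluginName,
    string FunctionName,
    Dictionary<string, object?>? Arguments,
    string? Result);

public static class ChatHistoryRoundTrip
{
    public static void Demo()
    {
        var history = new List<FunctionCallRecord>
        {
            new("functionCall", "call_1", "weather", "GetWeather",
                new() { ["city"] = "Seattle" }, null),
            new("functionResult", "call_1", "weather", "GetWeather", null, "72F and sunny"),
        };

        // Persist the history...
        string json = JsonSerializer.Serialize(history);

        // ...and rehydrate it later, possibly to replay against a different model.
        var restored = JsonSerializer.Deserialize<List<FunctionCallRecord>>(json)!;
        Console.WriteLine(restored[0].FunctionName); // GetWeather
    }
}
```

Because nothing in the record refers to a particular connector, the same rehydrated history could, in principle, be replayed against any tool-capable service.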

If possible, we should try to align the syntax with .promptfile. They use <functions> and <function> tags.
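A few-shot function-calling example in a prompt template might then look like the following. The tag names here are purely illustrative; the final syntax was to be decided, ideally aligned with .promptfile's tags as noted above.

```xml
<message role="assistant">
  <FunctionCall plugin="weather" function="GetWeather" id="call_1">
    {"city": "Seattle"}
  </FunctionCall>
</message>
<message role="tool" id="call_1">72F and sunny</message>
```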

The YAML scenarios/requirements have been moved to issue #5632 to be prioritized separately.

@SergeyMenshykh added the .NET and agents labels Feb 26, 2024
@matthewbolanos changed the title from ".Net: [Agents] Support FunctionContent" to ".Net: [Agents] Support FunctionCallContent" Mar 12, 2024
@evchaki added the sk team issue label and removed the triage label Mar 12, 2024
@crickman added the kernel, ai connector, and kernel.core labels and removed the agents label Apr 1, 2024
@crickman
Contributor

crickman commented Apr 1, 2024

Not agent specific... as described, this has to do with the core connector framework / abstractions.

@SergeyMenshykh
Member Author

@matthewbolanos, FYI, the AzureOpenAIChatCompletionWithDataService does not use the current SK function-calling model the way the {Azure}OpenAIChatCompletionService does. Therefore, it was suggested not to support function call content for it now and to consider it later when/if needed.

github-merge-queue bot pushed a commit that referenced this issue Apr 17, 2024
Today, in SK, LLM function calling is supported exclusively by the
OpenAI connector, and the function calling model is specific to that
connector. The new AI connectors being added to SK, which support
function calling, introduce their specific models for function calling.
The design, where each new connector introduces its own specific model
class for function calling, does not scale well from the connector
development perspective and does not allow for polymorphic use of
connectors by the SK consumer code.

This ADR describes the high-level details of the service-agnostic
function-calling model classes, while leaving the low-level details to
the implementation phase. Additionally, this ADR outlines the identified
options for various aspects of the design.

Requirements - #5153

### Description
ADR PR:  #5696

### Contribution Checklist

<!-- Before submitting this PR, please make sure: -->

- [x] The code builds clean without any errors or warnings
- [x] The PR follows the [SK Contribution
Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md)
and the [pre-submission formatting
script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts)
raises no violations
- [x] All unit tests pass, and I have added new tests where possible
- [x] I didn't break anyone 😄

---------

Co-authored-by: Stephen Toub <stoub@microsoft.com>
Co-authored-by: Chris <66376200+crickman@users.noreply.github.com>
Co-authored-by: Dmytro Struk <13853051+dmytrostruk@users.noreply.github.com>
github-merge-queue bot pushed a commit that referenced this issue Apr 19, 2024
Co-authored-by: Eduard van Valkenburg <eavanvalkenburg@users.noreply.github.com>
Projects
Status: Sprint: Done
4 participants