
Python: Bug: Python - Unable to call plugin during process step #12067

Closed
@Athosone

Description


Describe the bug
I am writing a process in Python.

When calling the AI service with function calling enabled, I get the following error:

[2025-05-14 23:40:34 - semantic_kernel.kernel:206 - ERROR] Something went wrong in function invocation. During function invocation: 'GenerateDocumentationStep-GENERATE_DOCUMENTATION'. Error description: '("<class 'semantic_kernel.connectors.ai.open_ai.services.azure_chat_completion.AzureChatCompletion'> service failed to complete the prompt", TypeError('Object of type _PydanticGeneralMetadata is not JSON serializable'))'

To Reproduce
Steps to reproduce the behavior:

Create a process like this:

main.py:

async def main():
    async with (
        MCPStdioPlugin(
            name="filesystem",
            description="Plugin to access and interact with the filesystem",
            command="npx",
            args=[
                "-y",
                "@modelcontextprotocol/server-filesystem",
                "/home/athosone/git/",
            ],
        ) as filesystem_plugin,
    ):
        model = load_chat_model()
        process = create_process()
        kernel = Kernel()
        kernel.add_service(model)
        kernel.add_plugin(filesystem_plugin)
        setup_logging()
        process_context = await start(
            process=process,
            kernel=kernel,
            initial_event=KernelProcessEvent(
                id="Start", data="/blabla"
            ),
        )
        _ = await process_context.get_state()
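
load_chat_model() and setup_logging() are omitted above; roughly they look like this (a sketch, assuming the Azure OpenAI endpoint, deployment name, and API key come from the standard AZURE_OPENAI_* environment variables or a .env file):

import logging

from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion


def load_chat_model() -> AzureChatCompletion:
    # Assumption: endpoint, deployment name, and key are picked up from the environment.
    return AzureChatCompletion(service_id="default")


def setup_logging() -> None:
    # Plain stdlib logging; produces the timestamped log lines shown below.
    logging.basicConfig(level=logging.INFO)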

process:

def create_process() -> KernelProcess:
    process_builder = ProcessBuilder(name="DocumentationGeneration")

    info_gathering_step = process_builder.add_step(GatherUsecasesStep)
    docs_generation_step = process_builder.add_step(GenerateDocumentationStep)
    docs_publish_step = process_builder.add_step(PublishDocumentationStep)

    process_builder.on_input_event("Start").send_event_to(target=info_gathering_step)

    info_gathering_step.on_function_result(
        function_name=GatherUsecasesStep.Functions.GATHER_USECASES.name
    ).send_event_to(
        target=docs_generation_step,
        function_name=GenerateDocumentationStep.Functions.GENERATE_DOCUMENTATION.name,
        parameter_name="usecases",
    )

    docs_generation_step.on_event(
        GenerateDocumentationStep.OutputEvents.DOCUMENTATION_GENERATED
    ).send_event_to(target=docs_publish_step)

    kernel_process = process_builder.build()
    return kernel_process
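
The step classes themselves are not shown in full. For reference, a step is declared roughly like this (a sketch of PublishDocumentationStep; only the names come from the snippet above, the body and import paths are assumed):

from enum import Enum

from semantic_kernel.functions import kernel_function
from semantic_kernel.processes.kernel_process import KernelProcessStep, KernelProcessStepContext


class PublishDocumentationStep(KernelProcessStep):
    class Functions(Enum):
        PUBLISH_DOCUMENTATION = "PUBLISH_DOCUMENTATION"

    @kernel_function(name=Functions.PUBLISH_DOCUMENTATION.name)
    async def publish_documentation(
        self, context: KernelProcessStepContext, docs: str
    ) -> None:
        # Placeholder body; the real step publishes the generated documentation.
        print(docs)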

The step that fails:

    @kernel_function(name=Functions.GENERATE_DOCUMENTATION.name)
    async def generate_documentation(
        self,
        context: KernelProcessStepContext,
        kernel: Kernel,
        usecases: dict[str, str],
    ) -> None:
        print(
            f"{GenerateDocumentationStep.__name__}\n\t Generating documentation for provided usecases"
        )

        completion_client, settings = kernel.select_ai_service(type=AzureChatCompletion)

        assert isinstance(completion_client, ChatCompletionClientBase)  # nosec
        assert isinstance(settings, AzureChatPromptExecutionSettings)

        settings.function_choice_behavior = FunctionChoiceBehavior.Auto()  # If I remove this line, I can successfully call the LLM (but without the functions...)

        usecase_chat = ChatHistory(system_message=self.system_prompt)
        usecase_chat.add_user_message(
            """
            Hello
            """
        )
        r = await completion_client.get_chat_message_content(
            chat_history=usecase_chat,
            settings=settings,
            kernel=kernel,
        )
        print(r)

Expected behavior
I should be able to call the completion client with function calling enabled.

Platform

  • Language: Python 3.12.8
  • AI model: AzureOpenAI gpt-4o
  • OS: Linux (WSL2)

Additional Context

If I make the same call from main, function calling works.
It only seems to fail in the context of a process step.
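
For comparison, the direct call from main looks roughly like this (illustrative sketch; here the settings are built by hand instead of coming from kernel.select_ai_service):

from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.function_choice_behavior import FunctionChoiceBehavior
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion, AzureChatPromptExecutionSettings
from semantic_kernel.contents import ChatHistory


async def call_from_main(kernel: Kernel, model: AzureChatCompletion):
    settings = AzureChatPromptExecutionSettings()
    settings.function_choice_behavior = FunctionChoiceBehavior.Auto()

    chat = ChatHistory()
    chat.add_user_message("Hello")

    # Same call as in the failing step, but invoked directly from main():
    # this completes and the MCP filesystem functions are available to the model.
    return await model.get_chat_message_content(
        chat_history=chat,
        settings=settings,
        kernel=kernel,
    )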

Complete logs:

Secure MCP Filesystem Server running on stdio
Allowed directories: [ '/home/athosone/git/' ]
Gathering usecases in /home/athosone/git/...
GenerateDocumentationStep
	 Generating documentation for provided usecases
[2025-05-14 23:40:34 - semantic_kernel.functions.kernel_function:48 - ERROR] Function failed. Error: ("<class 'semantic_kernel.connectors.ai.open_ai.services.azure_chat_completion.AzureChatCompletion'> service failed to complete the prompt", TypeError('Object of type _PydanticGeneralMetadata is not JSON serializable'))
[2025-05-14 23:40:34 - semantic_kernel.kernel:206 - ERROR] Something went wrong in function invocation. During function invocation: 'GenerateDocumentationStep-GENERATE_DOCUMENTATION'. Error description: '("<class 'semantic_kernel.connectors.ai.open_ai.services.azure_chat_completion.AzureChatCompletion'> service failed to complete the prompt", TypeError('Object of type _PydanticGeneralMetadata is not JSON serializable'))'
[2025-05-14 23:40:34 - semantic_kernel.processes.local_runtime.local_step:166 - ERROR] Error in Step GenerateDocumentationStep: Error occurred while invoking function: 'GenerateDocumentationStep-GENERATE_DOCUMENTATION'

Metadata

Labels: bug (Something isn't working), processes, python (Pull requests for the Python Semantic Kernel)
