Conversation

Contributor

@ilvalerione commented Sep 8, 2025

New OpenAI Responses API & Support for Provider Tools

This PR was inspired by #94 and the experimental repository by Andrew Monty.

What does this PR do?

This PR introduces the new OpenAIResponses provider, which uses the OpenAI Responses API under the hood instead of the classic completions API. It supports tool calls, streaming, and structured output.

Since it implements the Neuron AIProviderInterface, nothing changes from an external point of view compared to the current OpenAI provider.

But since the OpenAI Responses API supports the provider's native tools, like web_search and others, I was able to introduce the ProviderTool class to let you use provider tools with OpenAI, Anthropic, and Gemini.

use NeuronAI\Tools\ProviderTool;

class MyAgent extends Agent
{
    public function provider(): AIProviderInterface
    {
        return new OpenAIResponses(
            key: 'OPENAI_API_KEY',
            model: 'OPENAI_MODEL',
        );
    }

    public function tools(): array
    {
        return [
            ProviderTool::make(
                type: 'web_search'
            )->setOptions([...]),

            // Add other tools and toolkits as usual...
        ];
    }
}

You can add provider tools to the tools available to the agent, but if the model decides to call one of these tools, the call happens on the provider side, so you just wait for the final response.

Not all providers support this pattern. For now it's just OpenAI, Anthropic, and Gemini, each with its own limitations. For example, Gemini doesn't support the declaration of tools and provider tools at the same time, so if you want to use Google Search you can't declare other tools. That's why classic tool composition, like Tavily, is still the most flexible solution.

Why is this change needed?

OpenAI has marked the completions API as deprecated and is now promoting the transition to the Responses API in order to take advantage of the latest features: https://platform.openai.com/docs/guides/migrate-to-responses

Related Issues

#285
#94

Breaking Changes

No breaking changes in public APIs. Other providers that depend on the OpenAI completions API format (HuggingFace, Mistral, Grok, Deepseek) continue to extend the OpenAI provider class, so they keep working with their current compatibility.

Documentation

You can use the new provider as any other provider. It supports chat, stream, and structured output.

use NeuronAI\Providers\OpenAI\OpenAIResponses;

class MyAgent extends Agent
{
    public function provider(): AIProviderInterface
    {
        return new OpenAIResponses(
            key: 'OPENAI_API_KEY',
            model: 'OPENAI_MODEL',
        );
    }

    ...
}
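As a rough sketch of what "chat, stream, and structured output" looks like in practice (method names like `MyAgent::make()`, `chat()`, `stream()`, `structured()`, and the `UserMessage` class follow the existing NeuronAI agent API; treat the details as assumptions, not part of this PR):

```php
<?php

use NeuronAI\Chat\Messages\UserMessage;

// The agent built on OpenAIResponses is a drop-in swap for the
// classic OpenAI provider, so the usual entry points apply.
$agent = MyAgent::make();

// Plain chat request.
$response = $agent->chat(new UserMessage('Hello!'));
echo $response->getContent();

// Streaming: consume the response chunk by chunk as it arrives.
foreach ($agent->stream(new UserMessage('Tell me a story')) as $chunk) {
    echo $chunk;
}

// Structured output: the response is mapped to the agent's output class.
$person = $agent->structured(new UserMessage('Extract the person data'));
```

The point is that switching the `provider()` method to OpenAIResponses requires no change anywhere else in the agent.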

ProviderTool can be added as usual in the tools method:

class MyAgent extends Agent
{
    ...

    public function tools(): array
    {
        return [
            ProviderTool::make(
                type: 'web_search'
            )->setOptions([...]),
        ];
    }
}
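Since provider tools and classic tools share the same array (on providers that allow mixing them), a combined `tools()` method might look like the sketch below. The `Tool::make()` and `setCallable()` calls follow NeuronAI's classic tool builder API; the `get_time` tool is a hypothetical example, not part of this PR:

```php
<?php

use NeuronAI\Tools\ProviderTool;
use NeuronAI\Tools\Tool;

class MyAgent extends Agent
{
    public function tools(): array
    {
        return [
            // Executed on the provider side: OpenAI runs the
            // web search itself and returns the final answer.
            ProviderTool::make(type: 'web_search'),

            // Executed locally: your application runs this
            // callable when the model requests the tool.
            Tool::make('get_time', 'Returns the current server time.')
                ->setCallable(fn () => date('c')),
        ];
    }
}
```

Remember the Gemini caveat above: on providers that don't support mixing, declare either provider tools or classic tools, not both.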

@ilvalerione ilvalerione changed the title OpenAI responses api OpenAI responses api & Provider Tools Sep 8, 2025
@ilvalerione ilvalerione changed the title OpenAI responses api & Provider Tools feat:OpenAI responses api & Provider Tools Sep 8, 2025
@lamberttraccard

Nice work! Just to clarify, does this only support OpenAI as the name suggests, or can we use this for Gemini and Anthropic as well? I am a little confused 🤔.

@ilvalerione
Contributor Author

ProviderTool can be used with any provider that supports this pattern, so yes, you can attach a ProviderTool to the agent when using Gemini and Anthropic as well.

@ilvalerione ilvalerione merged commit 79d2d84 into 2.x Sep 12, 2025
9 checks passed
@ilvalerione ilvalerione deleted the openai-responses-api branch September 12, 2025 10:38