minhyung/error-solutions-openai provides a small replacement for the OpenAI
solution classes in spatie/error-solutions.
It keeps Spatie's existing solution provider flow, but lets you use modern OpenAI models and OpenAI API-compatible providers such as OpenRouter, vLLM, or Ollama-compatible servers.
Install the package via Composer:

```bash
composer require minhyung/error-solutions-openai
```

Publish the optional config file:

```bash
php artisan vendor:publish --tag="error-solutions-openai-config"
```

Set your API key and model:
```env
ERROR_SOLUTIONS_OPENAI_KEY=sk-...
ERROR_SOLUTIONS_OPENAI_MODEL=gpt-5.4-mini
```

For an OpenAI API-compatible provider, set a custom base URL:
```env
ERROR_SOLUTIONS_OPENAI_KEY=...
ERROR_SOLUTIONS_OPENAI_BASE_URL=https://openrouter.ai/api/v1
ERROR_SOLUTIONS_OPENAI_MODEL=openai/gpt-5.4-mini
```

Extra provider headers can be configured in `config/error-solutions-openai.php`:
```php
'headers' => [
    'HTTP-Referer' => env('APP_URL'),
    'X-Title' => env('APP_NAME'),
],
```

If your provider expects a token limit parameter other than `max_tokens`, set:
```env
ERROR_SOLUTIONS_OPENAI_TOKEN_LIMIT_PARAMETER=max_completion_tokens
```

Register the provider in Spatie's existing `config/error-solutions.php`:
```php
use Minhyung\ErrorSolutionsOpenAI\OpenAiSolutionProvider;

return [
    'solution_providers' => [
        'php',
        'laravel',
        OpenAiSolutionProvider::class,
    ],
];
```

You can also instantiate it directly:
```php
use Minhyung\ErrorSolutionsOpenAI\OpenAiSolutionProvider;

$provider = new OpenAiSolutionProvider(
    apiKey: env('ERROR_SOLUTIONS_OPENAI_KEY'),
    model: 'gpt-5.4-mini',
);
```

The provider uses the Chat Completions API because it is the broadest common API across OpenAI-compatible model providers.
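As a concrete illustration, a Chat Completions request body for this use case can be assembled as a plain array and POSTed as JSON to `{base_url}/chat/completions`. The sketch below is hypothetical: the helper name `buildChatCompletionsPayload`, the prompt text, and the defaults are assumptions for illustration, not the package's actual implementation. It also shows how a configurable token limit parameter name (see `ERROR_SOLUTIONS_OPENAI_TOKEN_LIMIT_PARAMETER` above) can be swapped in.

```php
<?php
// Hypothetical sketch of a Chat Completions payload like the one this
// provider sends. Helper name, prompt, and defaults are illustrative
// assumptions, not the package's actual code.

function buildChatCompletionsPayload(
    string $model,
    string $errorMessage,
    int $tokenLimit = 1000,
    string $tokenLimitParameter = 'max_tokens' // e.g. 'max_completion_tokens'
): array {
    return [
        'model' => $model,
        'messages' => [
            ['role' => 'system', 'content' => 'Suggest a fix for the following PHP error.'],
            ['role' => 'user', 'content' => $errorMessage],
        ],
        // Some providers reject max_tokens and expect another name instead.
        $tokenLimitParameter => $tokenLimit,
    ];
}

// The payload is sent as JSON with an Authorization: Bearer <key> header.
echo json_encode(
    buildChatCompletionsPayload("gpt-5.4-mini", "Class 'App\\Foo' not found", 500, 'max_completion_tokens'),
    JSON_PRETTY_PRINT
);
```

Because the payload is a plain Chat Completions body, the same shape works unchanged against OpenRouter, vLLM, or any other OpenAI API-compatible endpoint.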
Run the test suite and static analysis with:

```bash
composer test
composer analyse
```

Please see the GitHub releases for more information on what has changed.
Pull requests are welcome. Please run the test suite and static analysis before opening a pull request.
If you discover a security vulnerability, please report it privately instead of opening a public issue by emailing urlinee@gmail.com.
The MIT License (MIT). Please see the License File for more information.