
Update default LLM provider documentation to OpenAI GPT-5.3 Codex #38

Merged
japertechnology merged 2 commits into main from
copilot/update-documentation-llm-provider
Feb 27, 2026


Conversation


Copilot AI commented Feb 27, 2026

The runtime settings (.pi/settings.json) already point to openai/gpt-5.3-codex, but documentation and install templates still reference Anthropic as the default.

Changes

  • install/settings.json — Default provider changed from anthropic/claude-sonnet-4-20250514 to openai/gpt-5.3-codex to match the actual .pi/settings.json
  • README.md — Quick-start references OPENAI_API_KEY; Configuration section lists "OpenAI - GPT-5.3 Codex (default)" first; Supported Providers table reordered accordingly
  • PACKAGES.md — "(default provider)" label moved from Anthropic to OpenAI
  • MINIMUM-INTELLIGENCE-INSTALLER.ts — Post-install example key updated to OPENAI_API_KEY
  • github-minimum-intelligence-installation.yml — Welcome issue body example key updated to OPENAI_API_KEY
The resulting install/settings.json:

```json
{
  "defaultProvider": "openai",
  "defaultModel": "gpt-5.3-codex",
  "defaultThinkingLevel": "high"
}
```
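As a minimal sketch of how a runtime might consume this file: it reads the JSON and falls back to the shipped defaults when the file is missing or a key is absent. The `loadSettings` helper, file path, and merge behavior are illustrative assumptions, not the repo's actual installer code; only the three field names come from the JSON above.

```typescript
import * as fs from "fs";

// Shape of the settings file shown above.
interface PiSettings {
  defaultProvider: string;
  defaultModel: string;
  defaultThinkingLevel: string;
}

// Shipped defaults matching this PR's install/settings.json.
const FALLBACK: PiSettings = {
  defaultProvider: "openai",
  defaultModel: "gpt-5.3-codex",
  defaultThinkingLevel: "high",
};

// Hypothetical loader: parse the file if present, otherwise use the
// defaults; spreading FALLBACK first lets any missing key fall back.
function loadSettings(path: string): PiSettings {
  try {
    const raw = JSON.parse(fs.readFileSync(path, "utf8"));
    return { ...FALLBACK, ...raw };
  } catch {
    return FALLBACK;
  }
}
```

Merging the parsed object over the defaults means a partial settings file (say, one that only overrides `defaultThinkingLevel`) still resolves the other keys sensibly.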


…rovider

Co-authored-by: japertechnology <51429990+japertechnology@users.noreply.github.com>
Copilot AI changed the title from "[WIP] Update documentation for default LLM provider to OpenAI 5.3 Codex" to "Update default LLM provider documentation to OpenAI GPT-5.3 Codex" on Feb 27, 2026
@japertechnology japertechnology marked this pull request as ready for review February 27, 2026 02:07
@japertechnology japertechnology merged commit 065bb31 into main Feb 27, 2026
