feat(rig-core): add ChatGPT Subscription, GitHub Copilot, and compatibility providers #1615
Conversation
btw I am building a Rust desktop app and verified the above providers manually :)
5fffd3b to 3193eac
I was also working on a custom provider using OAuth codex login, so I loved seeing this PR already open! This would be amazing to see added here. I'm also working on some agent tools and would love to be able to just reuse my existing codex subscription.
Thanks @RileyMathews! Glad this is helpful for you, too!
3193eac to 89f509b
Thanks for the PR. Overall this looks good, but I have a couple of follow-up concerns about the auth flow.

Device flow polling: the OAuth polling logic is currently hardcoded to a fixed sleep/attempt count and only retries on authorization_pending. GitHub's device flow docs also mention slow_down (the client must add five seconds to its polling interval) and expired_token; should those be handled as well?

Cached token recovery: if a cached access token exists on disk but is stale or revoked, it looks like we return it directly and then fail when the Copilot token bootstrap rejects it. Is there a reason we don't invalidate the cached token and fall back to re-auth in that case? Ideally this should recover automatically so the user does not need to manually clear cached tokens later.
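For what it's worth, a polling loop that honors those error codes might look roughly like this. This is a hedged sketch with hypothetical names (`PollResponse`, `poll_for_token`), not the PR's actual implementation; the poll and sleep dependencies are injected so the loop can be exercised without a network.

```rust
use std::time::Duration;

// Hypothetical outcome of one poll against the device-token endpoint;
// the variant names mirror GitHub's documented error codes.
enum PollResponse {
    Token(String),
    AuthorizationPending,
    SlowDown,     // client must add 5 seconds to its polling interval
    ExpiredToken, // device code expired; the whole flow must restart
}

// Poll until a token arrives, honoring `slow_down` and `expired_token`
// instead of retrying on a fixed schedule.
fn poll_for_token(
    mut interval: Duration,
    expires_in: Duration,
    mut poll: impl FnMut() -> PollResponse,
    mut sleep: impl FnMut(Duration),
) -> Result<String, String> {
    let mut elapsed = Duration::ZERO;
    while elapsed < expires_in {
        match poll() {
            PollResponse::Token(token) => return Ok(token),
            PollResponse::AuthorizationPending => {}
            PollResponse::SlowDown => interval += Duration::from_secs(5),
            PollResponse::ExpiredToken => return Err("device code expired".into()),
        }
        sleep(interval);
        elapsed += interval;
    }
    Err("device flow timed out".into())
}

fn main() {
    // Simulate: pending, then a slow_down, then success.
    let mut responses = vec![
        PollResponse::AuthorizationPending,
        PollResponse::SlowDown,
        PollResponse::Token("gho_example".into()),
    ]
    .into_iter();
    let result = poll_for_token(
        Duration::from_secs(5),
        Duration::from_secs(900),
        move || responses.next().expect("sequence exhausted"),
        |_| {}, // no real sleeping in the demo
    );
    println!("{result:?}");
}
```

Bounding the loop by `expires_in` (the device code's lifetime) rather than a fixed attempt count also gives `expired_token` a natural backstop.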
627613e to c8964ff
Thanks @gold-silver-copper for the great suggestions! Both concerns are now addressed in the latest push.
Also added targeted tests around the new polling and stale-token recovery behavior. Thanks again :)
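The stale-token recovery being discussed has a simple testable shape. Below is a minimal sketch under assumed names (`BootstrapError`, `copilot_api_key` are hypothetical; the real rig-core code differs): try the cached GitHub token, and if the Copilot API-key bootstrap rejects it as unauthorized, invalidate the cache and fall back to re-auth exactly once.

```rust
// Hypothetical error type for the Copilot API-key bootstrap step.
#[derive(Debug, PartialEq)]
enum BootstrapError {
    Unauthorized, // the GitHub token was stale or revoked
    Other(String),
}

// Try the cached GitHub token first; on Unauthorized, invalidate the cache
// and fall back to a fresh auth. Dependencies are injected for testability.
fn copilot_api_key(
    cached: Option<String>,
    mut bootstrap: impl FnMut(&str) -> Result<String, BootstrapError>,
    invalidate_cache: impl FnOnce(),
    reauth: impl FnOnce() -> Result<String, BootstrapError>,
) -> Result<String, BootstrapError> {
    if let Some(token) = cached {
        match bootstrap(&token) {
            Ok(api_key) => return Ok(api_key),
            Err(BootstrapError::Unauthorized) => invalidate_cache(),
            Err(other) => return Err(other),
        }
    }
    let fresh = reauth()?;
    bootstrap(&fresh)
}

fn main() {
    // Stale cached token: bootstrap rejects it, we re-auth and retry.
    let result = copilot_api_key(
        Some("stale".into()),
        |t| {
            if t == "stale" {
                Err(BootstrapError::Unauthorized)
            } else {
                Ok(format!("copilot-key-for-{t}"))
            }
        },
        || println!("cache invalidated"),
        || Ok("fresh".into()),
    );
    println!("{result:?}");
}
```

Only `Unauthorized` triggers the fallback; transport or server errors surface directly, so the loop cannot re-auth forever.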
Introduce upstream-ready providers and compatibility layers for subscription-backed and OpenAI/Anthropic-compatible services.

- add ChatGPT Subscription OAuth with device flow, token refresh, request normalization, and SSE compatibility handling
- add GitHub Copilot OAuth with device flow, cached API key refresh, per-account endpoint routing, and codex responses routing
- add MiniMax, Moonshot, and Z.AI providers for documented OpenAI-compatible and Anthropic-compatible endpoints
- generalize shared OpenAI and Anthropic compatibility plumbing so provider-specific behavior stays centralized and reusable
- expose app-friendly OAuth callbacks for embedding integrations such as Con settings UI
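One small wrinkle the device-flow work has to absorb (also noted in the PR description) is that some backends return the polling `interval` as a JSON string rather than a number. A stdlib-only sketch of the normalization; `IntervalField` and `interval_secs` are hypothetical stand-ins for whatever the real deserialization layer uses:

```rust
// Hypothetical stand-in for a decoded JSON field: some backends send
// `"interval": 5`, others `"interval": "5"`.
enum IntervalField {
    Number(u64),
    Text(String),
}

// Normalize either representation to whole seconds, falling back to a
// default when the field is missing or unparseable.
fn interval_secs(field: Option<IntervalField>, default: u64) -> u64 {
    match field {
        Some(IntervalField::Number(n)) => n,
        Some(IntervalField::Text(s)) => s.trim().parse().unwrap_or(default),
        None => default,
    }
}

fn main() {
    // A string-typed interval is accepted just like a numeric one.
    println!("{}", interval_secs(Some(IntervalField::Text("5".into())), 5));
}
```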
c8964ff to 2a7f954
I'm going to work on fixing the final errors and merging this with #1415 so we can get this in by the end of today. |
Thanks @gold-silver-copper! Did a small patch to make CI happy.
@wey-gu Hey there, I appreciate it, but I'm going to revert it; I've already handled that on my local branch, sorry about that. I'm currently finishing up this PR and it'll be merged by tonight, so don't worry about it :)
Sure! Sorry about that :-P I thought I was helping follow the sun, didn't realize I was breaking your work :D
It's okay, I should've been communicating more :p |
…to wey/subs-providers
This PR adds two subscription-backed providers and three compatibility providers, while keeping provider-specific behavior localized in `rig-core` instead of pushing quirks into callers.

**What is included**

- ChatGPT Subscription
  - `/responses` request normalization for the subscription backend: `instructions` is required, and `system` messages must be lifted out of `input`
  - `content-type` handling; `interval` may be returned as a string
- GitHub Copilot
  - `/responses` routing for codex-class models, while keeping chat-completions for normal chat models
- compatibility providers for documented OpenAI-compatible / Anthropic-compatible APIs
  - MiniMax
  - Moonshot
  - Z.AI

**Why the shared plumbing changed**
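To make the `instructions`/`input` requirement concrete: lifting `system` messages out of `input` into a top-level `instructions` field can be sketched as below. The `Msg` type and `normalize_input` name are hypothetical; the real normalization operates on richer message content than a flat string.

```rust
// Hypothetical flat message shape for a /responses-style payload.
#[derive(Debug, PartialEq, Clone)]
struct Msg {
    role: String,
    content: String,
}

// The subscription backend requires a top-level `instructions` field and
// rejects `system` roles inside `input`, so lift them out and join them.
fn normalize_input(input: Vec<Msg>) -> (String, Vec<Msg>) {
    let (system, rest): (Vec<Msg>, Vec<Msg>) =
        input.into_iter().partition(|m| m.role == "system");
    let instructions = system
        .iter()
        .map(|m| m.content.as_str())
        .collect::<Vec<_>>()
        .join("\n");
    (instructions, rest)
}

fn main() {
    let (instructions, input) = normalize_input(vec![
        Msg { role: "system".into(), content: "Be terse.".into() },
        Msg { role: "user".into(), content: "hi".into() },
    ]);
    println!("instructions = {instructions:?}, input = {input:?}");
}
```

Keeping this in the provider layer is what lets callers send ordinary chat-style message lists without knowing about the backend's quirk.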
**Validation**

- `cargo check -p rig-core`

**Reference alignment**

- `tool_choice=required` and assistant `reasoning_content`: that is handled in this PR.

**Scope note**
This is intentionally one cohesive provider/compatibility change because the subscription providers required small shared compatibility extensions. If maintainers prefer, I can split follow-up work into: