feat: add github copilot provider #230

New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

Merged (2 commits) on Jun 25, 2025
1 change: 1 addition & 0 deletions .gitignore
@@ -44,3 +44,4 @@ Thumbs.db
.opencode/

opencode
opencode.md
72 changes: 56 additions & 16 deletions README.md
@@ -91,22 +91,23 @@ You can enable or disable this feature in your configuration file:

You can configure OpenCode using environment variables:

| Environment Variable | Purpose |
| -------------------------- | -------------------------------------------------------------------------------- |
| `ANTHROPIC_API_KEY` | For Claude models |
| `OPENAI_API_KEY` | For OpenAI models |
| `GEMINI_API_KEY` | For Google Gemini models |
| `GITHUB_TOKEN`             | For GitHub Copilot models (see [Using GitHub Copilot](#using-github-copilot))    |
| `VERTEXAI_PROJECT` | For Google Cloud VertexAI (Gemini) |
| `VERTEXAI_LOCATION` | For Google Cloud VertexAI (Gemini) |
| `GROQ_API_KEY` | For Groq models |
| `AWS_ACCESS_KEY_ID` | For AWS Bedrock (Claude) |
| `AWS_SECRET_ACCESS_KEY` | For AWS Bedrock (Claude) |
| `AWS_REGION` | For AWS Bedrock (Claude) |
| `AZURE_OPENAI_ENDPOINT` | For Azure OpenAI models |
| `AZURE_OPENAI_API_KEY` | For Azure OpenAI models (optional when using Entra ID) |
| `AZURE_OPENAI_API_VERSION` | For Azure OpenAI models |
| `LOCAL_ENDPOINT` | For self-hosted models |
| `SHELL` | Default shell to use (if not specified in config) |

### Shell Configuration

@@ -141,6 +142,9 @@ This is useful if you want to use a different shell than your default system shell.
"apiKey": "your-api-key",
"disabled": false
},
"copilot": {
"disabled": false
},
"groq": {
"apiKey": "your-api-key",
"disabled": false
@@ -211,6 +215,23 @@ OpenCode supports a variety of AI models from different providers:
- Claude 3 Haiku
- Claude 3 Opus

### GitHub Copilot

- GPT-3.5 Turbo
- GPT-4
- GPT-4o
- GPT-4o Mini
- GPT-4.1
- Claude 3.5 Sonnet
- Claude 3.7 Sonnet
- Claude 3.7 Sonnet Thinking
- Claude Sonnet 4
- O1
- O3 Mini
- O4 Mini
- Gemini 2.0 Flash
- Gemini 2.5 Pro

### Google

- Gemini 2.5
@@ -574,6 +595,25 @@ The AI assistant can access LSP features through the `diagnostics` tool, allowin

While the LSP client implementation supports the full LSP protocol (including completions, hover, definition, etc.), currently only diagnostics are exposed to the AI assistant.

## Using GitHub Copilot

_Copilot support is currently experimental._

### Requirements

- [Copilot chat in the IDE](https://github.com/settings/copilot) enabled in GitHub settings
- One of:
  - VS Code GitHub Copilot Chat extension
  - GitHub `gh` CLI
  - Neovim GitHub Copilot plugin (`copilot.vim` or `copilot.lua`)
  - A GitHub token with Copilot permissions

If using one of the above plugins or CLI tools, make sure you authenticate the tool with your GitHub account. This should create a GitHub token at one of the following locations:

- `~/.config/github-copilot/[hosts,apps].json`
- `$XDG_CONFIG_HOME/github-copilot/[hosts,apps].json`

If using an explicit GitHub token, you may either set the `GITHUB_TOKEN` environment variable or add it to the opencode.json config file at `providers.copilot.apiKey`.
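For example, a minimal opencode.json enabling Copilot with an explicit token (the token value below is a placeholder, not a real token format):

```json
{
  "providers": {
    "copilot": {
      "apiKey": "ghu_your_token_here",
      "disabled": false
    }
  }
}
```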

"If using an explicit github token" suggests there is another way. Is there? If not, I would change that wording to offer just two options: the GITHUB_TOKEN variable or the opencode.json setting.

Also, it would be good to add which file to use (hosts.json or apps.json), because those contain different tokens.

Contributor Author

The primary auth method, and also the method recommended by aider.chat and avante.nvim, is to first log in to GitHub Copilot using an IDE (VS Code, JetBrains, Neovim), then let the agent use the token created by the IDE. This ensures that the token has the required scope, as those IDE plugins are first-party and will always generate a token with the correct scopes.

The GITHUB_TOKEN env var should only really be used if you are manually overriding the token for testing, need to toggle between two GitHub Copilot accounts (e.g. personal vs work), or have sufficient knowledge of how to generate an auth token with the correct scopes. Given that API access to Copilot Chat is not formally documented, documenting manual token generation and maintaining that documentation is a support task that no one is really interested in (GitHub has taken additional measures to hide the source code for newer revisions, so the fact that we have a Copilot plugin at all is really just us being lucky that I had enough free time to reverse engineer one).

As far as which JSON file has the correct auth token, this is also obfuscated by GitHub and not documented. Both aider and third-party Neovim plugins basically guess which file has the correct one based on format. The token-searching logic included in this PR is replicated from avante, but is by no means perfect. In theory, it might be possible to preflight each token against the Copilot API, but that kind of edge-case hand-holding is well beyond the scope of this PR.
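The guess-based search described above can be sketched as follows. This is an illustrative sketch, not the PR's exact code; `pickToken` is a hypothetical helper that operates on already-parsed JSON maps:

```go
package main

import (
	"fmt"
	"strings"
)

// pickToken scans parsed copies of hosts.json and apps.json (in that
// order) and returns the first oauth_token found under a key that
// mentions "github.com". This mirrors the format-based guessing the
// comment describes; no token is validated against the Copilot API.
func pickToken(files []map[string]map[string]interface{}) (string, bool) {
	for _, cfg := range files {
		for key, value := range cfg {
			if strings.Contains(key, "github.com") {
				if tok, ok := value["oauth_token"].(string); ok {
					return tok, true
				}
			}
		}
	}
	return "", false
}

func main() {
	hosts := map[string]map[string]interface{}{
		"github.com": {"oauth_token": "gho_example"},
	}
	apps := map[string]map[string]interface{}{}
	if tok, ok := pickToken([]map[string]map[string]interface{}{hosts, apps}); ok {
		fmt.Println("found token:", tok) // prints: found token: gho_example
	}
}
```

In the real provider, the maps come from unmarshalling the two config files, and the first match wins, which is exactly why the hosts.json/apps.json ambiguity the reviewer raises can matter.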


OK, I get that. But I wasn't sure whether this plugin does the search in the github-copilot config files (and I think others might be confused too), or whether the user should do that themselves and provide the token manually inside the opencode.json file, as with other providers (with the env variable as another method).

BTW. Both opencode and your plugin are awesome, thank you all for the work.


## Using a self-hosted model provider

OpenCode can also load and use models from a self-hosted (OpenAI-like) provider.
116 changes: 108 additions & 8 deletions internal/config/config.go
@@ -7,6 +7,7 @@ import (
"log/slog"
"os"
"path/filepath"
"runtime"
"strings"

"github.com/opencode-ai/opencode/internal/llm/models"
@@ -161,6 +162,7 @@ func Load(workingDir string, debug bool) (*Config, error) {
}
if os.Getenv("OPENCODE_DEV_DEBUG") == "true" {
loggingFile := fmt.Sprintf("%s/%s", cfg.Data.Directory, "debug.log")
messagesPath := fmt.Sprintf("%s/%s", cfg.Data.Directory, "messages")

// if file does not exist create it
if _, err := os.Stat(loggingFile); os.IsNotExist(err) {
@@ -172,6 +174,13 @@
}
}

if _, err := os.Stat(messagesPath); os.IsNotExist(err) {
if err := os.MkdirAll(messagesPath, 0o755); err != nil {
return cfg, fmt.Errorf("failed to create directory: %w", err)
}
}
logging.MessageDir = messagesPath

sloggingFileWriter, err := os.OpenFile(loggingFile, os.O_CREATE|os.O_WRONLY|os.O_APPEND, 0o666)
if err != nil {
return cfg, fmt.Errorf("failed to open log file: %w", err)
@@ -245,6 +254,7 @@ func setDefaults(debug bool) {
// environment variables and configuration file.
func setProviderDefaults() {
// Set all API keys we can find in the environment
// Note: Viper does not default if the json apiKey is ""
if apiKey := os.Getenv("ANTHROPIC_API_KEY"); apiKey != "" {
viper.SetDefault("providers.anthropic.apiKey", apiKey)
}
@@ -267,16 +277,32 @@
// api-key may be empty when using Entra ID credentials – that's okay
viper.SetDefault("providers.azure.apiKey", os.Getenv("AZURE_OPENAI_API_KEY"))
}
if apiKey, err := LoadGitHubToken(); err == nil && apiKey != "" {
viper.SetDefault("providers.copilot.apiKey", apiKey)
if viper.GetString("providers.copilot.apiKey") == "" {
viper.Set("providers.copilot.apiKey", apiKey)
}
}

// 1. Copilot
// 2. Anthropic
// 3. OpenAI
// 4. Google Gemini
// 5. Groq
// 6. OpenRouter
// 7. AWS Bedrock
// 8. Azure
// 9. Google Cloud VertexAI

// copilot configuration
if key := viper.GetString("providers.copilot.apiKey"); strings.TrimSpace(key) != "" {
viper.SetDefault("agents.coder.model", models.CopilotGPT4o)
viper.SetDefault("agents.summarizer.model", models.CopilotGPT4o)
viper.SetDefault("agents.task.model", models.CopilotGPT4o)
viper.SetDefault("agents.title.model", models.CopilotGPT4o)
return
}

// Anthropic configuration
if key := viper.GetString("providers.anthropic.apiKey"); strings.TrimSpace(key) != "" {
@@ -399,6 +425,14 @@ func hasVertexAICredentials() bool {
return false
}

func hasCopilotCredentials() bool {
// Check for explicit Copilot parameters
if token, _ := LoadGitHubToken(); token != "" {
return true
}
return false
}

// readConfig handles the result of reading a configuration file.
func readConfig(err error) error {
if err == nil {
@@ -440,6 +474,9 @@ func applyDefaultValues() {
// It validates model IDs and providers, ensuring they are supported.
func validateAgent(cfg *Config, name AgentName, agent Agent) error {
// Check if model exists
// TODO: If a copilot model is specified, but model is not found,
// it might be new model. The https://api.githubcopilot.com/models
// endpoint should be queried to validate if the model is supported.
model, modelExists := models.SupportedModels[agent.Model]
if !modelExists {
logging.Warn("unsupported model configured, reverting to default",
@@ -584,6 +621,7 @@ func Validate() error {
// Validate providers
for provider, providerCfg := range cfg.Providers {
if providerCfg.APIKey == "" && !providerCfg.Disabled {
fmt.Printf("provider has no API key, marking as disabled %s\n", provider)
logging.Warn("provider has no API key, marking as disabled", "provider", provider)
providerCfg.Disabled = true
cfg.Providers[provider] = providerCfg
@@ -631,6 +669,18 @@ func getProviderAPIKey(provider models.ModelProvider) string {

// setDefaultModelForAgent sets a default model for an agent based on available providers
func setDefaultModelForAgent(agent AgentName) bool {
if hasCopilotCredentials() {
maxTokens := int64(5000)
if agent == AgentTitle {
maxTokens = 80
}

cfg.Agents[agent] = Agent{
Model: models.CopilotGPT4o,
MaxTokens: maxTokens,
}
return true
}
// Check providers in order of preference
if apiKey := os.Getenv("ANTHROPIC_API_KEY"); apiKey != "" {
maxTokens := int64(5000)
@@ -878,3 +928,53 @@ func UpdateTheme(themeName string) error {
config.TUI.Theme = themeName
})
}

// LoadGitHubToken tries to load a GitHub token from all possible locations
func LoadGitHubToken() (string, error) {
// First check environment variable
if token := os.Getenv("GITHUB_TOKEN"); token != "" {
return token, nil
}

// Get config directory
var configDir string
if xdgConfig := os.Getenv("XDG_CONFIG_HOME"); xdgConfig != "" {
configDir = xdgConfig
} else if runtime.GOOS == "windows" {
if localAppData := os.Getenv("LOCALAPPDATA"); localAppData != "" {
configDir = localAppData
} else {
configDir = filepath.Join(os.Getenv("HOME"), "AppData", "Local")
}
} else {
configDir = filepath.Join(os.Getenv("HOME"), ".config")
}

// Try both hosts.json and apps.json files
filePaths := []string{
filepath.Join(configDir, "github-copilot", "hosts.json"),
filepath.Join(configDir, "github-copilot", "apps.json"),
}

for _, filePath := range filePaths {
data, err := os.ReadFile(filePath)
if err != nil {
continue
}

var config map[string]map[string]interface{}
if err := json.Unmarshal(data, &config); err != nil {
continue
}

for key, value := range config {
if strings.Contains(key, "github.com") {
if oauthToken, ok := value["oauth_token"].(string); ok {
return oauthToken, nil
}
}
}
}

return "", fmt.Errorf("GitHub token not found in standard locations")
}
26 changes: 21 additions & 5 deletions internal/llm/agent/agent.go
@@ -162,6 +162,7 @@ func (a *agent) generateTitle(ctx context.Context, sessionID string, content str
if err != nil {
return err
}
ctx = context.WithValue(ctx, tools.SessionIDContextKey, sessionID)
parts := []message.ContentPart{message.TextContent{Text: content}}
response, err := a.titleProvider.SendMessages(
ctx,
@@ -230,6 +231,7 @@ }
}

func (a *agent) processGeneration(ctx context.Context, sessionID, content string, attachmentParts []message.ContentPart) AgentEvent {
cfg := config.Get()
// List existing messages; if none, start title generation asynchronously.
msgs, err := a.messages.List(ctx, sessionID)
if err != nil {
@@ -288,7 +290,13 @@ }
}
return a.err(fmt.Errorf("failed to process events: %w", err))
}
if cfg.Debug {
seqId := (len(msgHistory) + 1) / 2
toolResultFilepath := logging.WriteToolResultsJson(sessionID, seqId, toolResults)
logging.Info("Result", "message", agentMessage.FinishReason(), "toolResults", "{}", "filepath", toolResultFilepath)
} else {
logging.Info("Result", "message", agentMessage.FinishReason(), "toolResults", toolResults)
}
if (agentMessage.FinishReason() == message.FinishReasonToolUse) && toolResults != nil {
// We are not done, we need to respond with the tool response
msgHistory = append(msgHistory, agentMessage, *toolResults)
@@ -312,6 +320,7 @@ func (a *agent) createUserMessage(ctx context.Context, sessionID, content string
}

func (a *agent) streamAndHandleEvents(ctx context.Context, sessionID string, msgHistory []message.Message) (message.Message, *message.Message, error) {
ctx = context.WithValue(ctx, tools.SessionIDContextKey, sessionID)
eventChan := a.provider.StreamResponse(ctx, msgHistory, a.tools)

assistantMsg, err := a.messages.Create(ctx, sessionID, message.CreateMessageParams{
@@ -325,7 +334,6 @@

// Add the session and message ID into the context if needed by tools.
ctx = context.WithValue(ctx, tools.MessageIDContextKey, assistantMsg.ID)

// Process each event in the stream.
for event := range eventChan {
@@ -357,10 +365,17 @@
default:
// Continue processing
var tool tools.BaseTool
for _, availableTool := range a.tools {
if availableTool.Info().Name == toolCall.Name {
tool = availableTool
break
}
// Monkey patch for Copilot Sonnet-4 tool repetition obfuscation
// if strings.HasPrefix(toolCall.Name, availableTool.Info().Name) &&
// strings.HasPrefix(toolCall.Name, availableTool.Info().Name+availableTool.Info().Name) {
// tool = availableTool
// break
// }
}

// Tool not found
@@ -553,6 +568,7 @@ func (a *agent) Summarize(ctx context.Context, sessionID string) error {
a.Publish(pubsub.CreatedEvent, event)
return
}
summarizeCtx = context.WithValue(summarizeCtx, tools.SessionIDContextKey, sessionID)

if len(msgs) == 0 {
event = AgentEvent{