
Copilot Chat for Neovim

(Demo video: cchat.mp4)

CopilotChat.nvim is a Neovim plugin that brings GitHub Copilot Chat capabilities directly into your editor. It provides:

  • 🤖 GitHub Copilot Chat integration with official model and agent support (GPT-4o, Claude 3.7 Sonnet, Gemini 2.0 Flash, and more)
  • 💻 Rich workspace context powered by smart embeddings system
  • 🔒 Explicit context sharing - only sends what you specifically request, either as context or selection
  • 🔌 Modular provider architecture supporting both official and custom LLM backends (Ollama, LM Studio, and more)
  • 📝 Interactive chat UI with completion, diffs and quickfix integration
  • 🎯 Powerful prompt system with composable templates and sticky prompts
  • 🔄 Extensible context providers for granular workspace understanding (buffers, files, git diffs, URLs, and more)
  • ⚡ Efficient token usage with tiktoken optimization
  • 📜 Intelligent chat memory management with automatic summarization to handle lengthy conversations

Requirements

Warning

For Neovim < 0.11.0, add `noinsert` and `popup` to your `completeopt`, otherwise autocompletion will not work.
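
In a Lua config, this can be done with the option API, for example:

```lua
-- Needed only on Neovim < 0.11.0 so the chat completion menu works
vim.opt.completeopt:append('noinsert')
vim.opt.completeopt:append('popup')
```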

Optional Dependencies

Integration with pickers

For various plugin pickers to work correctly, you need to replace vim.ui.select with your desired picker (as the default vim.ui.select is very basic). Here are some examples:

  • fzf-lua - call require('fzf-lua').register_ui_select()
  • telescope - setup telescope-ui-select.nvim plugin
  • snacks.picker - enable ui_select config
  • mini.pick - set vim.ui.select = require('mini.pick').ui_select
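
For example, with fzf-lua or mini.pick the replacement is a single line (a sketch; use whichever matches your setup):

```lua
-- Option 1: let fzf-lua take over vim.ui.select
require('fzf-lua').register_ui_select()

-- Option 2: point vim.ui.select at mini.pick's implementation
-- vim.ui.select = require('mini.pick').ui_select
```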

Plugin features that use picker:

  • :CopilotChatPrompts - for selecting prompts
  • :CopilotChatModels - for selecting models
  • :CopilotChatAgents - for selecting agents
  • #<context>:<input> - for selecting context input

Installation

return {
  {
    "CopilotC-Nvim/CopilotChat.nvim",
    dependencies = {
      { "github/copilot.vim" }, -- or zbirenbaum/copilot.lua
      { "nvim-lua/plenary.nvim", branch = "master" }, -- for curl, log and async functions
    },
    build = "make tiktoken", -- Only on macOS or Linux
    opts = {
      -- See Configuration section for options
    },
    -- See Commands section for default commands if you want to lazy load on them
  },
}

See @jellydn for configuration

Using vim-plug, you can use a configuration similar to the lazy.nvim setup:

call plug#begin()
Plug 'github/copilot.vim'
Plug 'nvim-lua/plenary.nvim'
Plug 'CopilotC-Nvim/CopilotChat.nvim'
call plug#end()

lua << EOF
require("CopilotChat").setup {
  -- See Configuration section for options
}
EOF

Manual

  1. Put the files in the right place
mkdir -p ~/.config/nvim/pack/copilotchat/start
cd ~/.config/nvim/pack/copilotchat/start

git clone https://github.com/github/copilot.vim
git clone https://github.com/nvim-lua/plenary.nvim

git clone https://github.com/CopilotC-Nvim/CopilotChat.nvim
  2. Add to your configuration (e.g. ~/.config/nvim/init.lua)
require("CopilotChat").setup {
  -- See Configuration section for options
}

See @deathbeam for configuration

Features

Commands

Commands are used to control the chat interface:

| Command | Description |
| --- | --- |
| `:CopilotChat <input>?` | Open chat with optional input |
| `:CopilotChatOpen` | Open chat window |
| `:CopilotChatClose` | Close chat window |
| `:CopilotChatToggle` | Toggle chat window |
| `:CopilotChatStop` | Stop current output |
| `:CopilotChatReset` | Reset chat window |
| `:CopilotChatSave <name>?` | Save chat history |
| `:CopilotChatLoad <name>?` | Load chat history |
| `:CopilotChatPrompts` | View/select prompt templates |
| `:CopilotChatModels` | View/select available models |
| `:CopilotChatAgents` | View/select available agents |
| `:CopilotChat<PromptName>` | Use specific prompt template |

Key Mappings

Default mappings in the chat interface:

| Insert | Normal | Action |
| --- | --- | --- |
| `<Tab>` | - | Trigger/accept completion menu for tokens |
| `<C-c>` | `q` | Close the chat window |
| `<C-l>` | `<C-l>` | Reset and clear the chat window |
| `<C-s>` | `<CR>` | Submit the current prompt |
| - | `grr` | Toggle sticky prompt for line under cursor |
| - | `grx` | Clear all sticky prompts in prompt |
| `<C-y>` | `<C-y>` | Accept nearest diff |
| - | `gj` | Jump to section of nearest diff |
| - | `gqa` | Add all answers from chat to quickfix list |
| - | `gqd` | Add all diffs from chat to quickfix list |
| - | `gy` | Yank nearest diff to register |
| - | `gd` | Show diff between source and nearest diff |
| - | `gi` | Show info about current chat |
| - | `gc` | Show current chat context |
| - | `gh` | Show help message |

The mappings can be customized by setting the mappings table in your configuration. Each mapping can have:

  • normal: Key for normal mode
  • insert: Key for insert mode

For example, to change the submit prompt mapping or show_diff full diff option:

{
  mappings = {
    submit_prompt = {
      normal = '<Leader>s',
      insert = '<C-s>'
    },
    show_diff = {
      full_diff = true
    }
  }
}

Prompts

Predefined Prompts

Predefined prompt templates for common tasks. Reference them with /PromptName in chat, use :CopilotChat<PromptName> or :CopilotChatPrompts to select them:

| Prompt | Description |
| --- | --- |
| `Explain` | Write an explanation for the selected code |
| `Review` | Review the selected code |
| `Fix` | Rewrite the code with bug fixes |
| `Optimize` | Optimize code for performance and readability |
| `Docs` | Add documentation comments to the code |
| `Tests` | Generate tests for the code |
| `Commit` | Write commit message using commitizen convention |

Define your own prompts in the configuration:

{
  prompts = {
    MyCustomPrompt = {
      prompt = 'Explain how it works.',
      system_prompt = 'You are very good at explaining stuff',
      mapping = '<leader>ccmc',
      description = 'My custom prompt description',
    }
  }
}

System Prompts

System prompts define the AI model's behavior. Reference them with /PROMPT_NAME in chat:

| Prompt | Description |
| --- | --- |
| `COPILOT_INSTRUCTIONS` | Base instructions |
| `COPILOT_EXPLAIN` | Adds coding tutor behavior |
| `COPILOT_REVIEW` | Adds code review behavior with diagnostics |

Define your own system prompts in the configuration (similar to prompts):

{
  prompts = {
    Yarrr = {
      system_prompt = 'You are fascinated by pirates, so please respond in pirate speak.',
    }
  }
}

Sticky Prompts

Sticky prompts persist across chat sessions. They're useful for maintaining context or agent selection. They work as follows:

  1. Prefix text with > using markdown blockquote syntax
  2. The prompt will be copied at the start of every new chat prompt
  3. Edit sticky prompts freely while maintaining the > prefix

Examples:

> #files
> List all files in the workspace

> @models Using Mistral-small
> What is 1 + 11

You can also set default sticky prompts in the configuration:

{
  sticky = {
    '@models Using Mistral-small',
    '#files',
  }
}

Models and Agents

Models

You can control which AI model to use in three ways:

  1. List available models with :CopilotChatModels
  2. Set model in prompt with $model_name
  3. Configure default model via model config key
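
Putting these together, the default model can be set once in the configuration, while `$model_name` overrides it for a single prompt (model names here are illustrative; run `:CopilotChatModels` for the actual list):

```lua
require("CopilotChat").setup({
  model = 'gpt-4o', -- default model used for every chat
})
```

In the chat buffer, a prompt such as `$claude-3.7-sonnet Explain this code` would use that model for this question only.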

For the list of supported models, run :CopilotChatModels.

Agents

Agents determine the AI assistant's capabilities. Control agents in three ways:

  1. List available agents with :CopilotChatAgents
  2. Set agent in prompt with @agent_name
  3. Configure default agent via agent config key

The default agent is `none`, which is a no-op. For the list of available agents, run :CopilotChatAgents.
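
The same pattern as models applies here: set the default agent in the configuration, and override it per prompt with `@agent_name`:

```lua
require("CopilotChat").setup({
  agent = 'copilot', -- default agent (see the Configuration section)
})
```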

Contexts

Contexts provide additional information to the chat. Add context using #context_name[:input] syntax:

| Context | Input Support | Description |
| --- | --- | --- |
| `buffer` | ✓ (number) | Current or specified buffer content |
| `buffers` | ✓ (type) | All buffers content (listed/all) |
| `file` | ✓ (path) | Content of specified file |
| `files` | ✓ (glob) | Workspace files |
| `filenames` | ✓ (glob) | Workspace file names |
| `git` | ✓ (ref) | Git diff (unstaged/staged/commit) |
| `url` | ✓ (url) | Content from URL |
| `register` | ✓ (name) | Content of vim register |
| `quickfix` | - | Quickfix list file contents |
| `system` | ✓ (command) | Output of shell command |

Examples:

> #buffer
> #buffer:2
> #files:*.lua
> #filenames
> #git:staged
> #url:https://example.com
> #system:`ls -la | grep lua`

Define your own contexts in the configuration with input handling and resolution:

{
  contexts = {
    birthday = {
      input = function(callback)
        vim.ui.select({ 'user', 'napoleon' }, {
          prompt = 'Select birthday> ',
        }, callback)
      end,
      resolve = function(input)
        return {
          {
            content = input .. ' birthday info',
            filename = input .. '_birthday',
            filetype = 'text',
          }
        }
      end
    }
  }
}

Selections

Selections determine the source content for chat interactions.

Available selections are provided by the `CopilotChat.select` module (`local select = require("CopilotChat.select")`):

| Selection | Description |
| --- | --- |
| `visual` | Current visual selection |
| `buffer` | Current buffer content |
| `line` | Current line content |
| `unnamed` | Unnamed register (last deleted/changed/yanked content) |

You can set a default selection in the configuration:

{
  -- Default uses visual selection or falls back to buffer
  selection = function(source)
    return select.visual(source) or select.buffer(source)
  end
}

Providers

Providers are modules that implement integration with different AI providers.

Built-in Providers

  • copilot - Default GitHub Copilot provider used for chat
  • github_models - Provider for GitHub Marketplace models
  • copilot_embeddings - Provider for Copilot embeddings, not standalone

Provider Interface

Custom providers can implement these methods:

{
  -- Optional: Disable provider
  disabled?: boolean,

  -- Optional: Embeddings provider name or function
  embed?: string|function,

  -- Optional: Get extra request headers with optional expiration time
  get_headers?(): table<string,string>, number?,

  -- Optional: Get API endpoint URL
  get_url?(opts: CopilotChat.Provider.options): string,

  -- Optional: Prepare request input
  prepare_input?(inputs: table<CopilotChat.Provider.input>, opts: CopilotChat.Provider.options): table,

  -- Optional: Prepare response output
  prepare_output?(output: table, opts: CopilotChat.Provider.options): CopilotChat.Provider.output,

  -- Optional: Get available models
  get_models?(headers: table): table<CopilotChat.Provider.model>,

  -- Optional: Get available agents
  get_agents?(headers: table): table<CopilotChat.Provider.agent>,
}

External Providers

For external providers (Ollama, LM Studio), see the external providers wiki page.
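
As a sketch of the interface above, a minimal custom provider for an OpenAI-compatible local server could look like the following. The endpoint URL, model id, and response field names are assumptions for illustration; see the external providers wiki page for tested configurations:

```lua
require("CopilotChat").setup({
  providers = {
    -- Hypothetical provider for a local OpenAI-compatible endpoint
    local_llm = {
      embed = 'copilot_embeddings', -- reuse the built-in embeddings provider

      get_url = function(opts)
        return 'http://localhost:11434/v1/chat/completions' -- assumed endpoint
      end,

      get_headers = function()
        return { ['Content-Type'] = 'application/json' }
      end,

      get_models = function(headers)
        -- Static list for illustration; a real provider might query the server
        return { { id = 'llama3', name = 'Llama 3 (local)' } }
      end,

      prepare_input = function(inputs, opts)
        -- Shape the request body the way the endpoint expects
        return {
          model = opts.model,
          messages = inputs,
          stream = true,
        }
      end,

      prepare_output = function(output, opts)
        -- Map the endpoint's response back to the plugin's output shape
        local choice = output.choices and output.choices[1]
        local message = choice and (choice.delta or choice.message)
        return {
          content = message and message.content,
          finish_reason = choice and choice.finish_reason,
        }
      end,
    },
  },
})
```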

Configuration

Default Configuration

Below are all available configuration options with their default values:

{

  -- Shared config starts here (can be passed to functions at runtime and configured via setup function)

  system_prompt = 'COPILOT_INSTRUCTIONS', -- System prompt to use (can be specified manually in prompt via /).

  model = 'gpt-4o', -- Default model to use, see ':CopilotChatModels' for available models (can be specified manually in prompt via $).
  agent = 'copilot', -- Default agent to use, see ':CopilotChatAgents' for available agents (can be specified manually in prompt via @).
  context = nil, -- Default context or array of contexts to use (can be specified manually in prompt via #).
  sticky = nil, -- Default sticky prompt or array of sticky prompts to use at start of every new chat.

  temperature = 0.1, -- GPT result temperature
  headless = false, -- Do not write to chat buffer and do not use history (useful with the callback for custom processing)
  callback = nil, -- Callback to use when ask response is received
  remember_as_sticky = true, -- Remember model/agent/context as sticky prompts when asking questions

  -- default selection
  -- see select.lua for implementation
  selection = function(source)
    return select.visual(source) or select.buffer(source)
  end,

  -- default window options
  window = {
    layout = 'vertical', -- 'vertical', 'horizontal', 'float', 'replace'
    width = 0.5, -- fractional width of parent, or absolute width in columns when > 1
    height = 0.5, -- fractional height of parent, or absolute height in rows when > 1
    -- Options below only apply to floating windows
    relative = 'editor', -- 'editor', 'win', 'cursor', 'mouse'
    border = 'single', -- 'none', 'single', 'double', 'rounded', 'solid', 'shadow'
    row = nil, -- row position of the window, default is centered
    col = nil, -- column position of the window, default is centered
    title = 'Copilot Chat', -- title of chat window
    footer = nil, -- footer of chat window
    zindex = 1, -- determines if window is on top or below other floating windows
  },

  show_help = true, -- Shows help message as virtual lines when waiting for user input
  highlight_selection = true, -- Highlight selection
  highlight_headers = true, -- Highlight headers in chat, disable if using markdown renderers (like render-markdown.nvim)
  references_display = 'virtual', -- 'virtual', 'write', Display references in chat as virtual text or write to buffer
  auto_follow_cursor = true, -- Auto-follow cursor in chat
  auto_insert_mode = false, -- Automatically enter insert mode when opening window and on new prompt
  insert_at_end = false, -- Move cursor to end of buffer when inserting text
  clear_chat_on_new_prompt = false, -- Clears chat on every new prompt

  -- Static config starts here (can be configured only via setup function)

  debug = false, -- Enable debug logging (same as `log_level = 'debug'`)
  log_level = 'info', -- Log level to use, 'trace', 'debug', 'info', 'warn', 'error', 'fatal'
  proxy = nil, -- [protocol://]host[:port] Use this proxy
  allow_insecure = false, -- Allow insecure server connections

  chat_autocomplete = true, -- Enable chat autocompletion (when disabled, requires manual `mappings.complete` trigger)

  log_path = vim.fn.stdpath('state') .. '/CopilotChat.log', -- Default path to log file
  history_path = vim.fn.stdpath('data') .. '/copilotchat_history', -- Default path to stored history

  question_header = '# User ', -- Header to use for user questions
  answer_header = '# Copilot ', -- Header to use for AI answers
  error_header = '# Error ', -- Header to use for errors
  separator = '───', -- Separator to use in chat

  -- default providers
  -- see config/providers.lua for implementation
  providers = {
    copilot = {
    },
    github_models = {
    },
    copilot_embeddings = {
    },
  },

  -- default contexts
  -- see config/contexts.lua for implementation
  contexts = {
    buffer = {
    },
    buffers = {
    },
    file = {
    },
    files = {
    },
    git = {
    },
    url = {
    },
    register = {
    },
    quickfix = {
    },
    system = {
    }
  },

  -- default prompts
  -- see config/prompts.lua for implementation
  prompts = {
    Explain = {
      prompt = 'Write an explanation for the selected code as paragraphs of text.',
      system_prompt = 'COPILOT_EXPLAIN',
    },
    Review = {
      prompt = 'Review the selected code.',
      system_prompt = 'COPILOT_REVIEW',
    },
    Fix = {
      prompt = 'There is a problem in this code. Identify the issues and rewrite the code with fixes. Explain what was wrong and how your changes address the problems.',
    },
    Optimize = {
      prompt = 'Optimize the selected code to improve performance and readability. Explain your optimization strategy and the benefits of your changes.',
    },
    Docs = {
      prompt = 'Please add documentation comments to the selected code.',
    },
    Tests = {
      prompt = 'Please generate tests for my code.',
    },
    Commit = {
      prompt = 'Write commit message for the change with commitizen convention. Keep the title under 50 characters and wrap message at 72 characters. Format as a gitcommit code block.',
      context = 'git:staged',
    },
  },

  -- default mappings
  -- see config/mappings.lua for implementation
  mappings = {
    complete = {
      insert = '<Tab>',
    },
    close = {
      normal = 'q',
      insert = '<C-c>',
    },
    reset = {
      normal = '<C-l>',
      insert = '<C-l>',
    },
    submit_prompt = {
      normal = '<CR>',
      insert = '<C-s>',
    },
    toggle_sticky = {
      normal = 'grr',
    },
    clear_stickies = {
      normal = 'grx',
    },
    accept_diff = {
      normal = '<C-y>',
      insert = '<C-y>',
    },
    jump_to_diff = {
      normal = 'gj',
    },
    quickfix_answers = {
      normal = 'gqa',
    },
    quickfix_diffs = {
      normal = 'gqd',
    },
    yank_diff = {
      normal = 'gy',
      register = '"', -- Default register to use for yanking
    },
    show_diff = {
      normal = 'gd',
      full_diff = false, -- Show full diff instead of unified diff when showing diff window
    },
    show_info = {
      normal = 'gi',
    },
    show_context = {
      normal = 'gc',
    },
    show_help = {
      normal = 'gh',
    },
  },
}

Customizing Buffers

Types of copilot buffers:

  • copilot-chat - Main chat buffer
  • copilot-overlay - Overlay buffers (e.g. help, info, diff)

You can set local options for plugin buffers like this:

vim.api.nvim_create_autocmd('BufEnter', {
    pattern = 'copilot-*',
    callback = function()
        -- Set buffer-local options
        vim.opt_local.relativenumber = false
        vim.opt_local.number = false
        vim.opt_local.conceallevel = 0
    end
})

Customizing Highlights

Types of copilot highlights:

  • CopilotChatHeader - Header highlight in chat buffer
  • CopilotChatSeparator - Separator highlight in chat buffer
  • CopilotChatStatus - Status and spinner in chat buffer
  • CopilotChatHelp - Help messages in chat buffer (help, references)
  • CopilotChatSelection - Selection highlight in source buffer
  • CopilotChatKeyword - Keyword highlight in chat buffer (e.g. prompts, contexts)
  • CopilotChatInput - Input highlight in chat buffer (for contexts)
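
These are ordinary highlight groups, so they can be linked or redefined after setup (the colors below are arbitrary examples):

```lua
-- Link a chat group to an existing one, or define it directly
vim.api.nvim_set_hl(0, 'CopilotChatHeader', { link = 'Title' })
vim.api.nvim_set_hl(0, 'CopilotChatSelection', { bg = '#3b4261' })
```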

API Reference

Core Chat Functions

local chat = require("CopilotChat")

-- Basic Chat Functions
chat.ask(prompt, config)      -- Ask a question with optional config
chat.response()               -- Get the last response text
chat.resolve_prompt()         -- Resolve prompt references
chat.resolve_context()        -- Resolve context embeddings (WARN: async, requires plenary.async.run)
chat.resolve_agent()          -- Resolve agent from prompt (WARN: async, requires plenary.async.run)
chat.resolve_model()          -- Resolve model from prompt (WARN: async, requires plenary.async.run)

-- Window Management
chat.open(config)             -- Open chat window with optional config
chat.close()                  -- Close chat window
chat.toggle(config)           -- Toggle chat window visibility with optional config
chat.reset()                  -- Reset the chat
chat.stop()                   -- Stop current output

-- Selection Management
chat.get_selection()                                   -- Get the current selection
chat.set_selection(bufnr, start_line, end_line, clear) -- Set or clear selection

-- Prompt & Context Management
chat.select_prompt(config)    -- Open prompt selector with optional config
chat.select_model()           -- Open model selector
chat.select_agent()           -- Open agent selector
chat.prompts()                -- Get all available prompts

-- Completion
chat.trigger_complete()       -- Trigger completion in chat window
chat.complete_info()          -- Get completion info for custom providers
chat.complete_items()         -- Get completion items (WARN: async, requires plenary.async.run)

-- History Management
chat.save(name, history_path) -- Save chat history
chat.load(name, history_path) -- Load chat history

-- Configuration
chat.setup(config)            -- Update configuration
chat.log_level(level)         -- Set log level (debug, info, etc.)

Chat Window UI API

You can also access the chat window UI methods through the chat.chat object:

local chat = require("CopilotChat")

-- Chat UI State
chat.chat:visible()             -- Check if chat window is visible
chat.chat:focused()             -- Check if chat window is focused

-- Content Management
chat.chat:get_prompt()          -- Get current prompt from chat window
chat.chat:set_prompt(prompt)    -- Set prompt in chat window
chat.chat:add_sticky(sticky)    -- Add sticky prompt to chat window
chat.chat:append(text)          -- Append text to chat window
chat.chat:clear()               -- Clear chat window content
chat.chat:finish()              -- Finish writing to chat window

-- Navigation
chat.chat:follow()              -- Move cursor to end of chat content
chat.chat:focus()               -- Focus the chat window

-- Advanced Features
chat.chat:get_closest_section() -- Get section closest to cursor
chat.chat:get_closest_block()   -- Get code block closest to cursor
chat.chat:overlay(opts)         -- Show overlay with specified options

Example Usage

-- Open chat, ask a question and handle response
require("CopilotChat").open()
require("CopilotChat").ask("Explain this code", {
  callback = function(response)
    vim.notify("Got response: " .. response:sub(1, 50) .. "...")
  end,
  context = "#buffer"
})

-- Save and load chat history
require("CopilotChat").save("my_debugging_session")
require("CopilotChat").load("my_debugging_session")

-- Use custom context and model
require("CopilotChat").ask("How can I optimize this?", {
  model = "gpt-4o",
  context = {"#buffer", "#git:staged"}
})

For more examples, see the examples wiki page.

Development

Setup

To set up the environment:

  1. Clone the repository:
git clone https://github.com/CopilotC-Nvim/CopilotChat.nvim
cd CopilotChat.nvim
  2. Install development dependencies:
# Install pre-commit hooks
make install-pre-commit

To run tests:

make test

Contributing

  1. Fork the repository
  2. Create your feature branch
  3. Make your changes
  4. Run tests and lint checks
  5. Submit a pull request

See CONTRIBUTING.md for detailed guidelines.

Contributors

Thanks goes to these wonderful people (emoji key: 💻 code, 📖 documentation):

  • gptlang (💻 📖)
  • Dung Duc Huynh (Kaka) (💻 📖)
  • Ahmed Haracic (💻)
  • Trí Thiện Nguyễn (💻)
  • He Zhizhou (💻)
  • Guruprakash Rajakkannu (💻)
  • kristofka (💻)
  • PostCyberPunk (📖)
  • Katsuhiko Nishimra (💻)
  • Erno Hopearuoho (💻)
  • Shaun Garwood (💻)
  • neutrinoA4 (💻 📖)
  • Jack Muratore (💻)
  • Adriel Velazquez (💻 📖)
  • Tomas Slusny (💻 📖)
  • Nisal (📖)
  • Tobias Gårdhus (📖)
  • Petr Dlouhý (📖)
  • Dylan Madisetti (💻)
  • Aaron Weisberg (💻 📖)
  • Jose Tlacuilo (💻 📖)
  • Kevin Traver (💻 📖)
  • dTry (💻)
  • Arata Furukawa (💻)
  • Ling (💻)
  • Ivan Frolov (💻)
  • Folke Lemaitre (💻 📖)
  • GitMurf (💻)
  • Dmitrii Lipin (💻)
  • jinzhongjia (📖)
  • guill (💻)
  • Sjon-Paul Brown (💻)
  • Renzo Mondragón (💻 📖)
  • fjchen7 (💻)
  • Radosław Woźniak (💻)
  • JakubPecenka (💻)
  • thomastthai (📖)
  • Tomáš Janoušek (💻)
  • Toddneal Stallworth (📖)
  • Sergey Alexandrov (💻)
  • Léopold Mebazaa (💻)
  • JunKi Jin (💻)
  • abdennourzahaf (📖)
  • Josiah (💻)
  • Tony Fischer (💻 📖)
  • Kohei Wada (💻)
  • Sebastian Yaghoubi (📖)
  • johncming (💻)
  • Rokas Brazdžionis (💻)
  • Sola (📖 💻)
  • Mani Chandra (💻)

This project follows the all-contributors specification. Contributions of any kind are welcome!

Stargazers

Stargazers over time