
Issue search results · repo:Robitx/gp.nvim language:Lua

148 results

Now the o1 model is available via the API. Please modify the list of agents to add it. I tried adding this to the gp.nvim/lua/gp/config.lua agents list: { provider = "openai", name = "ChatGPTo1", chat ...
  • JohnnyVM
  • Opened 15 days ago
  • #254

Currently, while output is streaming, the chat scrolls down automatically, so it is difficult to read or copy the content that is already there. This becomes a problem when you use ...
  • Shallow-Seek
  • Opened 28 days ago
  • #253

Error executing luv callback: ...vim/site/pack/packer/start/gp.nvim/lua/gp/dispatcher.lua:244: attempt to index field 'choices' (a nil value) stack traceback: ...vim/site/pack/packer/start/gp.nvim/lua/gp/dispatcher.lua:244: ...
  • danielreis1
  • Opened 29 days ago
  • #252

Hello and thank you for your great plugin! I'm trying to use it with some exotic LLMs which are not supported out of the box, and it's kind of a hard thing to do. For example, I need to edit payload generation ...
  • seroperson
  • Opened on Feb 21
  • #251

Here's my gp.log and config: use({ "robitx/gp.nvim", config = function() local GROQ_KEY = "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" local GROQ_HOST = "https://api.groq.com/openai/v1/chat/completions" ...
  • Mike20403
  • Opened on Feb 9
  • #249

With the release of o3-mini we need to add support for it, similar to the o1 models. Crude implementation idea: if provider == "openai" and (model.model:sub(1, 2) == "o1" or model.model == "o3-mini") then ...
  • Piotr1215
  • 3
  • Opened 
    on Jan 31
  • #245

I've been using Copilot in VS Code, but I don't have ~/.config/github-copilot/hosts.json on my machine.
  • Malakasd748
  • 4
  • Opened 
    on Jan 28
  • #244

I'd love to add support for snacks.nvim for popups. I'm happy to do the work; I just wanted to create an issue so it can be agreed on and tracked.
  • joshmedeski
  • 2
  • Opened 
    on Jan 17
  • #242

LLM: Gemini 2.0 Flash Experimental. Issue: Gemini 2.0 Flash Experimental requires the topK value to be anywhere between 1 and 41, but it's impossible to set this up in gp.nvim settings.
  • nxtkofi
  • 1
  • Opened 
    on Jan 2
  • #241

Hello! I'm trying to add a custom LLM using Ollama. My plan is to add an indexed llama3.2. I've managed to do it, but I can't seem to format answers properly. app = FastAPI() llm = Ollama(model="llama3.2:3b" ...
  • nxtkofi
  • 3
  • Opened 
    on Dec 27, 2024
  • #240