Insights: ggml-org/llama.vim
Overview
3 Pull requests merged by 1 person
- core : add speculative fim (#53, merged Mar 10, 2025)
- core : decouple rendering from server requests (#52, merged Mar 10, 2025)
- core : improve suggestions starting with empty lines (#51, merged Mar 10, 2025)
1 Pull request opened by 1 person
- core : fix indent termination logic (#54, opened Mar 11, 2025)
1 Issue closed by 1 person
- cache: Speculative FIM (#17, closed Mar 10, 2025)
1 Issue opened by 1 person
- Can I use remote endpoint? (#50, opened Mar 7, 2025)
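Issue #50 asks whether the plugin can talk to a remote endpoint. As a minimal sketch, assuming llama.vim's documented `g:llama_config` dictionary and its `endpoint` key (the exact default URL and port may differ across versions), pointing the plugin at a non-local llama-server looks like this:

```vim
" Point llama.vim at a remote llama-server instance instead of localhost.
" The host 192.168.1.10 below is a placeholder; substitute your server's
" address and the port your llama-server actually listens on.
let g:llama_config = {
    \ 'endpoint': 'http://192.168.1.10:8012/infill',
    \ }
```

The server itself would be started on that machine with `llama-server` and a FIM-capable model; the plugin then sends its infill requests to the configured URL.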
1 Unresolved conversation
Sometimes conversations happen on old items that aren't yet closed. Here is a list of all the issues and pull requests with unresolved conversations.
- Change endpoint port to the default llama-server port of 8080? (#49, commented on Mar 5, 2025, 0 new comments)