Describe the bug
I have Ollama running on a known-good host; I've been using it for multiple APIs for a while, with many models.
The Twinny extension is installed, with settings updated to point at the external Ollama API.
Completions via the chat window are working perfectly.
FIM returns nothing: no error, no timeout.
The Ollama logs show that no requests are being received.
To Reproduce
Steps to reproduce the behavior:
1. Install Twinny.
2. Check/update settings to ensure the model is correct and available.
3. Test copilot-style chat.
   3.1. Check the Ollama logs to confirm the API request arrives.
4. Test auto code complete. Even using the manual key binding, nothing happens; the loading symbol spins forever and never stops until I click it.
   4.1. Check the Ollama server for API requests. There are none; the request from step 3.1 is still the most recent.
Expected behavior
Code completions
Desktop (please complete the following information):
macOS 14.4 (client)
Ubuntu 22 (Ollama Host)
Ollama 0.1.28
Additional context
The Ollama Autocoder extension, which has features similar to what I suspect Twinny implements, works fine against the same server.
It seems the HTTP request is never reaching my Ollama server.
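One way to narrow this down is to send a FIM-style request to the Ollama host by hand, bypassing the extension entirely. A minimal sketch is below; the host URL and model name are placeholders for my setup, not anything Twinny-specific, and the `<PRE>`/`<SUF>`/`<MID>` tokens assume a CodeLlama-style fill-in-the-middle prompt format. If this request shows up in the Ollama logs but Twinny's FIM requests do not, the problem is on the extension/client side.

```shell
# Placeholders: substitute your own Ollama host and a FIM-capable model.
OLLAMA_HOST="http://my-ollama-host:11434"
MODEL="codellama:7b-code"

# Build a minimal /api/generate payload with a fill-in-the-middle prompt
# (CodeLlama-style <PRE>/<SUF>/<MID> infill tokens).
PAYLOAD=$(printf '{"model":"%s","prompt":"<PRE> def add(a, b): <SUF> <MID>","stream":false}' "$MODEL")
echo "$PAYLOAD"

# Uncomment to actually send the request; a JSON response with a
# "response" field means the server and model are reachable and working.
# curl -s "$OLLAMA_HOST/api/generate" -d "$PAYLOAD"
```

If the manual request works, the next thing to check is whether the extension is pointed at the same host and port as the chat feature, since chat requests clearly do arrive.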