
Error during inference: fetch failed #29

Open
cadeff01 opened this issue Jan 24, 2024 · 5 comments
cadeff01 commented Jan 24, 2024

I have an Ollama container running the stable-code:3b-code-q4_0 model. I'm able to interact with the model via curl:

curl -d '{"model":"stable-code:3b-code-q4_0", "prompt": "c++"}' https://notarealurl.io/api/generate

and get a response in a terminal in WSL, where I'm running VS Code:

[screenshot: successful curl response in the WSL terminal]

However, when I set the Ollama Server Endpoint to https://notarealurl.io/, I just get [warning] Error during inference: fetch failed
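A likely explanation for curl succeeding while the extension fails: curl reads the system trust store under /etc/ssl/certs, while Node.js (which runs VS Code's extension host) ships its own bundled Mozilla root store and ignores system certificates by default. A minimal sketch to inspect this from Node; nothing here is Llama Coder API, just plain Node built-ins:

```typescript
import * as tls from "node:tls";

// Node's compiled-in Mozilla root store -- a private/custom CA will not
// appear here, so TLS to a server signed by that CA fails with
// "fetch failed" even though curl (using the system store) succeeds.
console.log(`bundled root CAs: ${tls.rootCertificates.length}`);

// NODE_EXTRA_CA_CERTS extends the bundled store, but it is read once at
// process startup -- it must be in the environment of the process that
// launches VS Code's extension host, not just of a terminal inside it.
console.log(`NODE_EXTRA_CA_CERTS: ${process.env.NODE_EXTRA_CA_CERTS ?? "(not set)"}`);
```

Running this inside the extension host (e.g. from a debug console) would show whether the variable actually reached the process.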

cadeff01 (Author) commented Jan 25, 2024

My URL uses a custom CA, but I also have NODE_EXTRA_CA_CERTS=/etc/ssl/certs/ca-certificates.crt set, which to my understanding should address any SSL issues with the custom CA.
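One thing worth ruling out: if the file NODE_EXTRA_CA_CERTS points at is missing or unreadable, Node prints a startup warning and ignores the variable entirely. A quick sketch to confirm the bundle is readable and actually contains PEM certificates (`countPemCerts` is a hypothetical helper, not part of any extension):

```typescript
import * as fs from "node:fs";

// Hypothetical helper: count PEM certificates in the bundle that
// NODE_EXTRA_CA_CERTS points at. Zero means Node has nothing extra
// to trust, regardless of the environment variable being set.
function countPemCerts(path: string): number {
  if (!fs.existsSync(path)) return 0;
  const pem = fs.readFileSync(path, "utf8");
  return (pem.match(/-----BEGIN CERTIFICATE-----/g) ?? []).length;
}

// e.g. countPemCerts("/etc/ssl/certs/ca-certificates.crt") should be
// nonzero, and the bundle should include the custom CA itself.
```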

cadeff01 (Author) commented

I got some tests running locally and verified that this is related to the custom CA. Any chance of getting support for custom CA certs?

itamark-targa commented

I get the same warning in VS Code: #3 (comment). I'd appreciate any help with it.

Kevsnz (Contributor) commented Feb 5, 2024

VS Code, as the host, controls all connections that extensions open and use, so this isn't specific to Llama Coder.

Have you tried the solution from this Stack Overflow question?

cadeff01 (Author) commented Feb 5, 2024

I've tried the NODE_EXTRA_CA_CERTS solution, since disabling SSL verification is a really bad idea, but it didn't help. For similar plugins like Continue, I know they had to add something in the plugin itself to support extra certs for this to work.
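The kind of in-plugin change described above would look roughly like this sketch: the extension reads a CA bundle and passes an HTTPS agent to its HTTP client. `caCertPath` is a hypothetical setting name, and a node-fetch-style `agent` option is assumed (Node's built-in fetch does not accept one); none of this is existing Llama Coder code:

```typescript
import * as fs from "node:fs";
import * as https from "node:https";
import * as tls from "node:tls";

// Sketch: build an HTTPS agent that trusts one extra CA. `caCertPath`
// stands in for a hypothetical extension setting.
function makeAgent(caCertPath?: string): https.Agent {
  if (!caCertPath) return new https.Agent();
  const extraCa = fs.readFileSync(caCertPath, "utf8");
  // The `ca` option REPLACES Node's default trust store, so include the
  // bundled Mozilla roots too -- otherwise requests to public hosts break.
  return new https.Agent({ ca: [...tls.rootCertificates, extraCa] });
}

// With a node-fetch-style client, the agent is passed per request:
//   fetch(endpoint, { agent: makeAgent(settings.caCertPath) });
```

Appending the custom CA to `tls.rootCertificates` rather than passing it alone is the key design choice; it mirrors what NODE_EXTRA_CA_CERTS does at startup, but from inside the extension.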
