Post-installation problem with model settings #25

@Zzegfried

Description

I'm using Visual Studio 2026 with the latest Ollama and the Gemma4:e4b model on Windows 11 Pro. After installing the LocalPilot extension, the chat repeatedly failed with:

"[LocalPilot Error] Could not reach Ollama: Response status code does not indicate success: 404 (Not Found)."

After a lot of digging, I found that the logs showed the extension trying to connect to a llama model, even though I don't have one installed and even though the LocalPilot settings dialog showed Gemma4 configured for all interactions. Chat only started working after I explicitly re-saved the settings. It seems to me that, on installation, the extension should default to whatever model the local Ollama instance actually has available, rather than always assuming llama.
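For what it's worth, instead of shipping with a hardcoded model name, the extension could ask Ollama which models are actually installed via its `/api/tags` endpoint and pick the first one. A minimal Python sketch of that idea (the endpoint is Ollama's real listing API; the function names, the fallback behavior, and the sample payload are my own illustration, not LocalPilot's code):

```python
import json
import urllib.request


def first_installed_model(tags: dict):
    """Pick the first model name from an Ollama /api/tags payload.

    Returns None when no models are installed, so the caller can
    prompt the user instead of guessing a name that will 404.
    """
    models = [m["name"] for m in tags.get("models", [])]
    return models[0] if models else None


def detect_default_model(base_url: str = "http://localhost:11434"):
    """Query a running Ollama server for its installed models."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return first_installed_model(json.load(resp))


# Example payload shaped like Ollama's /api/tags response (hypothetical data):
sample = {"models": [{"name": "gemma4:e4b"}, {"name": "qwen2.5:7b"}]}
print(first_installed_model(sample))
```

With something like this, a fresh install would land on the user's actual model (Gemma4 in my case) and only need a saved setting to override it.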

BTW, thank you for sharing this tool. 👍
