Run a local instance of a code-specialized model via Ollama and connect to your VSCode for code completion / generation.
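The setup described above can be sketched with the standard Ollama CLI. This is a minimal illustration, assuming Ollama is installed and the `codellama` model tag is used; the exact VSCode extension and endpoint configuration depend on your editor plugin:

```shell
# Pull the code-specialized model from the Ollama registry
ollama pull codellama

# Start the local Ollama API server (listens on http://localhost:11434 by default)
ollama serve &

# Quick sanity check from the terminal
ollama run codellama "Write a Python function that reverses a string"

# A VSCode completion extension can then be pointed at the local
# generate endpoint, e.g.:
curl http://localhost:11434/api/generate -d '{
  "model": "codellama",
  "prompt": "def fib(n):",
  "stream": false
}'
```

In VSCode, an extension that supports custom OpenAI-compatible or Ollama backends can be configured with `http://localhost:11434` as the base URL so completions are served entirely from the local model.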
Updated Jul 9, 2024 · Dockerfile