Ollama Code Assistant is a Visual Studio extension designed to enhance your coding experience by integrating AI capabilities. It allows you to interact with the Ollama API to get assistance with software development tasks, such as code completion, debugging tips, and more.
- Interactive Chat Interface: Communicate directly with the Ollama AI model through an integrated chat interface.
- Contextual Code Assistance: Provide contextual information from your current project to receive relevant AI-generated responses.
- Customizable Settings: Configure the extension with different Ollama API URLs and models.
- Microsoft Visual Studio 2022 (version 17.0 or later), Community edition or higher.
- Microsoft Edge WebView2 Runtime: Most users already have this installed via Windows Update, but if you encounter issues, you can download it from Microsoft's WebView2 page (https://developer.microsoft.com/en-us/microsoft-edge/webview2/); a quick way to check whether it is installed is shown below.
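If you are not sure whether the WebView2 Runtime is present, one way to check (on 64-bit Windows, following Microsoft's runtime-detection guidance) is to query the registry for the runtime's `pv` value; a non-empty version string means it is installed:

```
reg query "HKLM\SOFTWARE\WOW6432Node\Microsoft\EdgeUpdate\Clients\{F3017226-FE2A-4295-8BDF-00C3A9A7E4C5}" /v pv
```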
Since the extension is not yet available on the Visual Studio Marketplace, you will need to build and install it manually.
Clone this repository to your local machine using Git:

```
git clone https://github.com/RyanMathewson/OllamaCodeAssistant.git
```

Open the solution file `OllamaCodeAssistant.sln` in Microsoft Visual Studio.
Build the solution to compile the extension. You can do this by selecting Build > Build Solution from the menu or pressing Ctrl + Shift + B.
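If you prefer building outside the IDE, a rough equivalent from a Developer Command Prompt for VS 2022 might look like this (assuming the default `Debug` configuration):

```
msbuild OllamaCodeAssistant.sln /restore /p:Configuration=Debug
```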
You will need to have the https://pkgs.dev.azure.com/azure-public/vside/_packaging/vssdk/nuget/v3/index.json NuGet feed available in your NuGet sources. This is required for the Microsoft.VisualStudio.SDK package.
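If that feed is not already configured on your machine, one way to add it is with `dotnet nuget add source` (the source name `vssdk` below is arbitrary); you can also add it in Visual Studio under Tools > Options > NuGet Package Manager > Package Sources:

```
dotnet nuget add source https://pkgs.dev.azure.com/azure-public/vside/_packaging/vssdk/nuget/v3/index.json --name vssdk
```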
- Select `Debug > Start Debugging` (or press `F5`) to build and launch a new instance of Visual Studio with the extension installed.
- Alternatively, you can manually install the `.vsix` file generated in the `bin\Debug` directory (a command-line alternative is sketched after this list):
  - Navigate to the `OllamaCodeAssistant\bin\Debug` folder.
  - Locate the `OllamaCodeAssistant.vsix` file.
  - Double-click the `.vsix` file to open the Visual Studio Installer.
  - Follow the prompts to install the extension.
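As a sketch of that command-line alternative, `VSIXInstaller.exe` (which ships with Visual Studio) can install the package directly; the path below assumes a default Visual Studio 2022 Community installation and the `bin\Debug` output location mentioned above:

```
"%ProgramFiles%\Microsoft Visual Studio\2022\Community\Common7\IDE\VSIXInstaller.exe" OllamaCodeAssistant\bin\Debug\OllamaCodeAssistant.vsix
```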
- Open Visual Studio and load your project.
- Navigate to the `View` menu, then select `Ollama Code Assistant` to open the chat interface.
- Enter your requests or questions in the chat input box.
- Use the checkboxes to include context from your current selection, file, or all open files when sending prompts.
You can configure the extension settings by navigating to Tools > Options in Visual Studio:
- Ollama API URL: Specify the URL of the Ollama API you want to use.
- Model: Select or enter the model you wish to use for generating responses.
- Click `Refresh List` to update the list of available models if needed.
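For reference, a default local Ollama installation serves its API at http://localhost:11434. The commands below are a quick sketch for confirming the server is reachable and seeing which models you have locally (the model name `llama3` is only an example):

```
# Confirm the Ollama server is responding and list the models it knows about.
curl http://localhost:11434/api/tags

# Or use the Ollama CLI; pull a model first if none are listed.
ollama list
ollama pull llama3
```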