The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
🌐🦙 Ollama as your translator, with DeepLX-compatible API.
Implements a simple REPL chat with a locally running instance of Ollama.
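A REPL chat like the one described can be sketched in a few lines against Ollama's `/api/chat` endpoint. This is a minimal, hedged sketch: the server address (`http://localhost:11434`) is Ollama's default, but the model name `llama3` and the `/bye` exit command are illustrative assumptions.

```typescript
// Minimal REPL chat against a locally running Ollama server.
// Endpoint and body shape follow the Ollama REST API (/api/chat);
// the model name "llama3" is an example, not a requirement.
import * as readline from "node:readline/promises";

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// Pure helper: build the JSON body for POST /api/chat, appending the
// user's new input to the running conversation history.
export function buildChatRequest(model: string, history: ChatMessage[], userInput: string) {
  const messages = [...history, { role: "user" as const, content: userInput }];
  return { model, messages, stream: false };
}

export async function main(model = "llama3") {
  const rl = readline.createInterface({ input: process.stdin, output: process.stdout });
  const history: ChatMessage[] = [];
  for (;;) {
    const line = await rl.question("> ");
    if (line.trim() === "/bye") break; // illustrative exit command
    const res = await fetch("http://localhost:11434/api/chat", {
      method: "POST",
      body: JSON.stringify(buildChatRequest(model, history, line)),
    });
    const data = (await res.json()) as { message: ChatMessage };
    history.push({ role: "user", content: line }, data.message);
    console.log(data.message.content);
  }
  rl.close();
}

// Start the REPL only when explicitly requested, so the module stays importable.
if (process.env.RUN_REPL) main();
```

Keeping the full `messages` history in each request is what gives the REPL conversational memory; Ollama itself is stateless between calls.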
A full-stack web chatbot application integrated with Ollama
Buni is a TypeScript-based client API for Ollama, designed to be simple yet flexible.
A Next.js chatbot interface integrated with the Ollama API. Features real-time AI responses, markdown formatting with syntax highlighting, and error handling for seamless user interaction.
An AI-powered desktop app for effortless photo organization.
Ollama chat web UI - an AI chatbot made with React, Vite, NestJS, Tailwind CSS, shadcn/ui, and more.
Predictive Prompt is a simple Large Language Model (LLM) chat window with retro styling. It dynamically populates a dropdown with available models from a local instance of Ollama and uses the streaming API to generate and display results in real time. The output is rendered as markdown with syntax-highlighting support.
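The streaming behavior several of these UIs rely on works the same way: Ollama's streaming endpoints return newline-delimited JSON, one object per line, each carrying a partial message and a `done` flag. A minimal sketch of extracting the text from a raw chunk (buffering of lines split across network chunks is deliberately omitted):

```typescript
// Each line of an Ollama /api/chat streaming response is a JSON object like
// {"message":{"content":"..."},"done":false}; concatenating the content
// fields reconstructs the reply incrementally.
type StreamLine = { message?: { content: string }; done?: boolean };

export function extractStreamText(chunk: string): string {
  return chunk
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => (JSON.parse(line) as StreamLine).message?.content ?? "")
    .join("");
}
```

In a chat UI, feeding each extracted fragment to the markdown renderer as it arrives is what produces the real-time typing effect.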
The Jax WebUi project provides a web-based interface for connecting to your local Ollama server and holding chat conversations. It uses the Ollama API to mediate between users and the server, offering an alternative way to interact with your models.
A simple yet effective CLI application built on Node.js that uses the Ollama LLaVA vision model to automatically generate captions for your images.
Module to integrate LLMs into any workflow.
A chatbot built with the Vercel AI toolkit and Ollama.
Generate alternate text for an image locally on your machine.
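Local alt-text and captioning tools like the two above typically go through Ollama's `/api/generate` endpoint, which accepts base64-encoded images for multimodal models such as LLaVA. A hedged sketch, where the prompt wording and the `llava` model name are assumptions while the endpoint and body shape follow the Ollama REST API:

```typescript
// Build the /api/generate request body for a multimodal model: the image is
// passed as a base64 string in the `images` array alongside a text prompt.
export function buildAltTextRequest(imageBytes: Uint8Array, model = "llava") {
  return {
    model,
    prompt: "Describe this image in one concise sentence suitable as alt text.",
    images: [Buffer.from(imageBytes).toString("base64")],
    stream: false,
  };
}

// Read an image file and ask the local Ollama server for a one-line description.
export async function altTextFor(path: string): Promise<string> {
  const { readFile } = await import("node:fs/promises");
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    body: JSON.stringify(buildAltTextRequest(await readFile(path))),
  });
  const data = (await res.json()) as { response: string };
  return data.response.trim();
}
```

Because the image never leaves the machine, this keeps the "100% private" property the listed tools advertise.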
PDF-to-text converter for chatting with an AI (ChatGPT, Ollama).