OllamaChat is a chat application that leverages the power of Ollama and LangChain to provide intelligent conversations. It is built with Next.js and Tailwind CSS.
- Real-time conversations with Ollama models
- User-friendly chat interface
- Markdown support
- Easy to set up and deploy
- Open-source and extensible
This project is built with the following modern technologies:
- Framework: Next.js
- UI Library: React
- Language: TypeScript
- Styling: Tailwind CSS
- UI Components: Shadcn/UI
- State Management: Redux Toolkit
- AI/LLM: LangChain & Ollama
Here's a showcase of OllamaChat's main features in action:
- Model selection: easily switch between different Ollama models and experience their unique capabilities.
- Text chat: engage in smooth text conversations with AI, with Markdown rendering.
- Image chat: upload images and discuss their content with AI in multimodal conversations.
- RAG: combine document retrieval with generation for more accurate and relevant responses.
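The retrieval-augmented flow above (retrieve relevant passages, then generate with them as context) can be sketched with a toy in-memory retriever. Note that the documents and vectors below are made up for illustration and merely stand in for real `nomic-embed-text` embeddings; this is not OllamaChat's actual implementation.

```typescript
// Illustrative retrieve-then-generate sketch: score hardcoded document
// vectors against a query vector and prepend the best match to the prompt.

type Doc = { text: string; embedding: number[] };

// Cosine similarity between two equal-length vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Pick the document most similar to the query embedding.
function retrieve(query: number[], docs: Doc[]): Doc {
  return docs.reduce((best, d) =>
    cosine(query, d.embedding) > cosine(query, best.embedding) ? d : best
  );
}

// Toy vectors stand in for nomic-embed-text output.
const docs: Doc[] = [
  { text: "Ollama runs models locally.", embedding: [1, 0, 0] },
  { text: "Next.js renders React on the server.", embedding: [0, 1, 0] },
];

const best = retrieve([0.9, 0.1, 0], docs);
const prompt = `Context: ${best.text}\n\nQuestion: How does Ollama work?`;
console.log(prompt);
```

In the real app, a LangChain vector store backed by `nomic-embed-text` plays the role of `retrieve`, and the assembled prompt is sent to the selected Ollama chat model.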
- Node.js (version 20 or higher)
- pnpm (or npm/yarn/bun)
- Ollama installed and running locally. See ollama.com for installation instructions.
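A quick way to confirm the prerequisites are in place, assuming Ollama is listening on its default port 11434:

```shell
# Check Node.js version (should report v20 or newer)
node --version

# Check whether the local Ollama server responds
if curl -sf http://localhost:11434/api/tags > /dev/null; then
  echo "Ollama is reachable"
else
  echo "Ollama is not reachable on localhost:11434"
fi
```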
To get the best experience with OllamaChat, we recommend downloading the following models:
- DeepSeek-R1: a family of open reasoning models with performance approaching that of leading models. Great for complex reasoning tasks.

  ```shell
  ollama pull deepseek-r1
  ```

- Gemma 3: the most capable model that runs on a single GPU, with vision capabilities.

  ```shell
  ollama pull gemma3
  ```

- nomic-embed-text: required for RAG functionality. This high-performing open embedding model with a large token context window is essential for document indexing and retrieval.

  ```shell
  ollama pull nomic-embed-text
  ```
Note: The `nomic-embed-text` model is mandatory if you want to use the RAG features in OllamaChat.
For more models and detailed information, visit ollama.com/search to explore the full catalog of available models.
- Clone the repository:

  ```shell
  git clone https://github.com/Yesturnkey/OllamaChat.git
  cd OllamaChat
  ```

- Install the dependencies:

  ```shell
  pnpm install
  ```
To start the development server, run:
```shell
pnpm dev
```

Open http://localhost:3000 in your browser to see the result.
In the package.json file, you will find the following scripts:
- `dev`: starts the development server with Next.js Turbopack.
- `build`: builds the application for production.
- `start`: starts a production server.
- `lint`: runs the Next.js linter to check for code errors.
The application might require some environment variables. Create a .env.local file in the root of the project and add the necessary variables.
```env
# Example .env.local
OLLAMA_BASE_URL=http://localhost:11434
```
Add any other environment variables your deployment requires in the same file.
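As an illustration of how such a variable might be consumed, here is a hypothetical helper (not necessarily how OllamaChat does it) that resolves the Ollama endpoint from an environment map and falls back to Ollama's default local address when `OLLAMA_BASE_URL` is unset or blank:

```typescript
// Hypothetical helper: resolve the Ollama endpoint from an env map,
// falling back to the default local address (port 11434).
function getOllamaBaseUrl(env: Record<string, string | undefined>): string {
  const url = env.OLLAMA_BASE_URL?.trim();
  return url ? url : "http://localhost:11434"; // Ollama's default port
}

// In a Next.js app this would typically be called as getOllamaBaseUrl(process.env).
console.log(getOllamaBaseUrl({}));                                          // → http://localhost:11434
console.log(getOllamaBaseUrl({ OLLAMA_BASE_URL: "http://gpu-box:11434" })); // → http://gpu-box:11434
```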
Contributions are welcome! Please see the CONTRIBUTING.md file for details on how you can help.
This project adheres to a CODE_OF_CONDUCT.md. By participating, you are expected to uphold this code.
This project is licensed under the MIT License. See the LICENSE file for details.