OllamaChat


OllamaChat is a chat application that leverages Ollama and LangChain to provide intelligent conversations. It is built with Next.js and Tailwind CSS.

Features

  • Real-time conversations with Ollama models
  • User-friendly chat interface
  • Markdown support
  • Easy to set up and deploy
  • Open-source and extensible

Tech Stack

This project is built with:

  • Next.js
  • Tailwind CSS
  • LangChain
  • Ollama

Screenshots

Here's a showcase of OllamaChat's main features in action:

1. Model Switching

Easily switch between different Ollama models and experience their unique capabilities:


2. Text Chat

Engage in smooth text conversations with AI, supporting Markdown format display:


3. Image Chat

Upload images and discuss their content with AI, supporting multimodal conversations:


4. RAG (Retrieval-Augmented Generation)

Combine document retrieval with generation for more accurate and relevant responses:


Prerequisites

  • Node.js (version 20 or higher)
  • pnpm (or npm/yarn/bun)
  • Ollama installed and running locally. See ollama.com for installation instructions.
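
You can sanity-check these prerequisites from a terminal. The following is a minimal POSIX-shell sketch (not part of the project) that reports whether each required tool is on your `PATH`:

```shell
# Report which prerequisite tools are installed (a sketch, not project code).
status=""
for tool in node pnpm ollama; do
  if command -v "$tool" >/dev/null 2>&1; then
    status="$status $tool:ok"
  else
    status="$status $tool:missing"
  fi
done
echo "Prerequisites:$status"
```

Note that this only checks that the tools exist; confirm the Node.js version separately with `node --version` (20 or higher is required).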

Recommended Models

To get the best experience with OllamaChat, we recommend downloading the following models:

Chat Models

  • DeepSeek-R1: A family of open reasoning models with performance approaching that of leading models. Great for complex reasoning tasks.

    ollama pull deepseek-r1
  • Gemma 3: Currently the most capable model that runs on a single GPU, with vision support.

    ollama pull gemma3

RAG (Retrieval-Augmented Generation) Support

  • nomic-embed-text: Required for RAG functionality. This high-performing open embedding model has a large token context window and powers document indexing and retrieval.

    ollama pull nomic-embed-text

Note: The nomic-embed-text model is mandatory if you want to use the RAG features in OllamaChat.

For more models and detailed information, visit ollama.com/search to explore the full catalog of available models.
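
To fetch all three recommended models in one pass, you can loop over them. This sketch only prints the pull commands so you can review them first:

```shell
# Dry-run: print the pull commands for the recommended models.
# nomic-embed-text is required for the RAG features.
MODELS="deepseek-r1 gemma3 nomic-embed-text"
for model in $MODELS; do
  echo "ollama pull $model"
done
```

Pipe the output to `sh` (or remove the `echo`) to actually download the models.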

Installation

  1. Clone the repository:

    git clone https://github.com/Yesturnkey/OllamaChat.git
    cd OllamaChat
  2. Install the dependencies:

    pnpm install

Running the Development Server

To start the development server, run:

pnpm dev

Open http://localhost:3000 in your browser to see the result.

Available Scripts

In the package.json file, you will find the following scripts:

  • dev: Starts the development server with Next.js Turbopack.
  • build: Builds the application for production.
  • start: Starts a production server.
  • lint: Runs the Next.js linter to check for code errors.

Configuration

The application might require some environment variables. Create a .env.local file in the root of the project and add the necessary variables.

# Example .env.local
OLLAMA_BASE_URL=http://localhost:11434

(Add any other environment variables the application needs here.)
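
As a quick smoke test, you can verify that Ollama is reachable at the configured URL. This sketch uses Ollama's `/api/tags` endpoint, which lists the models available locally:

```shell
# Use the configured endpoint, falling back to Ollama's default port.
OLLAMA_BASE_URL="${OLLAMA_BASE_URL:-http://localhost:11434}"
# List locally available models; warn if Ollama is not running.
curl -s "$OLLAMA_BASE_URL/api/tags" || echo "Ollama is not reachable at $OLLAMA_BASE_URL"
```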

Contributing

Contributions are welcome! Please see the CONTRIBUTING.md file for details on how you can help.

Code of Conduct

This project adheres to a CODE_OF_CONDUCT.md. By participating, you are expected to uphold this code.

License

This project is licensed under the MIT License. See the LICENSE file for details.
