
nao.ai 🧠 - Your Private AI Companion

(脑; pronounced /now • dot • ai/); nao means "brain" in Mandarin.

nao.ai is a sleek, intuitive web interface that lets you run and chat with powerful open-source language models directly on your own machine. Your conversations are private and happen entirely offline.

(Demo recording: nao-ai-demo)

✨ Features

  • 💬 Chat with any Model: Seamlessly interact with any language model supported by Ollama.
  • 🧠 Reasoning Insights: View the internal thought process of thinking/reasoning models.
  • 📑 RAG Document Support: Securely upload and chat with your documents (.pdf & .txt) using Retrieval-Augmented Generation.
  • 🔒 Privacy-Focused: All processing is done locally. Your data never leaves your machine.
  • 🪄 Easy Docker Setup: Get up and running with a single command using Docker Compose.

🆕 RAG Support

nao.ai now supports Retrieval-Augmented Generation (RAG) to chat with your documents:

  • Supported formats: PDF and TXT files (up to 15MB)
  • Upload documents: Click the 📎 attachment icon in the chat input
  • Contextual responses: Get answers grounded in your uploaded documents, helping reduce hallucinated responses.

🚀 Getting Started with Docker

This is the easiest and recommended way to get started.

Prerequisites

  • Docker and Docker Compose installed on your machine.

1. Download docker-compose.yml

Download the docker-compose.yml file from the repository. You can download it manually or use the following command:

mkdir nao.ai && cd nao.ai && curl -o docker-compose.yml https://raw.githubusercontent.com/wazeerc/nao.ai/main/docker-compose.yml

2. Create .env File

Create a file named .env in the same directory and add the following content (or use echo):

NUXT_PUBLIC_OLLAMA_MODEL="deepseek-r1:1.5b"
NUXT_PUBLIC_EMBEDDING_MODEL="nomic-embed-text"

NUXT_PUBLIC_OLLAMA_URL="http://localhost:11434"

COMPOSE_PROJECT_NAME=nao-ai
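
For example, you can create the file in one step from the shell (the values match those shown above):

cat > .env <<'EOF'
NUXT_PUBLIC_OLLAMA_MODEL="deepseek-r1:1.5b"
NUXT_PUBLIC_EMBEDDING_MODEL="nomic-embed-text"
NUXT_PUBLIC_OLLAMA_URL="http://localhost:11434"
COMPOSE_PROJECT_NAME=nao-ai
EOF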

Note

The model will be automatically downloaded the first time you start the application. You can find more models on the Ollama Library.

3. Run the Application

docker compose up -d

The application will be available at http://localhost:3000.
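
Since the model is downloaded on the first start, the initial launch can take a few minutes. You can follow the progress in the container logs:

docker compose logs -f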

4. Stop the Application

To stop the application, run:

docker compose down
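
If you also want to delete the downloaded models, remove the named volume as well (assuming the volume is managed by this Compose project, this clears the nao-ai_data volume referenced below):

docker compose down -v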

πŸ§‘β€πŸ’» Local Development Setup

For those who prefer to run the application without Docker.

Prerequisites

  • Node.js and pnpm installed.
  • Ollama installed on your machine.

1. Install Dependencies

pnpm install

2. Configure Your Models

Follow step 2 from the "Getting Started with Docker" section to create and configure your .env file.

3. Run Ollama

Make sure the Ollama server is running, then pull your desired models.

# Start the Ollama server in a separate terminal if it is not already running
ollama serve

# Pull the main model
ollama pull deepseek-r1:1.5b # Replace with your chosen model

# Pull the embedding model for RAG support
ollama pull nomic-embed-text
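
You can verify that both models were downloaded:

ollama list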

4. Start the Dev Server

pnpm dev

The development server will be available at http://localhost:3000.
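
For a production build without Docker, the standard Nuxt commands should work (a sketch, assuming the default Nuxt scripts in package.json):

# Build the app, then serve the generated Nitro output
pnpm build
node .output/server/index.mjs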

πŸ› οΈ Built With

📚 Additional Information

Available Models

You can use any model supported by Ollama. The model name in your .env file must exactly match the model name from Ollama's library.

Some recently tested models:

  • deepseek-r1:1.5b
  • llama3.2:1b
  • qwen2.5:0.5b
  • gemma3:1b
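
To switch models, update NUXT_PUBLIC_OLLAMA_MODEL in your .env file and restart the stack; Compose recreates the containers with the new value, and the new model is downloaded on first start as noted above:

docker compose up -d --force-recreate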

View Installed Models

To see which models are installed in your Docker volume:

echo 'Installed models: ' && docker run --rm -v nao-ai_data:/data alpine ls /data/models/manifests/registry.ollama.ai/library
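
Alternatively, if the stack is running, you can ask Ollama directly (an assumption here: the Ollama service is named ollama in docker-compose.yml):

docker compose exec ollama ollama list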

Environment Variables

  • NUXT_PUBLIC_OLLAMA_MODEL: Model name used by both frontend and Docker.
  • NUXT_PUBLIC_EMBEDDING_MODEL: Embedding model name used for RAG functionality.
  • NUXT_PUBLIC_OLLAMA_URL: Ollama server URL used by the frontend.
  • COMPOSE_PROJECT_NAME: The project name for Docker Compose.
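
To check that Ollama is reachable at the configured URL, you can query its version endpoint (a standard Ollama API route):

curl http://localhost:11434/api/version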

Made with ❤️, Ollama & LangChain
