RAG Chat Application

A Retrieval-Augmented Generation (RAG) chat application for document Q&A, built on IBM Granite (served via Ollama) and in-memory vector search.

Prerequisites

  • Deno installed
  • Ollama installed and running, with the ibm/granite4:3b model pulled (used for both chat and embeddings); a quick availability check is sketched below
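
Before building, you can confirm that Ollama is reachable on its default local port (11434) and that the model has been pulled. The sketch below is a minimal standalone check; the file name and wording of the output are illustrative, not part of this project.

  // check_ollama.ts (illustrative name): verify Ollama is up and the model is pulled.
  // Run with: deno run --allow-net check_ollama.ts
  const MODEL = "ibm/granite4:3b";

  const res = await fetch("http://localhost:11434/api/tags").catch(() => null);
  if (!res || !res.ok) {
    console.error("Could not reach Ollama at http://localhost:11434; is it running?");
    Deno.exit(1);
  }

  const { models } = await res.json() as { models: { name: string }[] };
  const found = models.some((m) => m.name.startsWith(MODEL));
  console.log(
    found
      ? `${MODEL} is available.`
      : `${MODEL} not found; pull it with "ollama pull ${MODEL}".`,
  );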

Setup

  1. Place your source documents (.txt, .md, or .pdf files) in data/documents/
  2. Build embeddings: deno task build
  3. Start the server: deno task dev
  4. Open http://localhost:8000 in your browser

Commands

  • deno task build - Process documents and generate embeddings (core embedding call sketched below)
  • deno task dev - Run the development server in watch mode
  • deno task start - Run the production server
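
The actual build implementation lives in /scripts; its core step is to request an embedding from Ollama for each chunk and persist the result for the in-memory index. The sketch below is a hypothetical, simplified version of that step using Ollama's /api/embeddings endpoint; the output path and JSON shape are assumptions, not the project's actual format.

  // Hypothetical, simplified core of the build step: embed chunks and save them.
  // The output path and JSON shape are illustrative, not the project's actual format.
  // Run with: deno run --allow-net --allow-write <file>
  interface EmbeddedChunk {
    text: string;
    embedding: number[];
  }

  async function embed(text: string): Promise<number[]> {
    // Ollama's embeddings endpoint, using the model from the prerequisites.
    const res = await fetch("http://localhost:11434/api/embeddings", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model: "ibm/granite4:3b", prompt: text }),
    });
    const { embedding } = await res.json() as { embedding: number[] };
    return embedding;
  }

  // Example: embed a couple of chunks and persist them for later retrieval.
  const chunks = ["First chunk of a document...", "Second chunk..."];
  const embedded: EmbeddedChunk[] = [];
  for (const text of chunks) {
    embedded.push({ text, embedding: await embed(text) });
  }
  await Deno.writeTextFile("data/processed/embeddings.json", JSON.stringify(embedded));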

Architecture

This RAG application consists of:

  • Vector Search: In-memory cosine similarity search for document retrieval (sketched after this list)
  • LLM Integration: IBM Granite 4 (3B) via Ollama for answer generation
  • Embeddings: IBM Granite 4 (3B) for semantic search
  • Frontend: HTMX-based chat interface with Tailwind CSS
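
Because the index lives in memory, retrieval reduces to cosine similarity over the stored vectors, and answer generation is a single call to Ollama's /api/generate endpoint. The following is a condensed sketch of that flow, not the project's actual service code; function names and the prompt template are illustrative, while the k = 5 default mirrors the notes further down.

  // Condensed sketch of retrieval + generation; names and prompt are illustrative.
  interface EmbeddedChunk {
    text: string;
    embedding: number[];
  }

  function cosineSimilarity(a: number[], b: number[]): number {
    let dot = 0, normA = 0, normB = 0;
    for (let i = 0; i < a.length; i++) {
      dot += a[i] * b[i];
      normA += a[i] * a[i];
      normB += b[i] * b[i];
    }
    return dot / (Math.sqrt(normA) * Math.sqrt(normB));
  }

  // Score every chunk against the query embedding and keep the top k (5 by default).
  function topK(query: number[], chunks: EmbeddedChunk[], k = 5): EmbeddedChunk[] {
    return chunks
      .map((chunk) => ({ chunk, score: cosineSimilarity(query, chunk.embedding) }))
      .sort((a, b) => b.score - a.score)
      .slice(0, k)
      .map(({ chunk }) => chunk);
  }

  // Ask the model to answer from the retrieved context only.
  async function answer(question: string, context: EmbeddedChunk[]): Promise<string> {
    const prompt = `Answer the question using only the context below.\n\n` +
      `Context:\n${context.map((c) => c.text).join("\n---\n")}\n\n` +
      `Question: ${question}`;
    const res = await fetch("http://localhost:11434/api/generate", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model: "ibm/granite4:3b", prompt, stream: false }),
    });
    const { response } = await res.json() as { response: string };
    return response;
  }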

Project Structure

/project-root
  /data
    /documents             # Place source documents here
    /processed             # Generated embeddings (created by build script)
  /src
    /server                # HTTP server and routes
    /services              # Core business logic
    /lib                   # Utility functions
  /scripts                 # Build scripts
  /public                  # Frontend assets

Notes

  • The build script processes .txt, .md, and .pdf files
  • Default chunk size: 500 words with a 100-word overlap (see the chunking sketch below)
  • The top 5 retrieved chunks are used as context for each answer
  • Ensure Ollama is running before starting the application
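
For reference, word-based chunking with these defaults fits in a few lines. This is a generic sketch of the technique, not necessarily the project's exact chunker.

  // Generic word-based chunker matching the defaults above; names are illustrative.
  function chunkWords(text: string, chunkSize = 500, overlap = 100): string[] {
    const words = text.split(/\s+/).filter((w) => w.length > 0);
    const chunks: string[] = [];
    // Step forward by (chunkSize - overlap) so consecutive chunks share `overlap` words.
    for (let start = 0; start < words.length; start += chunkSize - overlap) {
      chunks.push(words.slice(start, start + chunkSize).join(" "));
      if (start + chunkSize >= words.length) break; // last window reached the end
    }
    return chunks;
  }

  // Example: a 1,000-word document yields chunks starting at words 0, 400, and 800.
  const doc = Array.from({ length: 1000 }, (_, i) => `w${i}`).join(" ");
  console.log(chunkWords(doc).length); // 3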
