
mpaladium/pdf-chat-engine


Get up and running with Large Language Models quickly, locally, and even offline. This project aims to be the easiest way to get started with an LLM-based RAG bot as an interactive chat engine. You can upload PDF documents and then query them interactively.

  • Use cases
    • Upload private documents and interact with them via chat
    • Constrained environments where documents cannot be uploaded to public chat engines but still need quick analysis

Features ✨

  • Beautiful & intuitive UI: Inspired by ChatGPT for a familiar user experience.
  • Fully local: Stores chats in localStorage for convenience. No need to run a database.
  • Fully responsive: Use your phone to chat, with the same ease as on desktop.
  • Code syntax highlighting: Messages that include code are highlighted for easy reading.
  • Copy codeblocks easily: Copy the highlighted code with one click.
  • Download/Pull & Delete models: Easily download and delete models directly from the interface.
  • Switch between models: Switch between models fast with a click.
  • Chat history: Chats are saved locally and easily accessed.
  • Light & Dark mode: Switch between light & dark mode.
  • PDF upload & chat: Upload PDF docs, select one, and chat with it.

Requisites ⚙️

To use the web interface, these requisites must be met:

  1. Download Ollama and have it running, or run it in a Docker container. Check the docs for instructions.
  2. Node.js (18+) and npm are required. Download

Quick start with Docker

Uncomment the ollama code block in the docker compose file and run:

docker compose up 

You will see ollama, weaviate, and mongodb running as containers. Adjust the necessary paths in upsert.

In another terminal, run the Next.js UI/LLM service using either yarn or npm.

You can also change the default 8080 port if you wish.
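For orientation, here is a minimal sketch of what the compose services described above might look like. The service names, images, and ports are assumptions based on the description; the repository's actual docker compose file is authoritative.

```yaml
# Hypothetical sketch of the compose services described above.
# Keep (uncomment) the ollama block only if you are not running Ollama natively.
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"   # Ollama's default API port
  weaviate:
    image: semitechnologies/weaviate
    ports:
      - "8080:8080"     # change this mapping if 8080 is taken
  mongodb:
    image: mongo
    ports:
      - "27017:27017"
```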

Installation locally 📖

Install from source

1. Clone the repository to a directory on your PC via the command line:

git clone 

2. Open the folder:

cd pdf-chat-engine

3. Rename the .example.env to .env:

mv .example.env .env

4. If your instance of Ollama is NOT running on the default IP address and port, change the variable in the .env file to fit your use case:

OLLAMA_URL="http://localhost:11434"
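As an illustration of how the app might consume this variable, here is a small hedged sketch. The helper names are hypothetical (not from this repo); Ollama's HTTP API does expose text generation at POST /api/generate.

```typescript
// Hypothetical helpers (not part of the repo) showing how OLLAMA_URL
// could be resolved and turned into an API endpoint.
const DEFAULT_OLLAMA_URL = "http://localhost:11434";

function ollamaBaseUrl(
  env: Record<string, string | undefined> = process.env,
): string {
  // Fall back to the default local address; strip any trailing slash
  // so endpoint paths can be appended safely.
  return (env.OLLAMA_URL ?? DEFAULT_OLLAMA_URL).replace(/\/+$/, "");
}

function generateEndpoint(base: string): string {
  // Ollama serves text generation at POST /api/generate.
  return `${base}/api/generate`;
}
```

A client could then POST JSON such as `{ model, prompt, stream: false }` to the returned endpoint with `fetch`.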

5. Install dependencies:

npm install

6. Start the development server:

npm run dev

7. Go to localhost:3000 and start chatting with your favourite model!

Upcoming features

This is a to-do list consisting of upcoming features.

  • ✅ Voice input support
  • ✅ Code syntax highlighting
  • ✅ Ability to send an image in the prompt to utilize vision language models.
  • ✅ Ability to regenerate responses
  • ✅ Ability to chat with pdf docs and compare.
  • ✅ Part of document feature: List of uploaded docs, delete specific docs.
  • Authentication and Authorization
  • Feature extension priority: the ability to understand tables, images, and Excel files

Tech stack

NextJS - React Framework for the Web

TailwindCSS - Utility-first CSS framework

shadcn-ui - UI components built using Radix UI and Tailwind CSS

shadcn-chat - Chat components for NextJS/React projects

Framer Motion - Motion/animation library for React

Lucide Icons - Icon library

Helpful links

Medium Article - How to launch your own ChatGPT clone for free on Google Colab. By Bartek Lewicz.

Lobehub mention - Five Excellent Free Ollama WebUI Client Recommendations.

Weaviate vector database - guide to setting up the vector database

Credit and thanks to

The current code is referenced from jakobhoeg.

About

pdf-chat-engine is a local RAG based chat engine that leverages Large Language Models (LLMs) for interactive document management. Users can upload PDF documents and engage in chat, ensuring privacy and quick analysis without relying on public chat engines.
