darwin808/parser
# Invoice Parser System

A unified invoice-parsing system with a Next.js frontend, an Express backend, and a Python LLM service.

## 🏗️ Architecture

```
┌─────────────────┐      ┌─────────────────┐      ┌─────────────────┐
│   Next.js       │─────▶│    Express      │─────▶│   Python LLM    │
│   Frontend      │      │    Backend      │      │   (FastAPI)     │
│   Port: 3000    │      │   Port: 3001    │      │   Port: 8000    │
└─────────────────┘      └─────────────────┘      └─────────────────┘
                                                           │
                                                           ▼
                                                   ┌─────────────────┐
                                                   │     Ollama      │
                                                   │   Port: 11434   │
                                                   └─────────────────┘
```
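The last hop of this chain can be sketched in Python. This is illustrative only, not the repository's code: `build_parse_request` and the prompt text are assumptions, while the request shape follows Ollama's `/api/generate` endpoint, which accepts base64-encoded images for multimodal models such as qwen2.5vl.

```python
import base64

OLLAMA_GENERATE_URL = "http://localhost:11434/api/generate"  # Ollama's generate endpoint

def build_parse_request(image_bytes: bytes, model: str = "qwen2.5vl:latest") -> dict:
    """Assemble a JSON body for a vision request to Ollama.

    /api/generate accepts base64-encoded images for vision models.
    The prompt here is a placeholder, not the repository's actual prompt.
    """
    return {
        "model": model,
        "prompt": "Extract the invoice fields (vendor, date, total) as JSON.",
        "images": [base64.b64encode(image_bytes).decode("ascii")],
        "stream": False,  # request a single JSON response, not a stream
    }
```

The llm-service would POST this body to `OLLAMA_GENERATE_URL` and return the model's answer to the Express backend.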

## 🚀 Quick Start

### Prerequisites

- Node.js 18+
- Python 3.9+
- Ollama installed and running

### 1. Complete Setup (First Time)

```bash
make setup
```

This will:

- Install all dependencies (npm + pip)
- Check if Ollama is running
- Set up environment variables

### 2. Move Your Projects

```bash
# Copy your existing projects into:
# - frontend/     (your Next.js app)
# - backend/      (your Express app)
# - llm-service/  (your Python FastAPI app)
```

### 3. Start Everything

```bash
make dev
```

That's it! All three services will start simultaneously.

## 📋 Available Commands

### Main Commands

| Command | Description |
| --- | --- |
| `make help` | Show all available commands |
| `make dev` | Start all services in dev mode 🚀 |
| `make status` | Check service status 📊 |
| `make restart` | Restart all services 🔄 |
| `make info` | Show project information ℹ️ |

### Development

| Command | Description |
| --- | --- |
| `make dev-next` | Start Next.js only |
| `make dev-express` | Start Express only |
| `make dev-python` | Start Python only |

### Installation

| Command | Description |
| --- | --- |
| `make install` | Install all dependencies |
| `make install-next` | Install Next.js dependencies |
| `make install-express` | Install Express dependencies |
| `make install-python` | Install Python dependencies |

### Build

| Command | Description |
| --- | --- |
| `make build` | Build all services |
| `make build-next` | Build Next.js |
| `make build-express` | Build Express |

### Cleanup

| Command | Description |
| --- | --- |
| `make clean` | Clean all build artifacts |
| `make clean-next` | Clean Next.js only |
| `make clean-express` | Clean Express only |
| `make clean-python` | Clean Python cache |

### Utilities

| Command | Description |
| --- | --- |
| `make ports-check` | Check if ports are available |
| `make ports-kill` | Kill processes on ports |
| `make logs` | Show logs from all services |
| `make test` | Run all tests |
| `make lint` | Lint all code |
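The port checks that `make ports-check` and `make status` perform can also be done directly. A minimal Python sketch (the `port_open` helper and the `SERVICES` table are illustrative, not part of the Makefile):

```python
import socket

# Ports taken from the architecture diagram above.
SERVICES = {
    "frontend (Next.js)": 3000,
    "backend (Express)": 3001,
    "llm-service (FastAPI)": 8000,
    "ollama": 11434,
}

def port_open(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for name, port in SERVICES.items():
        state = "up" if port_open("localhost", port) else "down"
        print(f"{name:24} :{port:<6} {state}")
```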

### Ollama

| Command | Description |
| --- | --- |
| `make ollama-check` | Check if Ollama is running |
| `make ollama-pull` | Pull the qwen2.5vl model |
| `make ollama-list` | List installed models |

## 🔧 Configuration

Edit the `.env` file:

```bash
# Frontend
NEXT_PUBLIC_API_URL=http://localhost:3001

# Backend
PORT=3001
LLM_SERVICE_URL=http://localhost:8000

# Python LLM
PORT=8000
OLLAMA_HOST=http://localhost:11434
MODEL_NAME=qwen2.5vl:latest
```

(The two `PORT` values belong to different services; the Python service's copy lives in `llm-service/.env`, as shown in the project structure.)
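On the Python side, these variables might be read as follows. This is a sketch with the `.env` values above as defaults; `load_llm_config` is a hypothetical helper, and the real service may load its configuration differently:

```python
import os
from typing import Optional

def load_llm_config(env: Optional[dict] = None) -> dict:
    """Collect llm-service settings, falling back to the defaults above.

    Pass a dict for testing; otherwise os.environ is used.
    """
    if env is None:
        env = dict(os.environ)
    return {
        "port": int(env.get("PORT", "8000")),
        "ollama_host": env.get("OLLAMA_HOST", "http://localhost:11434"),
        "model_name": env.get("MODEL_NAME", "qwen2.5vl:latest"),
    }
```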

πŸ“ Project Structure

```
invoice-parser-system/
├── Makefile              # All commands
├── package.json          # Root npm config
├── .env                  # Environment variables
├── README.md
│
├── frontend/             # Next.js app
│   ├── package.json
│   ├── next.config.js
│   └── src/
│
├── backend/              # Express API
│   ├── package.json
│   ├── server.js
│   └── src/
│
├── llm-service/          # Python FastAPI
│   ├── requirements.txt
│   ├── main.py
│   └── .env
│
└── shared/               # Shared code
    ├── types/
    └── utils/
```

πŸ› Troubleshooting

Ports Already in Use

make ports-kill

Services Not Starting

# Check status
make status

# Check ports
make ports-check

# Restart everything
make restart

Ollama Not Running

# Start Ollama
ollama serve

# Check if running
make ollama-check

Clean Install

make clean
make install
make dev

## 🎯 Workflow Examples

### Daily Development

```bash
# Morning - start working
make dev

# Check if everything is running
make status
```

### After Pulling New Code

```bash
make install
make dev
```

### Before Committing

```bash
make lint
make test
```

### Deployment

```bash
make build
make start
```

## 📊 Service URLs

| Service | URL |
| --- | --- |
| Frontend (Next.js) | http://localhost:3000 |
| Backend (Express) | http://localhost:3001 |
| LLM Service (FastAPI) | http://localhost:8000 |
| Ollama | http://localhost:11434 |

πŸ“ License

MIT
