AI-powered hardware inventory management system. Take photos of your hardware, let AI identify and catalog it, then track market value over time with automated price lookups.
- AI-Powered Intake - Upload photos of hardware items. AI vision models (Ollama local or Gemini cloud) automatically identify the category, brand, model, condition, and technical attributes.
- Human-in-the-Loop Review - Review AI results before cataloging. Correct any mistakes and re-analyze with AI using your corrections as context.
- Automated Pricing - When items are cataloged, AI estimates current used market value and original MSRP. Calculates depreciation automatically.
- Scheduled Price Refresh - Celery Beat refreshes all inventory prices on a configurable schedule (default: weekly).
- Price History Tracking - Track how item values change over time with historical price data and portfolio value charts.
- Search & Filter - Full-text search across all fields, faceted sidebar filters (category, condition, brand), and sortable table columns.
- Pluggable AI Providers - Configure multiple AI providers (Ollama for local, Gemini for cloud with web-grounded search). Switch between them from the settings page.
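The depreciation feature above can be illustrated with a short sketch. Note this is a hypothetical helper, not code from the project: the app derives depreciation from the AI-estimated used value and original MSRP, but the exact formula used internally is an assumption here.

```python
def depreciation_pct(msrp: float, current_value: float) -> float:
    """Percent of original MSRP lost to depreciation.

    Hypothetical helper: assumes depreciation = (MSRP - used value) / MSRP.
    """
    if msrp <= 0:
        raise ValueError("MSRP must be positive")
    return round((msrp - current_value) / msrp * 100, 1)

# Example: a GPU with a $699 MSRP now estimated at ~$350 used
print(depreciation_pct(699.0, 350.0))  # → 49.9
```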
- Docker and Docker Compose
- An AI vision model:
  - Local: Ollama with a vision model (e.g., `llama3.2-vision`)
  - Cloud: Google Gemini API key (enables web-grounded pricing)
```bash
git clone https://github.com/yourusername/HardwareInventory.git
cd HardwareInventory
cp .env.example .env   # edit as needed
docker compose up -d --build
```

Open http://localhost:8080 in your browser.
- Go to Settings in the nav bar
- Add a provider:
  - Ollama (local): set the base URL to your Ollama instance (e.g., `http://host.docker.internal:11434`) and the model to a vision model like `llama3.2-vision`
  - Gemini (cloud): paste your API key and set the model name to `gemini-2.0-flash` (or similar)
- Check "Set as Active Provider"
- Click Save
- Intake - Upload photos of hardware items
- Review - Check AI analysis, correct if needed, click "Re-analyze with AI" for better results, then "Approve & Catalog"
- Inventory - Browse, search, filter, and sort your cataloged items. View/edit details and pricing. Click "Refresh Price" for manual price updates.
```
Frontend (Alpine.js + Tailwind)
        |
FastAPI (REST API)
        |
        ├── SQLAlchemy (SQLite)
        ├── Celery Worker (async AI processing + price lookups)
        ├── Celery Beat (scheduled price refresh)
        └── Redis (task broker)
```
| Service | Purpose |
|---|---|
| `web` | FastAPI app serving API + frontend (port 8080) |
| `celery_worker` | Async task processing (image analysis, price lookups) |
| `celery_beat` | Scheduled price refresh |
| `redis` | Celery message broker |
| Method | Endpoint | Purpose |
|---|---|---|
| GET | `/health` | Health check |
| POST | `/upload` | Upload images for AI analysis |
| GET | `/items/review` | Items awaiting review |
| GET | `/items/inventory` | Cataloged inventory |
| PUT | `/items/{id}/catalog` | Approve and catalog an item |
| POST | `/items/{id}/reanalyze` | Re-analyze with user corrections |
| POST | `/items/{id}/refresh-price` | Manual price refresh |
| POST | `/items/inventory/refresh-prices` | Refresh all prices |
| DELETE | `/items/{id}` | Delete an item |
| GET | `/categories` | List categories |
| POST | `/providers/` | Add AI provider |
| GET | `/providers/` | List AI providers |
| GET | `/items/{id}/history` | Price history for item |
| GET | `/portfolio/history` | Portfolio value over time |
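A stdlib-only sketch of calling these endpoints from Python. The endpoint paths come from the table above; the base URL, response shapes, and helper names are assumptions for illustration:

```python
import json
import urllib.request

BASE_URL = "http://localhost:8080"  # default web service port

def endpoint(template: str, **params) -> str:
    """Build a full URL from an endpoint template (hypothetical helper)."""
    return BASE_URL + template.format(**params)

def api_get(path: str) -> dict:
    """GET a JSON endpoint; response shape depends on the endpoint."""
    with urllib.request.urlopen(endpoint(path)) as resp:
        return json.load(resp)

def refresh_price(item_id: int) -> dict:
    """POST to trigger a manual price refresh for one item."""
    req = urllib.request.Request(
        endpoint("/items/{id}/refresh-price", id=item_id), method="POST"
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```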
All configuration is via environment variables. See `.env.example` for the full list.
| Variable | Default | Description |
|---|---|---|
| `DATABASE_URL` | `sqlite:////app/data/inventory.db` | Database connection string |
| `CELERY_BROKER_URL` | `redis://redis:6379/0` | Redis broker URL |
| `ALLOWED_ORIGINS` | `http://localhost:8080` | CORS allowed origins (comma-separated) |
| `AI_TIMEOUT` | `120` | AI request timeout in seconds |
| `OLLAMA_URL` | `http://host.docker.internal:11434` | Default Ollama URL |
| `PRICE_REFRESH_DAYS` | `7` | Skip items priced within this many days |
| `PRICE_REFRESH_HOUR` | `3` | Scheduled refresh hour (UTC) |
| `PRICE_REFRESH_DOW` | `sunday` | Scheduled refresh day of week |
```bash
cd backend
python -m venv venv
source venv/bin/activate   # or venv\Scripts\activate on Windows
pip install -r requirements.txt

# Start Redis (required for Celery)
docker run -d -p 6379:6379 redis:7-alpine

# Run the app
uvicorn backend.main:app --reload --port 8000

# In another terminal, start the Celery worker
celery -A backend.tasks.celery_app worker --loglevel=info
```

```
HardwareInventory/
├── backend/
│   ├── main.py            # FastAPI app and endpoints
│   ├── models.py          # SQLAlchemy ORM models
│   ├── schemas.py         # Pydantic request schemas
│   ├── database.py        # Database configuration
│   ├── ai_factory.py      # AI provider abstraction (Ollama, Gemini)
│   ├── pricing.py         # AI-powered price lookup
│   ├── tasks.py           # Celery tasks (image processing, pricing, scheduling)
│   └── requirements.txt   # Python dependencies
├── frontend/
│   └── index.html         # Single-page app (Alpine.js + Tailwind CSS)
├── data/sqlite/           # SQLite database (gitignored)
├── static/uploads/        # Uploaded images (gitignored)
├── docker-compose.yml     # Multi-service orchestration
├── Dockerfile             # Container image
├── .env.example           # Environment variable template
└── LICENSE                # MIT License
```
This application is designed for personal/local use. If you plan to expose it to the internet, be aware of the following:
- No authentication - All endpoints are open. Add authentication (e.g., OAuth, API keys) before exposing publicly.
- API keys in database - AI provider API keys are stored in plaintext in SQLite. For production, consider using a secrets manager or environment variables instead.
- File uploads - Uploads are validated by extension and size (20MB max). Filenames are replaced with UUIDs to prevent path traversal.
- CORS - Restricted to configured origins by default. Update `ALLOWED_ORIGINS` in your `.env` file.
- Security headers - `X-Content-Type-Options`, `X-Frame-Options`, and `Referrer-Policy` headers are set on all responses.
- No rate limiting - Add a reverse proxy (nginx, Caddy) with rate limiting if exposing to the internet.