
Imagery

Local AI image generation powered by Ollama. Generate images from text prompts, use reference images for img2img editing, and browse your generation history — all running on your own hardware.

Features

  • Text-to-image with Z-Image Turbo (6B parameter model)
  • Image-to-image with FLUX.2 Klein — single and multi-reference editing
  • Streaming progress — real-time step-by-step progress via SSE
  • Generation history with full metadata (prompt, model, dimensions, seed)
  • Drag-and-drop — drag history images into the prompt as references
  • Fully local — no cloud APIs, no data leaves your machine
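The SSE progress stream can be consumed from any HTTP client. Below is a minimal sketch of parsing the `data:` lines; the event shape (`step`, `totalSteps`) is an assumption for illustration, not confirmed by this README:

```typescript
// Minimal parser for one SSE line from the /api/generate stream.
// The payload shape ({ step, totalSteps }) is an assumed example.
interface ProgressEvent {
  step: number;
  totalSteps: number;
}

function parseProgressLine(line: string): ProgressEvent | null {
  // SSE data lines look like `data: {...}`; comment/keep-alive lines start with `:`.
  if (!line.startsWith("data:")) return null;
  try {
    return JSON.parse(line.slice(5).trim()) as ProgressEvent;
  } catch {
    return null; // ignore malformed payloads
  }
}

// Usage sketch: POST the prompt, then read the response body line by line.
// const res = await fetch("http://localhost:3000/api/generate", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify({ prompt: "a red circle" }),
// });
```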

Tech Stack

  • Runtime: Bun
  • API: Hono
  • Frontend: React + Vite
  • Shared validation: Zod schemas
  • Database: PostgreSQL (via Docker)
  • Inference: Ollama (MLX on Apple Silicon)

Prerequisites

  • Bun (v1.0+)
  • Docker (for PostgreSQL, or bring your own)
  • macOS with Apple Silicon (Ollama image generation is currently macOS-only)

Ollama Setup

Ollama's image generation feature is experimental and uses the MLX framework on Apple Silicon. Getting it working requires the correct installation method.

Recommended: Install via Homebrew

brew install ollama

Homebrew builds mlx-c from source for your architecture, which avoids the libmlxc.dylib architecture mismatch that can occur with the .app or install script.

Known issue: The Ollama .app download and curl -fsSL https://ollama.com/install.sh | sh may ship an x86_64 libmlxc.dylib even on ARM Macs, causing failed to initialize MLX: libmlxc.dylib not found or incompatible architecture errors. If you hit this, uninstall and use Homebrew instead.
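To confirm whether an installed dylib matches your CPU, inspect it directly (the path below is the Homebrew default; adjust for other install locations):

```shell
# Print the dylib's architecture and the machine's; on an ARM Mac both should say arm64.
file /opt/homebrew/lib/libmlxc.dylib
uname -m
```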

If libmlxc.dylib is still not found after a Homebrew install

Homebrew installs libmlxc.dylib to /opt/homebrew/lib/, but Ollama's runner subprocess may not find it there. Fix by copying it next to the Ollama binary:

cp /opt/homebrew/lib/libmlxc.dylib "$(brew --prefix ollama)/bin/"

Then restart Ollama:

brew services restart ollama

Start Ollama and pull models

brew services start ollama

# Text-to-image (12GB, recommended to start)
ollama pull x/z-image-turbo

# Image-to-image editing (6GB, optional)
ollama pull x/flux2-klein

Verify it works:

ollama run x/z-image-turbo "a red circle"

You should see a progress bar. If you see an MLX error, revisit the troubleshooting steps above.

Multiple Ollama instances

If you previously installed Ollama via the .app and then via Homebrew, you may have two ollama serve processes running. The CLI connects to Homebrew's instance but port 11434 may be bound to the old one:

# Check for multiple instances (the bracket trick keeps grep from matching itself)
ps aux | grep "[o]llama serve"

# See which process actually owns port 11434
lsof -nP -iTCP:11434 -sTCP:LISTEN

# Kill the stale one (the /Applications/Ollama.app one)
kill <PID>

Quick Start

# Clone and install
git clone https://github.com/oddlantern/imagery.git
cd imagery
bun install

# Start PostgreSQL
cp .env.example .env
bun run db:up
bun run db:migrate

# Start dev servers
bun run dev

Open http://localhost:5173.

Docker (Production)

Run the full stack in containers (still requires Ollama on the host for GPU access):

cp .env.example .env
docker compose up --build

Open http://localhost:3000.

Update OLLAMA_URL in .env so the containerized API can reach your host Ollama:

Platform        OLLAMA_URL value
macOS           http://host.docker.internal:11434
Windows (WSL2)  http://host.docker.internal:11434
Linux           http://172.17.0.1:11434 (or use --network=host)
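For example, on macOS the .env entry would read:

```
OLLAMA_URL=http://host.docker.internal:11434
```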

Project Structure

imagery/
├── apps/
│   ├── api/          # Hono REST API
│   ├── web/          # React + Vite frontend
│   └── shared/       # Shared Zod schemas
├── docker-compose.yml
└── storage/images/   # Generated images on disk

API

Endpoint           Method  Description
/api/generate      POST    Generate image (SSE stream)
/api/history       GET     Paginated generation history
/api/images/:file  GET     Serve generated image
/api/health        GET     Health check
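Since the history endpoint is paginated, a client needs to build query strings for it. A small sketch of a URL helper follows; the parameter names (`page`, `limit`) are assumptions for illustration, not documented by this README:

```typescript
// Build a history request URL. The `page`/`limit` parameter names are assumed.
function historyUrl(base: string, page: number, limit: number): string {
  const url = new URL("/api/history", base);
  url.searchParams.set("page", String(page));
  url.searchParams.set("limit", String(limit));
  return url.toString();
}

// Usage (fetch left to the caller):
// const res = await fetch(historyUrl("http://localhost:3000", 1, 20));
```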

Scripts

Command             Description
bun run dev         Start API + frontend dev servers
bun run dev:api     Start API only
bun run dev:web     Start frontend only
bun run db:up       Start PostgreSQL container
bun run db:down     Stop PostgreSQL container
bun run db:migrate  Run database migrations
bun run test        Run all tests

License

MIT
