Senzo13/JoyBoy

JoyBoy - Local AI Harness, Workstation, and Image Editor

JoyBoy is a local AI harness and workstation: a private ChatGPT / Grok-style chat app, a local AI image editor, an Ollama-assisted image generation workspace, an SDXL inpainting UI, a CivitAI model import manager, a local addons runtime, and a Codex-style project mode (in development).


Run AI chat, image workflows, model management, and local creative tools on your own machine. JoyBoy is built for people who want an open source ChatGPT alternative, an offline AI assistant, a local Stable Diffusion / SDXL interface, and a privacy-focused AI harness without relying on a cloud account.

JoyBoy especially targets long-tail local AI workflows: Ollama-assisted image generation routing, SDXL inpainting, CivitAI model imports, and 8 GB VRAM Stable Diffusion / SDXL work on consumer hardware.

Preview

  • Local chat and runtime: local chat with Ollama model selector and runtime meters
  • Image edit result: image editing result with original and modified previews
  • Edit mode: brush, mask controls, quick prompts, and model picker
  • Before/after viewer: side-by-side before and after comparison

Features

  • Private local AI chat with Ollama UI controls and local model routing.
  • Local AI harness for routing prompts, tools, jobs, models, runtime state, and optional extensions from one app.
  • Local AI workstation that keeps chat, image generation, image editing, video tests, gallery, model imports, and runtime panels together.
  • Text-to-image generation with local image models, Ollama-assisted routing, and provider imports.
  • Local AI image editor / SDXL inpainting UI for background edits, clothing edits, lighting, brush masks, expand/outpaint, and detail fixes.
  • Video experiments for local image-to-video workflows on consumer GPUs.
  • CivitAI model imports and Hugging Face imports with local runtime profiles and 8 GB VRAM-aware Stable Diffusion / SDXL defaults.
  • Local addons / packs that can extend routing rules, prompt assets, model sources, and UI surfaces without polluting the public core.
  • Gallery and metadata for generated images/videos, prompts, models, and local artifacts.
  • Doctor and runtime panels for VRAM/RAM state, loaded models, provider keys, and machine readiness.
  • Project mode in development for Codex / Claude Code-style workspace-aware assistance and terminal tools.

Why JoyBoy

JoyBoy is designed for local AI users who care about privacy, control, and hardware limits.

  • Zero cloud by default: chats, outputs, provider secrets, and optional packs stay on your computer.
  • One local app: chat, image generation, video tests, model picker, gallery, local packs, and runtime status live together.
  • Harness mindset: JoyBoy coordinates models, jobs, tools, providers, and packs instead of leaving each workflow as a separate script.
  • Consumer GPU friendly: profiles target real machines, including 8 GB VRAM setups.
  • Open source core: the public repository ships the neutral local AI workstation; optional packs remain separate.
  • Extensible by design: addons can add workflows without turning the core app into a private monolith.

Use Cases

  • Run a local ChatGPT-like or Grok-like assistant with Ollama.
  • Use a local AI harness to coordinate chat, tools, model routing, and creative jobs.
  • Use JoyBoy as a local AI workstation for chat, image generation, image editing, runtime jobs, and model management.
  • Generate images locally with SDXL, Flux-style workflows, Ollama-assisted routing, and imported checkpoints.
  • Edit photos in a local AI image editor with SDXL inpainting, brush masks, background changes, lighting changes, and outpainting.
  • Test local image-to-video workflows without a hosted AI platform.
  • Manage Hugging Face and CivitAI model imports from a local UI.
  • Run 8 GB VRAM Stable Diffusion / SDXL workflows with profiles designed for consumer GPUs.
  • Build local addons for custom routing, prompts, model presets, and creator workflows.
  • Experiment with a local Codex-style dev assistant that can understand a project workspace.

Quick Start

Clone the repository, then run the launcher for your platform.

Windows

Double-click start_windows.bat or run:

start_windows.bat

macOS

./start_mac.command

Linux

./start_linux.sh

Then open:

http://127.0.0.1:7860

On first launch, JoyBoy runs onboarding, detects your machine profile, and shows a Doctor report if something is missing. The launchers include a first-time setup/repair path and a fast start path.

The first inpaint, text-to-image, or video run can take longer than the next ones. JoyBoy may need to download or prepare missing runtime assets such as segmentation checkpoints, SCHP human parsing files, ControlNet helpers, preview VAEs, or video components. The generation card shows setup/download progress while this happens; once cached locally, later generations reuse those assets.

If you have an NVIDIA GPU but JoyBoy logs 0.0GB VRAM or a torch ... +cpu build, run the Windows launcher and choose "Setup complet". That repairs the local virtual environment and reinstalls PyTorch with CUDA support.
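If you are unsure whether the app actually came up, a quick readiness probe can help. This is an illustrative Python sketch, not part of JoyBoy itself, and it assumes the default 127.0.0.1:7860 address shown above:

```python
import socket

def joyboy_running(host="127.0.0.1", port=7860, timeout=1.0):
    """Return True if something is listening on the JoyBoy address."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(joyboy_running())  # False until a launcher has started the app
```

The same check works for any local service; pass a different port to probe, for example, a non-default Ollama endpoint.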

Local Secrets

Provider keys are optional and stay local:

  • HF_TOKEN
  • CIVITAI_API_KEY
  • OLLAMA_BASE_URL

Set them through environment variables, a local .env, or the JoyBoy settings UI. UI-managed secrets are stored outside git in:

~/.joyboy/config.json

The public repo ships only placeholders such as HF_TOKEN= and CIVITAI_API_KEY=. Provider keys are needed only for downloads that require them, for example gated Hugging Face models or CivitAI model imports. If you use local models only, you can start without keys and add them later in the UI.
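As an illustration of the lookup order (environment variable first, then the local config file), here is a minimal Python sketch. The `get_secret` helper and the flat-JSON shape of `~/.joyboy/config.json` are assumptions made for the example, not JoyBoy's actual implementation:

```python
import json
import os
from pathlib import Path

CONFIG_PATH = Path.home() / ".joyboy" / "config.json"  # UI-managed secrets live here

def get_secret(name):
    """Resolve a provider key: environment variable first, then config.json."""
    value = os.environ.get(name)
    if value:
        return value
    if CONFIG_PATH.exists():
        # assumes a flat {"KEY": "value"} layout; the real schema may differ
        return json.loads(CONFIG_PATH.read_text()).get(name)
    return None

os.environ["OLLAMA_BASE_URL"] = "http://127.0.0.1:11434"  # Ollama's default endpoint
print(get_secret("OLLAMA_BASE_URL"))  # → http://127.0.0.1:11434
```

Because the environment wins, a shell `export` or a `.env` entry overrides whatever the settings UI has stored.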

Public Core + Local Packs

JoyBoy separates the open source core from optional local extensions.

The public core includes orchestration, routing, onboarding, Doctor checks, model/provider import flows, gallery UI, runtime storage, and pack validation.

Local packs live in:

~/.joyboy/packs/<pack_id>/

Some optional local packs may target mature or adult workflows where they are legal, consensual, and compliant with platform policies. These packs are not part of the public core.

See Local Packs, Addons, and Third-Party Packs for the pack contract.
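For illustration only (the pack contract itself is defined in the docs above), discovering installed packs amounts to listing the directories under the packs root. This Python sketch is not JoyBoy code, and it assumes one directory per pack_id:

```python
from pathlib import Path

PACKS_DIR = Path.home() / ".joyboy" / "packs"

def list_local_packs():
    """Return the pack_id of each directory under the local packs root."""
    if not PACKS_DIR.is_dir():
        return []
    return sorted(p.name for p in PACKS_DIR.iterdir() if p.is_dir())

print(list_local_packs())  # e.g. [] on a fresh install
```

Removing a pack is equally local: deleting its directory removes it from the machine without touching the public core.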

Documentation

Contributing

Start with CONTRIBUTING.md, CODE_OF_CONDUCT.md, ROADMAP.md, and docs/GOOD_FIRST_ISSUES.md.

Good early contributions include docs, Doctor checks, UI polish, model import UX, tests around local packs, and release hygiene. Browse open good first issue tasks if you want a contained first PR.

License

Apache License 2.0. See LICENSE.
