A web-based chiptune sequencer with AI-assisted composition. Build catchy 8‑bit loops using Square, Triangle, and Pulse wave instruments, then let AI autocomplete melodies, arpeggios, and basslines. Save projects to Supabase and export your creations as MIDI.
- AI composition via Gemini 2.5 (server routes in Next.js)
- Interactive step sequencer with sustain-length per note
- Three classic instruments: Square (lead), Triangle (bass), Pulse (arp/counter)
- Variable-length grid (16–64 steps), tempo control, per‑instrument volume
- Project save/load/duplicate/delete with Supabase
- MIDI export using @tonejs/midi
- Next.js 15, React 19, TypeScript
- Tone.js for audio synthesis and transport
- @tonejs/midi for MIDI export
- Supabase JS for auth and Postgres
- Google Generative AI for composition (Gemini 2.5 Flash)
- Tailwind CSS 4 (PostCSS) for styling
- Node.js 18+ and npm
- Supabase project with a `projects` table (schema below)
- Google AI Studio API key (Gemini)
Create a `.env.local` file in the project root with:

```bash
NEXT_PUBLIC_SUPABASE_URL=your_supabase_url
NEXT_PUBLIC_SUPABASE_ANON_KEY=your_supabase_anon_key

# Either variable name is accepted by the code
GOOGLE_API_KEY=your_gemini_api_key
# or
GEMINI_API_KEY=your_gemini_api_key

# Optional: Python server endpoint used by /api/predict-next-note
PREDICT_URL=http://127.0.0.1:8000/predict
```
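The dual key names can be resolved with a simple fallback; a hypothetical sketch (the actual lookup lives inside the API routes):

```ts
// Hypothetical helper: accept either env var name for the Gemini key.
// This is a sketch only; the real resolution logic is in the API routes.
function getGeminiKey(): string {
  const key = process.env.GOOGLE_API_KEY ?? process.env.GEMINI_API_KEY;
  if (!key) {
    throw new Error("Set GOOGLE_API_KEY or GEMINI_API_KEY in .env.local");
  }
  return key;
}
```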
Install dependencies and start the dev server:

```bash
npm install
npm run dev
```

Then open http://localhost:3000.
- `npm run dev`: Start the Next.js dev server
- `npm run build`: Build for production
- `npm start`: Start the production server
- `npm run lint`: Run ESLint
Table: `projects`

```sql
create table if not exists projects (
  id uuid primary key default gen_random_uuid(),
  user_id uuid not null,
  name text not null,
  grid_data jsonb not null,      -- boolean[][][]
  duration_data jsonb,           -- number[][][] (optional, for sustain)
  bpm integer not null default 120,
  created_at timestamptz not null default now(),
  updated_at timestamptz not null default now()
);
```

TypeScript types are in `src/types/project.ts`.
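As an illustration only, a type mirroring the columns above might look like this (field names are assumed to match the SQL one-to-one; see `src/types/project.ts` for the authoritative definition):

```ts
// Sketch: assumes fields mirror the SQL columns above one-to-one.
export interface Project {
  id: string;                    // uuid
  user_id: string;               // uuid of the owning user
  name: string;
  grid_data: boolean[][][];      // note heads per instrument/row/step
  duration_data?: number[][][];  // optional sustain lengths per note
  bpm: number;                   // defaults to 120
  created_at: string;            // timestamptz as ISO string
  updated_at: string;
}
```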
- `src/pages/index.tsx`: Main sequencer UI with AI compose section and MIDI export.
- `src/hooks/useToneSequencer.ts`: Tone.js playback, transport, volumes, and scheduling.
- `src/hooks/useProjectManager.ts`: Save/Save As flow, project naming, wiring to Supabase utils.
- `utils/midiExport.ts`: Converts grid + durations to a downloadable `.mid` file.
- `utils/projects.ts`: CRUD wrappers for the Supabase `projects` table.
- `utils/supabase.ts`: Supabase client using env vars (see the sketch after this list).
- `src/pages/projects.tsx`: Manage existing projects (rename, duplicate, delete, open).
- `src/pages/login.tsx`: Auth UI using Supabase email/password.
- Components: `Header`, `ControlPanel`, `SequencerGrid`, `SaveModal`, `InstrumentSection`, `Block`.
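`utils/supabase.ts` presumably follows the standard `@supabase/supabase-js` pattern; a minimal sketch using the env vars above:

```ts
// utils/supabase.ts (sketch): one shared client built from env vars.
import { createClient } from "@supabase/supabase-js";

export const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!
);
```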
- `POST /api/gemini-compose-mdb` (see the example call after this list)
  - Body: `{ prompt: string, instruments: { index?: number, name: string, notes: string[] }[], steps?: number, startStep?: number, seed?: { events: { step: number, instrumentIdx: number, note: string, length: number }[] }, maxEvents?: number, stepQuant?: number, maxPolyphony?: number }`
  - Returns: `{ events: { relStep: number, instrumentIdx: number, note: string, length: number }[] }`
  - Uses `@google/generative-ai` and extracts strict JSON from the Gemini response.
- `POST /api/gemini-melody`
  - Body: `{ prompt, instruments, maxEvents?, stepQuant?, maxPolyphony? }`
  - Returns: `{ events: { relStep, instrumentIdx, note, length }[] }`
  - Uses structured JSON output with schema enforcement (alternative flow).
- `POST /api/predict-next-note`
  - Body: `{ input_ids: number[] }`
  - Proxies to a Python server at `PREDICT_URL` (see `src/ai/server/`).
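As an illustration of the compose route's contract (the prompt, instrument names, and note pools here are made up; only the field shapes come from the list above):

```ts
// Illustrative client call to the compose route. All request values are
// example data; the request/response shapes follow the API list above.
type ComposedEvent = {
  relStep: number;
  instrumentIdx: number;
  note: string;
  length: number;
};

async function composeLoop(prompt: string): Promise<ComposedEvent[]> {
  const res = await fetch("/api/gemini-compose-mdb", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      prompt,
      instruments: [
        { index: 0, name: "Square", notes: ["C4", "D4", "E4", "G4", "A4"] },
        { index: 1, name: "Triangle", notes: ["C2", "E2", "G2"] },
      ],
      steps: 32,
      maxEvents: 64,
    }),
  });
  if (!res.ok) throw new Error(`Compose failed: ${res.status}`);
  const { events } = await res.json();
  return events as ComposedEvent[];
}
```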
- Navigate to `/login` and create an account or sign in.
- On `/`:
  - Click grid cells to place note heads; drag to set sustain length.
  - Adjust BPM, grid steps, and instrument volumes.
  - Enter a style prompt and click “Generate Music” to autocomplete.
  - Export to MIDI at any time (see the export sketch after this list).
- Save projects; visit `/projects` to manage them.
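For context on the export step, here is a simplified sketch of the grid-to-MIDI idea with `@tonejs/midi`. The real converter lives in `utils/midiExport.ts` and also handles per-instrument tracks; the flattened `events` input and the sixteenth-note step length are assumptions for illustration:

```ts
// Sketch: turn flattened note events into .mid bytes.
// Assumes each grid step is a sixteenth note at the given BPM.
import { Midi } from "@tonejs/midi";

type NoteEvent = { step: number; note: string; length: number };

function toMidiBytes(events: NoteEvent[], bpm: number): Uint8Array {
  const midi = new Midi();
  midi.header.setTempo(bpm);
  const track = midi.addTrack();
  const stepSeconds = 60 / bpm / 4; // sixteenth-note duration in seconds
  for (const e of events) {
    track.addNote({
      name: e.note,                    // e.g. "C4"
      time: e.step * stepSeconds,      // onset in seconds
      duration: e.length * stepSeconds // sustain length in seconds
    });
  }
  return midi.toArray(); // bytes ready to download as a .mid file
}
```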
- Python server examples are in `src/ai/server/python_server_main.py` and `python_server_gpu.py`.
- The Next.js route `/api/predict-next-note` forwards requests to `PREDICT_URL`.
- Ensure browser audio starts after a user gesture; `useToneSequencer` handles `Tone.start()` (see the sketch after this list).
- `duration_data` is optional to support older saves; the UI will default sustain to 1.
- Instrument notes are constrained to chip‑style ranges defined in `index.tsx`.
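On the audio-unlock point above, the usual Tone.js pattern looks like this (a minimal sketch; the project's actual handling lives inside `useToneSequencer`, and the BPM value here is illustrative):

```ts
// Sketch: browsers only allow audio after a user gesture, so Tone.start()
// must be awaited inside something like a play-button click handler.
import * as Tone from "tone";

async function handlePlayClick() {
  await Tone.start();                  // resumes the AudioContext
  Tone.getTransport().bpm.value = 120; // in the app, tempo comes from project state
  Tone.getTransport().start();         // begin scheduled playback
}
```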
We used several AI tools throughout the project for development, debugging, and documentation:

- ChatGPT (OpenAI GPT-5): assisted with integrating the Gemini API, managing Supabase state, and structuring API routes in Next.js. Also used to improve this README’s clarity and formatting.
- Claude (Anthropic, via CLI): helped with code cleanup, debugging Tone.js playback scheduling, and refining TypeScript typings.
- Cursor AI: provided in-editor code completions for layout structure, CSS styling, and UI logic in React components.
- GitHub Copilot: supported repetitive scaffolding, prop management, and type inference across the project.

AI tools were used to assist in writing, refactoring, and documenting code, not to autonomously produce core functionality. All code, logic, and design decisions were reviewed, customized, and validated by our team.
Add screenshots to `public/` and reference them here.
MIT (update if different).