feat: AI-first submission, shader detail pages, and review system (#15)
Merged
- Add `/shaders/$name` detail page with live Three.js preview, interactive uniform controls, recipe code blocks, GLSL source display, and provenance
- Make shader cards clickable links to detail pages with star ratings
- Replace the manual 30-field submission form with an AI-first wizard: paste GLSL or a URL (Shadertoy, GitHub gist/file), AI extracts all metadata via the Vercel AI SDK, the user reviews and submits
- Add SQLite-backed review system with a POST API for SDK/MCP/agent feedback
- Display reviews and ratings on detail pages and shader cards

Closes #14

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
- Dynamic import('three') so the 725KB module only loads when a shader
preview is actually viewed, not on every page
- IntersectionObserver pauses requestAnimationFrame when canvas is
off-screen (saves GPU cycles when scrolled past)
- Full disposal of geometry, materials, textures, and renderer on unmount
- Reduced sphere geometry from 64x64 to 32x32 segments
- Loading spinner while Three.js loads
- powerPreference: 'high-performance' GPU hint
- Raise Vite chunkSizeWarningLimit to 800KB (Three.js is inherently large)
ShaderPreviewCanvas chunk: 562KB -> 75KB (Three.js loads separately)
Addresses #16
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
TanStack Start v1.166 uses `.inputValidator()`, not `.validator()`. The wrong API caused "createServerFn(...).validator is not a function" at runtime, breaking the detail page and submit page.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
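A minimal sketch of the corrected pattern. `validateShaderName` is a hypothetical validator; the `createServerFn` wiring is shown in comments since it depends on the TanStack Start runtime:

```typescript
// Hypothetical input validator for a detail-page server function.
function validateShaderName(input: unknown): { name: string } {
  if (
    typeof input !== 'object' ||
    input === null ||
    typeof (input as { name?: unknown }).name !== 'string'
  ) {
    throw new Error('expected { name: string }');
  }
  return input as { name: string };
}

// Wired into TanStack Start v1.166+ (note .inputValidator, not .validator):
//
//   const getShaderDetail = createServerFn({ method: 'GET' })
//     .inputValidator(validateShaderName)
//     .handler(async ({ data }) => loadShaderDetail(data.name));
```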
Why this matters
ShaderBase was built as a git-first, agent-first shader registry — and it nails the data layer. The schema is rigorous, provenance is enforced, recipes are copy-paste ready. But the web frontend has been holding it back in three fundamental ways:
1. The submission flow is hostile to contributors
The current `/submit` page is a 30+ field manual form. A contributor needs to understand the full ShaderBase manifest schema — pipeline vs. stage, capability requirements, provenance source kinds, recipe targets — before they can add a single shader. This is backwards: the data should serve the contributor, not the other way around.

Most shader authors already have their code. They have a Shadertoy link, a GitHub gist, or raw GLSL sitting in a file. They shouldn't need to reverse-engineer our schema to share it.
The fix: an AI-first submission wizard. Paste code or a URL. The AI analyzes the GLSL, extracts uniforms, infers the pipeline/stage, suggests tags and a category, adapts Shadertoy conventions (`iTime` to `uTime`), and populates every field. The contributor reviews, tweaks, and submits. The old 545-line manual form is gone entirely.
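The core of the Shadertoy adaptation could look like the sketch below. Only the `iTime` to `uTime` rename is stated in this PR; the other entries are illustrative guesses at similar conventions, and the real wizard performs this via the AI parse step rather than plain string rewriting:

```typescript
// Hypothetical rename table; only iTime -> uTime is confirmed by the PR.
const SHADERTOY_RENAMES: Record<string, string> = {
  iTime: 'uTime',
  iResolution: 'uResolution', // assumption
  iMouse: 'uMouse',           // assumption
};

// Replace whole-word occurrences of each Shadertoy uniform name.
function adaptShadertoyUniforms(glsl: string): string {
  return Object.entries(SHADERTOY_RENAMES).reduce(
    (src, [from, to]) => src.replace(new RegExp(`\\b${from}\\b`, 'g'), to),
    glsl,
  );
}
```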
2. Search results are a dead end
The `/shaders` browse page shows cards with metadata — but they're divs, not links. You can't click through to see what a shader actually looks like. There's no detail page. No rendering. No way to inspect uniforms, read the GLSL source, or copy a recipe. You search, you see a card, and... that's it.

For a shader registry, not being able to see the shaders is a fundamental gap.
The fix: a full detail page at `/shaders/$name` with a live Three.js preview, interactive uniform controls, recipe code blocks, the GLSL source, and provenance.

3. No feedback loop from usage
When an AI agent uses a shader during vibecoding, there's no way to capture whether it worked well. Did the user like the result? Was the recipe easy to integrate? Did the shader compile correctly in their setup? This signal is invaluable for curation — knowing which shaders are battle-tested vs. which need improvement.
The fix: a review system with a SQLite-backed store, a POST API for SDK/MCP/agent feedback, and reviews and ratings displayed on detail pages and shader cards.
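A sketch of what such a review record might look like; the field names are illustrative, not the actual API schema:

```typescript
// Hypothetical review record shape; field names are illustrative.
interface ShaderReview {
  shader: string;                 // shader name, e.g. 'aurora-waves'
  rating: 1 | 2 | 3 | 4 | 5;
  comment?: string;
  source: 'web' | 'sdk' | 'mcp';  // where the feedback came from
  agentContext?: { model: string; task: string }; // tracing for agent reviews
}

// Aggregate shown on cards and detail pages.
function averageRating(reviews: ShaderReview[]): number | null {
  if (reviews.length === 0) return null;
  return reviews.reduce((sum, r) => sum + r.rating, 0) / reviews.length;
}
```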
What changed
New files (10)

- components/ShaderPreviewCanvas.tsx
- components/UniformControls.tsx
- components/CodeBlock.tsx
- components/ReviewsSection.tsx
- components/AiSubmitWizard.tsx
- lib/server/shader-detail.ts
- lib/server/ai-parse.ts
- lib/server/reviews-db.ts
- routes/shaders.$name.tsx
- routes/api/-reviews.ts

Modified files (7)

- components/ShaderCard.tsx
- lib/server/shaders.ts
- routes/submit.tsx
- package.json
- .env.example
- bun.lock
- routeTree.gen.ts

Dependencies added
How it works
Detail page flow
1. User clicks a shader card on `/shaders`
2. The `getShaderDetail` server function reads the full manifest, GLSL files, recipe source code, and preview SVG from the filesystem
3. `getReviews` fetches review data from SQLite
4. `ShaderPreviewCanvas` creates a Three.js scene with the appropriate geometry (plane for surface/postprocessing, sphere for geometry shaders) and applies a ShaderMaterial with the actual vertex/fragment code
5. `UniformControls` renders interactive controls per uniform type, feeding overrides back into the material
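The geometry selection in the detail page flow can be sketched as follows; the stage names are taken from the description above, and the Three.js wiring is shown in comments since it needs the dynamically imported module:

```typescript
// Plane for surface/postprocessing shaders, sphere for geometry shaders.
type ShaderStage = 'surface' | 'postprocessing' | 'geometry';

function geometryFor(stage: ShaderStage): 'plane' | 'sphere' {
  return stage === 'geometry' ? 'sphere' : 'plane';
}

// With the dynamically imported module (sketch):
//
//   const THREE = await import('three');
//   const geometry = geometryFor(stage) === 'sphere'
//     ? new THREE.SphereGeometry(1, 32, 32) // reduced from 64x64 segments
//     : new THREE.PlaneGeometry(2, 2);
//   const material = new THREE.ShaderMaterial({
//     vertexShader,
//     fragmentShader,
//     uniforms: { uTime: { value: 0 } },
//   });
//   scene.add(new THREE.Mesh(geometry, material));
```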
AI submission flow

1. `resolveShaderSource` detects the input type and fetches the source code (Shadertoy API, GitHub API, or direct fetch)
2. `aiParseShader` sends the code to Claude via generateObject() with a Zod schema matching the form data type, plus a system prompt that explains all valid enum values and ShaderBase conventions
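The input-type detection in step 1 might look like the sketch below (a guess at the heuristics, not the actual implementation), with the `generateObject()` call from step 2 shown in comments since it needs the AI SDK at runtime:

```typescript
// Hypothetical version of resolveShaderSource's input-type detection;
// the real function also fetches the code from each source.
type SourceKind = 'shadertoy' | 'github' | 'url' | 'glsl';

function detectSourceKind(input: string): SourceKind {
  const s = input.trim();
  if (/^https?:\/\/(www\.)?shadertoy\.com\//.test(s)) return 'shadertoy';
  if (/^https?:\/\/(gist\.github\.com|github\.com|raw\.githubusercontent\.com)\//.test(s)) {
    return 'github';
  }
  if (/^https?:\/\//.test(s)) return 'url';
  return 'glsl'; // raw pasted code
}

// aiParseShader then hands the fetched code to the model (sketch, using
// the Vercel AI SDK; names other than generateObject are illustrative):
//
//   import { generateObject } from 'ai';
//
//   const { object } = await generateObject({
//     model,                    // e.g. an Anthropic model instance
//     schema: shaderFormSchema, // Zod schema matching the form data type
//     system: SYSTEM_PROMPT,    // valid enum values + ShaderBase conventions
//     prompt: glslSource,
//   });
```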
Review API contract

SDK/MCP tools call the server functions with `source: 'mcp'` or `source: 'sdk'` and an optional `agentContext` for tracing which model and task triggered the review.
Test plan

- `/shaders`, verify cards are clickable links
- `/shaders` -> star rating shows on the card
- `/submit` -> paste raw GLSL -> "Parse with AI" -> AI populates all fields
- `submissions/` directory
- `bun run check` — tests pass, types pass, validation passes, build succeeds

Closes #14