# llm-inference-js

## Prerequisites

- A browser with WebGPU support

## Requirements

- Download the Gemma 2 2B model from here

## Getting Started

Run the development server:

```
python -m http.server 8000
```

Open http://localhost:8000 in your browser to see the result.
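Since WebGPU is a hard prerequisite, the page can check for it up front and show a clear message instead of failing silently. A minimal sketch (the function name `hasWebGPU` is illustrative, not part of this project): WebGPU-capable browsers expose a `gpu` property on `navigator`.

```javascript
// Minimal sketch: detect WebGPU support before loading the model.
// Browsers with WebGPU enabled expose `navigator.gpu`.
function hasWebGPU(nav) {
  return typeof nav !== "undefined" && nav !== null && "gpu" in nav;
}

// In the page, call it with the global `navigator`, e.g.:
//   if (!hasWebGPU(navigator)) {
//     document.body.textContent = "This demo requires a browser with WebGPU support.";
//   }
```

Note that `"gpu" in navigator` only confirms the API exists; actually requesting an adapter (`navigator.gpu.requestAdapter()`) can still return `null` on unsupported hardware, so a production check would await that as well.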