Demo: https://hub-chat.nuxt.dev
This project is a chat interface for interacting with the text generation models supported by Cloudflare Workers AI. Users can adjust LLM parameters, toggle response streaming, handle streaming and non-streaming responses, view markdown rendered in responses, and switch to dark mode.
Read the blog post on how I created this LLM playground.
- Select the text generation model to interact with
- Set different LLM parameters (temperature, max tokens, system prompt, top_p, top_k, etc.)
- Toggle LLM response streaming on/off
- Handle streaming and non-streaming LLM responses on both server and client sides
- Parse and display markdown in LLM responses
- Auto-scroll chat container as responses are streamed
- Dark mode support
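The streaming behavior above can be sketched on the client side. Below is a minimal, hypothetical reader that accumulates text chunks from a streamed response body; the actual app's chunk format, message handling, and auto-scroll logic may differ.

```typescript
// Hypothetical sketch: accumulate text chunks from a streaming LLM response.
// Assumes the server streams plain UTF-8 text; the real payload format may differ.
async function readStream(stream: ReadableStream<Uint8Array>): Promise<string> {
  const decoder = new TextDecoder()
  const reader = stream.getReader()
  let text = ''
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    // In the app, each decoded chunk would also be appended to the chat UI here,
    // triggering the auto-scroll of the chat container.
    text += decoder.decode(value, { stream: true })
  }
  text += decoder.decode() // flush any buffered bytes
  return text
}
```

With streaming toggled off, the client would instead await a single JSON response rather than iterating over chunks.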
- Nuxt: Vue.js framework for the application foundation
- Nuxt UI: Module for creating a sleek and responsive interface
- Nuxt MDC: For parsing and displaying chat messages
- NuxtHub: Deployment and administration platform for Nuxt, powered by Cloudflare
- Cloudflare Account: Required for using Workers AI models and deploying the project on Cloudflare Pages
- NuxtHub Account: For managing NuxtHub apps and using AI in development
You can deploy and manage this application with free Cloudflare and NuxtHub accounts.
- Clone the repository and install the dependencies with pnpm:
pnpm i
- Link your NuxtHub project to use AI models in development (you'll be prompted to create a project if you don't have one)
npx nuxthub link
- Start the application in development mode
pnpm dev
Open http://localhost:3000 in your browser.
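Once the app is running, the client sends each chat turn, along with the selected model and parameters, to a server route. The sketch below shows what such a request payload might look like; the field names, defaults, and model identifier are illustrative assumptions, not taken from the actual codebase.

```typescript
// Hypothetical shape of the chat request the UI could send to the server route.
// All names and defaults here are illustrative, not the app's actual contract.
interface ChatRequest {
  model: string
  messages: { role: 'user' | 'assistant' | 'system'; content: string }[]
  params: {
    temperature: number
    maxTokens: number
    topP: number
    topK: number
    stream: boolean
  }
}

function buildRequest(prompt: string, stream = true): ChatRequest {
  return {
    // Example Workers AI model identifier; the app lets users pick from a list.
    model: '@cf/meta/llama-3.1-8b-instruct',
    messages: [{ role: 'user', content: prompt }],
    params: { temperature: 0.7, maxTokens: 256, topP: 0.9, topK: 40, stream }
  }
}
```

The `stream` flag is what drives the streaming toggle: the server route would branch on it and either stream chunks back or return one complete response.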
- Push your code to a GitHub repository.
- Link the repository with NuxtHub.
- Deploy from the Admin console.
Learn more about Git integration
npx nuxthub deploy
Learn more about CLI deployment
This project is licensed under the MIT License. See the LICENSE file for details.