VistAI is a platform that aggregates responses from various language models via OpenRouter, providing a unified and intuitive search experience.
Run the Cloudflare worker locally with:

```sh
npx wrangler dev
```

Build the static frontend with:

```sh
npm run build
```

If `API_BASE_URL` is set when running the build command, a post-build script will append

```html
<script>window.API_BASE_URL = "<%= process.env.API_BASE_URL %>";</script>
```

to `dist/public/index.html`. Open `dist/public/index.html` in your browser to test locally.
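On the frontend, the injected value can be picked up when building request URLs. A minimal sketch, assuming the post-build script above has run (`apiUrl` is an illustrative helper, not an actual function in this repo):

```typescript
// Resolve an API path against the optional base URL injected at build time.
// When window.API_BASE_URL is absent, requests go to the page's own origin.
function apiUrl(path: string): string {
  const base = ((globalThis as any).API_BASE_URL as string | undefined) ?? "";
  // Strip a trailing slash so joining with "/api/..." doesn't double up.
  return base.replace(/\/$/, "") + path;
}
```

Without `API_BASE_URL` the helper returns the path unchanged, so the same bundle works when the worker serves the frontend directly.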
- Multi-Model Search: Query multiple AI models simultaneously and compare their responses
- Real-time Analytics: Track which models users prefer through click metrics
- Responsive UI: a Material dark theme with high-contrast colors, built with Material Web components, that works on all devices
- Performance Tracking: See which models respond fastest and are chosen most often
- User Accounts: Register and log in to track personal model preferences
- Voice Search: Click the microphone to dictate your question
Set `OPENROUTER_API_KEY` as a secret when deploying the Cloudflare worker.
The worker uses this key server-side only; it is never exposed to clients.
Use `ACCESS_CONTROL_ALLOW_ORIGIN` to control which origins may access the Worker
APIs. By default no cross-origin requests are allowed. Provide `*` for
development or a comma-separated list of allowed origins.
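The behavior described above could be sketched as follows. This is an illustrative helper, not the worker's actual code; it assumes the variable holds either `*` or a comma-separated origin list:

```typescript
// Build CORS response headers from the ACCESS_CONTROL_ALLOW_ORIGIN setting.
// An unset value means no cross-origin access (the documented default).
function corsHeaders(
  allowList: string | undefined,
  requestOrigin: string,
): Record<string, string> {
  if (!allowList) return {}; // default: deny all cross-origin requests
  if (allowList === "*") return { "Access-Control-Allow-Origin": "*" };
  const allowed = allowList.split(",").map((o) => o.trim());
  return allowed.includes(requestOrigin)
    ? // Echo the matched origin and vary on it so caches stay correct.
      { "Access-Control-Allow-Origin": requestOrigin, Vary: "Origin" }
    : {};
}
```

Echoing the specific matched origin (rather than the whole list) is what browsers require when more than one origin is allowed.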
Search analytics are persisted in a Cloudflare D1 database. The worker expects a binding named `DB`, which is configured in `wrangler.toml`. Replace the sample `database_id` with the ID of your database from the Cloudflare dashboard and apply the migrations:
```sh
# create the database if you haven't already
wrangler d1 create vistai

# apply the schema migrations
wrangler d1 migrations apply vistai
```

The schema for the initial migration lives in `worker/migrations/0001_init.sql`.
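For reference, the `DB` binding in `wrangler.toml` takes roughly this shape (the `database_id` below is a placeholder; substitute the ID shown in your Cloudflare dashboard):

```toml
[[d1_databases]]
binding = "DB"                         # exposed to the worker as env.DB
database_name = "vistai"
database_id = "<your-database-id>"     # placeholder — replace with your own
```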
- `client/`: React frontend components and pages
- `shared/`: Shared types and schemas
- `worker/`: Cloudflare Worker implementation
Install dependencies including development packages before running the test suite and TypeScript compilation:
```sh
npm ci
npm test
npm run check
```

See `docs/best-practices.md` for code style guidelines and commit etiquette.
Interactive API docs are available when the worker is running. Visit
`/docs` in your browser to view a Swagger UI powered by the OpenAPI
specification exposed at `/api/openapi.yaml`.
This application integrates with OpenRouter to query various AI models including:
- OpenAI GPT-4
- Anthropic Claude 2
- Meta Llama 2
- Mistral AI
- A search query is sent to multiple AI models via OpenRouter
- Responses are collected and displayed side-by-side for comparison
- User clicks on responses are tracked to build a performance profile of each model
- Analytics show which models users prefer for different types of queries
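The fan-out and collection steps above can be sketched as follows. This is a simplified illustration, not the worker's actual implementation; `queryModels` and `ask` are hypothetical names, and the `ask` callback stands in for a request to OpenRouter:

```typescript
interface ModelAnswer {
  model: string;
  text: string;
  latencyMs: number; // measured from fan-out start in this sketch
}

// Query every model in parallel and keep whichever ones succeed.
// Promise.allSettled ensures one slow or failing model never blocks the rest.
async function queryModels(
  models: string[],
  ask: (model: string) => Promise<string>,
): Promise<ModelAnswer[]> {
  const started = Date.now();
  const results = await Promise.allSettled(models.map((m) => ask(m)));
  return results.flatMap((r, i) =>
    r.status === "fulfilled"
      ? [{ model: models[i], text: r.value, latencyMs: Date.now() - started }]
      : [], // failed models are simply omitted from the comparison
  );
}
```

The successful answers can then be rendered side-by-side, and each click recorded against the model that produced the chosen response.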
The initial implementation using Vite and WebSockets faced compatibility issues in the Replit environment. We created a simplified standalone version that works reliably without external dependencies.
Future enhancements could include:
- User accounts and personalized model rankings
- Adding more AI models to the comparison
- Implementing revenue sharing based on user preferences
For detailed deployment instructions, see DEPLOYMENT.md.
- Database: `wrangler d1 create vistai`, then update `database_id` in `wrangler.toml`
- Secrets: `wrangler secret put OPENROUTER_API_KEY` and `wrangler secret put JWT_SECRET`
- Worker: `wrangler deploy --env production`
- Frontend: deploy to Cloudflare Pages with the `API_BASE_URL` environment variable
The application supports environment-specific configurations for development and production deployments.
This project is licensed under the MIT License.