A web application for querying multiple Large Language Model (LLM) instances with a single prompt.
To run it locally with the local models:
- Clone the repo
- Install the dependencies:
  npm install
  npm i ollama
- Open a terminal and run the following command:
  node server.mjs
- Open localhost:3000 in your browser
- Query the local models using the query input box.
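The fan-out behind the steps above can be sketched as follows. This is a minimal sketch, not the repo's actual server.mjs: it assumes Ollama is running on its default port (11434) and uses example model names; the function names (`buildRequest`, `queryModel`, `queryAll`) are illustrative.

```javascript
// Sketch: send one prompt to several local Ollama models in parallel.
// Assumes Ollama's REST endpoint at its default address; model names are examples.

const OLLAMA_URL = "http://localhost:11434/api/generate";

// Build the JSON body for Ollama's /api/generate endpoint (pure helper).
function buildRequest(model, prompt) {
  return { model, prompt, stream: false };
}

// Send one prompt to one model and return its response text.
async function queryModel(model, prompt) {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildRequest(model, prompt)),
  });
  const data = await res.json();
  return { model, answer: data.response };
}

// Fan the same prompt out to every model at once.
async function queryAll(models, prompt) {
  return Promise.all(models.map((m) => queryModel(m, prompt)));
}

// Example (requires Ollama running locally with these models pulled):
// const results = await queryAll(["llama3", "mistral"], "What is 2 + 2?");
```

Running the queries with `Promise.all` means the slowest model, not the sum of all models, determines the total wait.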
- Add timers per model
- Add more models and a way to select them
- Add more query options / pre-defined queries
- Add an option to export results to a file
- Add support for llama_index
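The "timers per model" item above could be approached by wrapping each query in a generic timing helper. A minimal sketch, assuming a `timed` wrapper (a hypothetical name, not part of the repo) around any async query function:

```javascript
// Sketch: measure how long each model takes to answer.
// timed() wraps any async function and reports elapsed milliseconds.

async function timed(label, fn) {
  const start = performance.now();   // built-in high-resolution clock in Node
  const result = await fn();
  const ms = performance.now() - start;
  return { label, ms, result };
}

// Example usage with a stand-in query function:
// const { label, ms, result } =
//   await timed("llama3", () => queryModel("llama3", "hi"));
```

Because the wrapper only needs a function that returns a promise, it works unchanged for any model backend added later.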
Licensed under the MIT license.
Feel free to open an issue or contact: https://x.com/greenido