LLM API Aggregator is a tool that allows you to store your chats from various LLMs in one place. Its frontend is built using Vue.js, Pinia, VeeValidate, VueUse, and @vueuse/motion, with components delivered through shadcn-vue for a smooth user experience.
The application's backend lives in a separate repository, available at this link. Make sure to set it up first; otherwise the application will not work properly.
llm-api-aggregator-preview.mp4
- Node.js v20 or higher (only required if you are not using Docker).
- Docker (optional, but recommended if you want to speed up the installation process).
Before installing the application, make sure that a .env file is created. The .env file should look like this:
VITE_BACKEND_URL=BACKEND_API_URL
NOTE:
- Remember to adjust VITE_BACKEND_URL according to where the backend API is running. If it is running locally, your URL should look like this: VITE_BACKEND_URL=http://localhost:8000/api.
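As a quick sketch, the file can be created from the shell; the URL below is just the local example from the note above, so adjust it to match your backend:

```shell
# Create the .env file with the backend URL (adjust the value as needed).
printf 'VITE_BACKEND_URL=http://localhost:8000/api\n' > .env

# Verify the contents.
cat .env
```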
git clone https://github.com/marchewadm/llm-api-aggregator-frontend.git
cd llm-api-aggregator-frontend
- Build Docker Image
docker compose build
- Run The Container
docker compose up -d
- Stop The Container
docker compose down
- Install All Dependencies
npm install
- Usage
Run one of the commands below to start using your app:
- Development mode:
npm run dev
- Production build:
npm run build
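Inside the app, Vite exposes the variable as `import.meta.env.VITE_BACKEND_URL` (only variables prefixed with `VITE_` reach client code). A minimal sketch of how request URLs might be composed from it; the `apiUrl` helper and the `/chats` path are hypothetical illustrations, not part of this repository:

```javascript
// Hypothetical helper: joins the configured base URL with an endpoint path,
// normalizing slashes so a trailing "/" on the base and a leading "/" on the
// path do not double up. In the real app the base value would come from
// import.meta.env.VITE_BACKEND_URL.
function apiUrl(base, path) {
  return base.replace(/\/+$/, "") + "/" + path.replace(/^\/+/, "");
}

console.log(apiUrl("http://localhost:8000/api/", "/chats"));
// → http://localhost:8000/api/chats
```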
This project is licensed under the MIT License.