# LLM API Aggregator (Frontend)

LLM API Aggregator is a tool that allows you to store your chats from various LLMs in one place. Its frontend is built using Vue.js, Pinia, VeeValidate, VueUse, and @vueuse/motion, with components delivered through shadcn-vue for a smooth user experience.

The application's backend lives in a separate repository. Make sure to set it up as well; otherwise the application will not work properly.

## Table Of Contents

- [Demo](#demo)
- [Prerequisites](#prerequisites)
- [Before Usage](#before-usage)
- [Installation](#installation)
- [License](#license)

## Demo

*(See `llm-api-aggregator-preview.mp4` in the repository for a short preview.)*

## Prerequisites

- Node.js v20 or higher (only if you are not using Docker).
- Docker (optional, but recommended if you want to speed up the installation process).

## Before Usage

Before installing the application, make sure that a `.env` file is created in the project root.

The `.env` file should look like this:

```dotenv
VITE_BACKEND_URL=BACKEND_API_URL
```

> **NOTE:** Remember to adjust `VITE_BACKEND_URL` according to where the backend API is running. If it is running locally, the URL should look like this: `VITE_BACKEND_URL=http://localhost:8000/api`.
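For a quick local setup, the file can be created in one step. The URL below assumes the backend is running locally on port 8000; replace it with your own backend address if it differs.

```shell
# Create the .env file pointing the frontend at a locally running backend.
# http://localhost:8000/api is an example value, not a requirement.
echo "VITE_BACKEND_URL=http://localhost:8000/api" > .env
```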

## Installation

### Cloning The Repository

```shell
git clone https://github.com/marchewadm/llm-api-aggregator-frontend.git
cd llm-api-aggregator-frontend
```

### Running The Project Via Docker Compose (Recommended)

1. Build the Docker image:

   ```shell
   docker compose build
   ```

2. Run the container:

   ```shell
   docker compose up -d
   ```

3. Stop the container when you are done:

   ```shell
   docker compose down
   ```
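For orientation, a minimal `docker-compose.yml` for a Vite-based frontend might look like the sketch below. This is a hypothetical illustration only; the repository ships its own compose file, which is authoritative, and the port mapping shown is an assumption.

```
# Hypothetical sketch of a compose file for a Vite frontend.
# The repository's own docker-compose.yml takes precedence.
services:
  frontend:
    build: .
    ports:
      - "5173:5173"   # 5173 is Vite's default dev-server port; adjust as needed
    env_file:
      - .env          # supplies VITE_BACKEND_URL to the build
```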

### Running The Project Without Docker

1. Install all dependencies:

   ```shell
   npm install
   ```

2. Run one of the commands below to start using the app:

   - Development mode:

     ```shell
     npm run dev
     ```

   - Production build:

     ```shell
     npm run build
     ```

## License

This project is licensed under the MIT License.
