
profemzy/gisty


Gisty - Chat Assistant with WebAssembly

An AI-powered chat application with a Rust (Actix-web) backend and a WebAssembly frontend.

Project Structure

gisty/
├── src/
│   └── main.rs          # Rust backend server (Actix-web)
├── frontend/
│   ├── src/
│   │   └── lib.rs       # WASM chat interface logic
│   ├── pkg/             # Compiled WASM output
│   ├── index.html       # Frontend UI
│   └── Cargo.toml       # WASM dependencies
├── Cargo.toml           # Backend dependencies
└── .env                 # API configuration

Quick Start

Option 1: Using Docker (Recommended)

# Build and run with Docker Compose
docker compose up --build

# Or run in background
docker compose up -d --build

# Open your browser to http://localhost:8080

Option 2: Local Development

# 1. Build the WASM frontend
./build.sh

# 2. Run the server
cargo run

# 3. Open your browser to http://127.0.0.1:8080

Setup

Prerequisites

  1. Install Rust:

    curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
  2. Install wasm-pack:

    cargo install wasm-pack

Configuration

  1. Copy the environment template:

    cp .env.example .env
  2. Edit .env with your actual API key:

    API_KEY=your_actual_api_key_here
    BASE_URL=https://portal.infotitans.com/v1/chat/completions
    MODEL=azure/gpt-5

⚠️ Important: Never commit the .env file to version control. It's already in .gitignore.
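
The .env file is a set of plain KEY=VALUE lines like those above. As a rough illustration of how such a file is interpreted (the application itself most likely loads it with a dotenv-style crate; parse_env below is a hypothetical, std-only sketch):

```rust
use std::collections::HashMap;

/// Parse .env-style contents: one KEY=VALUE per line, `#` starts a comment.
/// Illustrative only -- a real app would typically use a dotenv-style crate.
fn parse_env(contents: &str) -> HashMap<String, String> {
    contents
        .lines()
        .map(str::trim)
        .filter(|l| !l.is_empty() && !l.starts_with('#'))
        .filter_map(|l| l.split_once('=')) // split at the first '=' only
        .map(|(k, v)| (k.trim().to_string(), v.trim().to_string()))
        .collect()
}

fn main() {
    let sample = "# API configuration\n\
                  API_KEY=your_actual_api_key_here\n\
                  BASE_URL=https://portal.infotitans.com/v1/chat/completions\n\
                  MODEL=azure/gpt-5\n";
    let cfg = parse_env(sample);
    println!("{}", cfg["MODEL"]); // prints "azure/gpt-5"
}
```

Splitting at the first '=' matters: values such as URLs may themselves contain '=' and must be kept whole.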

API Compatibility

This application is designed to work with chat-completion APIs. The default configuration points to the Infotitans API with GPT-5, but it can be configured to work with any compatible API:

  • Infotitans API: Use the defaults above
  • OpenAI-compatible APIs: Set BASE_URL=https://api.openai.com/v1/chat/completions and an appropriate model
  • Anthropic Claude: Set BASE_URL=https://api.anthropic.com/v1/messages and a Claude model
  • Custom APIs: Set BASE_URL and MODEL according to your API provider

Note: Different APIs may have different response formats. You may need to modify the JSON parsing logic in src/main.rs to match your API's response structure.
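
To make that note concrete, here are two illustrative response shapes and a deliberately naive, std-only extractor. The sample JSON bodies and the extract_after_key helper are assumptions for demonstration only; the real src/main.rs would use proper JSON deserialization (e.g. serde):

```rust
// Two chat APIs, two response shapes (simplified samples, not real responses):
//   OpenAI-style:    {"choices":[{"message":{"content":"Hi"}}]}
//   Anthropic-style: {"content":[{"type":"text","text":"Hi"}]}
// The assistant's reply lives under a different key in each, which is why
// switching BASE_URL can require changing the parsing logic.

/// Naive scan for `"key":"..."` and return the string value that follows.
/// Fine for a demo; not a real JSON parser.
fn extract_after_key(json: &str, key: &str) -> Option<String> {
    let needle = format!("\"{}\":\"", key);
    let start = json.find(&needle)? + needle.len();
    let rest = &json[start..];
    let end = rest.find('"')?;
    Some(rest[..end].to_string())
}

fn main() {
    let openai = r#"{"choices":[{"message":{"content":"Hi"}}]}"#;
    let anthropic = r#"{"content":[{"type":"text","text":"Hi"}]}"#;
    // OpenAI-compatible APIs nest the reply under "content"...
    assert_eq!(extract_after_key(openai, "content").as_deref(), Some("Hi"));
    // ...while Anthropic's Messages API nests it under "text".
    assert_eq!(extract_after_key(anthropic, "text").as_deref(), Some("Hi"));
    println!("ok");
}
```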

Building and Running

Option 1: Using the build script (recommended)

./build.sh    # Builds WASM frontend
cargo run     # Starts the server

Option 2: Manual build

cd frontend
wasm-pack build --target web
cd ..
cargo run

Then open http://127.0.0.1:8080 in your browser.

Development

  • Backend code: src/main.rs
  • Frontend WASM: frontend/src/lib.rs
  • Frontend UI: frontend/index.html

Rebuilding WASM

After making changes to frontend/src/lib.rs:

cd frontend && wasm-pack build --target web && cd ..

Then restart the server to see changes.

Docker Deployment

Quick Start with Docker

# Using Docker Compose
docker compose up -d

# Using Docker directly
docker build -t gisty .
docker run -p 8080:8080 -e API_KEY=your_key gisty

See docs/docker-guide.md for detailed Docker documentation.
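
Whether the variables come from .env locally or from -e flags under Docker, the server sees them the same way at startup. A minimal std-only sketch of that lookup; the Config struct and the fallback defaults are assumptions (only the variable names API_KEY, BASE_URL, and MODEL come from this README):

```rust
use std::env;

#[derive(Debug)]
struct Config {
    api_key: Option<String>, // required in the real app; Option here for the demo
    base_url: String,
    model: String,
}

/// Read configuration from the process environment, falling back to the
/// README's defaults for BASE_URL and MODEL (illustrative defaults).
fn load_config() -> Config {
    Config {
        api_key: env::var("API_KEY").ok(),
        base_url: env::var("BASE_URL")
            .unwrap_or_else(|_| "https://portal.infotitans.com/v1/chat/completions".into()),
        model: env::var("MODEL").unwrap_or_else(|_| "azure/gpt-5".into()),
    }
}

fn main() {
    let cfg = load_config();
    if cfg.api_key.is_none() {
        eprintln!("warning: API_KEY is not set; upstream requests will fail");
    }
    println!("model = {}, endpoint = {}", cfg.model, cfg.base_url);
}
```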

Production

Local Production Build

For production deployment without Docker:

  1. Build WASM with optimizations: cd frontend && wasm-pack build --target web --release && cd ..
  2. Build backend with optimizations: cargo build --release
  3. Run: ./target/release/gisty

Docker Production Build

docker build -t gisty:latest .
docker run -d -p 8080:8080 \
  --name gisty \
  --restart always \
  -e API_KEY=your_api_key \
  gisty:latest


About

A simple app for learning Rust and WebAssembly, with a touch of AI (LLMs).
