AI Agentic SaaS Platform

A comprehensive AI-powered platform for email management, document summarization, and location finding.

Features

  • Email Management: Sort emails by priority and schedule events automatically
  • Document Summary: Upload PDF, DOC, DOCX, or TXT files and get AI-generated summaries
  • Location Finder: Find routes between your location and college destinations
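As a rough illustration of the first feature, priority-based email sorting can be sketched as a simple comparator. The field names here (`priority`, `receivedAt`) are illustrative assumptions, not the platform's actual schema:

```javascript
// Hypothetical sketch of priority-based email sorting.
// The fields `priority` and `receivedAt` are assumptions for illustration.
const PRIORITY_ORDER = { high: 0, medium: 1, low: 2 };

function sortEmailsByPriority(emails) {
  // Sort by priority first, then newest first within the same priority.
  return [...emails].sort((a, b) => {
    const byPriority = PRIORITY_ORDER[a.priority] - PRIORITY_ORDER[b.priority];
    if (byPriority !== 0) return byPriority;
    return new Date(b.receivedAt) - new Date(a.receivedAt);
  });
}
```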

Project Structure

.
├── Frontend/          # Next.js frontend application
│   ├── pages/        # Next.js pages (email, doc, map)
│   ├── components/   # React components (Navbar, NotificationDrawer, MapComponent)
│   ├── styles/       # Global styles and theme
│   └── pages/api/    # API proxy routes
│
└── backend/          # Express.js backend API
    ├── routes/       # API route definitions
    ├── controllers/  # Business logic controllers
    ├── agents/       # LLM agent (Ollama integration)
    └── utils/        # Utility functions (PDF parser, map utils)
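To illustrate the `pages/api/` proxy layer above, a route typically forwards browser requests to the Express backend so the client only talks to the Next.js origin. This is a sketch only: the file name (`summary.js`) and backend endpoint (`/api/summary`) are assumptions, not the repository's actual routes.

```javascript
// Hypothetical Frontend/pages/api/summary.js — a minimal proxy sketch.
// The backend endpoint path (/api/summary) is an assumption.
const API_URL = process.env.NEXT_PUBLIC_API_URL || 'http://localhost:5000';

async function handler(req, res) {
  // Forward the incoming request to the Express backend.
  const response = await fetch(`${API_URL}/api/summary`, {
    method: req.method,
    headers: { 'Content-Type': 'application/json' },
    body: req.method === 'POST' ? JSON.stringify(req.body) : undefined,
  });
  const data = await response.json();
  res.status(response.status).json(data);
}

module.exports = handler; // in Next.js source this would be `export default handler`
```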

Quick Start

Prerequisites

  • Node.js (v18 or higher)
  • npm or yarn
  • Ollama (for AI features) - Download from https://ollama.ai

Frontend Setup

  1. Navigate to the Frontend directory:

     cd Frontend

  2. Install dependencies:

     npm install

  3. Set up environment variables (.env.local):

     NEXT_PUBLIC_API_URL=http://localhost:5000

  4. Run the development server:

     npm run dev

Frontend will be available at http://localhost:3000

Backend Setup

  1. Navigate to the backend directory:

     cd backend

  2. Install dependencies:

     npm install

  3. Set up environment variables (.env):

     PORT=5000
     OLLAMA_API_URL=http://localhost:11434
     OLLAMA_MODEL=deepseek
     GMAIL_CLIENT_ID=your_gmail_client_id_here
     GMAIL_CLIENT_SECRET=your_gmail_client_secret_here
     CORS_ORIGIN=http://localhost:3000

  4. Install and pull the Ollama model:

     # Make sure Ollama is installed and running
     ollama pull deepseek

  5. Run the server:

     npm run dev

Backend will be available at http://localhost:5000
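The `agents/` layer calls Ollama's HTTP API (`POST /api/generate` is Ollama's standard non-chat completion endpoint). A minimal sketch of what such an integration might look like — the prompt wording and function names are illustrative assumptions, not the repository's actual code:

```javascript
// Hypothetical sketch of the backend's Ollama integration (backend/agents/).
const OLLAMA_API_URL = process.env.OLLAMA_API_URL || 'http://localhost:11434';
const OLLAMA_MODEL = process.env.OLLAMA_MODEL || 'deepseek';

// Build the JSON body for Ollama's /api/generate endpoint.
function buildSummaryRequest(text) {
  return {
    model: OLLAMA_MODEL,
    prompt: `Summarize the following document:\n\n${text}`,
    stream: false, // ask for one JSON response instead of a token stream
  };
}

async function summarize(text) {
  const res = await fetch(`${OLLAMA_API_URL}/api/generate`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(buildSummaryRequest(text)),
  });
  const data = await res.json();
  return data.response; // Ollama returns the completion in the `response` field
}
```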

Docker Setup (Alternative)

You can use Docker Compose to run both backend and Ollama:

cd backend
docker-compose up
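The repository's own docker-compose.yml is authoritative; as a rough illustration, a two-service setup of this kind typically looks something like the following (service names, ports, and volume names here are assumptions):

```yaml
# Illustrative sketch only — the repository's docker-compose.yml takes precedence.
services:
  backend:
    build: .
    ports:
      - "5000:5000"
    environment:
      - OLLAMA_API_URL=http://ollama:11434
    depends_on:
      - ollama
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama-data:/root/.ollama
volumes:
  ollama-data:
```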

Configuration

College Locations

Update the college locations in backend/utils/mapUtils.js with your actual coordinates:

const COLLEGE_LOCATIONS = [
  { name: 'Library', lat: 12.9716, lng: 77.5946 },
  // Add more locations...
];
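Route and nearest-location logic over coordinates like these usually relies on great-circle distance. A standard haversine helper, sketched here as one plausible shape for such a utility (not necessarily what backend/utils/mapUtils.js actually contains):

```javascript
// Haversine great-circle distance in kilometres — a common building block
// for map utilities. Illustrative sketch, not the repo's implementation.
const EARTH_RADIUS_KM = 6371;

function haversineKm(a, b) {
  const toRad = (deg) => (deg * Math.PI) / 180;
  const dLat = toRad(b.lat - a.lat);
  const dLng = toRad(b.lng - a.lng);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLng / 2) ** 2;
  return 2 * EARTH_RADIUS_KM * Math.asin(Math.sqrt(h));
}

function nearestLocation(origin, locations) {
  // Pick the location with the smallest great-circle distance from origin.
  return locations.reduce((best, loc) =>
    haversineKm(origin, loc) < haversineKm(origin, best) ? loc : best
  );
}
```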

Google APIs (Optional)

For email scheduling features, you'll need to:

  1. Create a project in Google Cloud Console
  2. Enable Gmail API and Calendar API
  3. Create OAuth 2.0 credentials
  4. Add credentials to .env file
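To sketch where steps 3–4 lead, the OAuth 2.0 credentials feed into a Google consent URL like the one built below. The redirect URI and scopes are assumptions, and in practice a library such as googleapis would generate this URL for you:

```javascript
// Builds a Google OAuth 2.0 consent URL from the .env credentials.
// Illustrative sketch: redirect URI and scopes are assumptions.
function buildGoogleAuthUrl(clientId, redirectUri) {
  const params = new URLSearchParams({
    client_id: clientId,
    redirect_uri: redirectUri,
    response_type: 'code',
    access_type: 'offline', // request a refresh token as well
    scope: [
      'https://www.googleapis.com/auth/gmail.readonly',
      'https://www.googleapis.com/auth/calendar.events',
    ].join(' '),
  });
  return `https://accounts.google.com/o/oauth2/v2/auth?${params}`;
}
```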

Technology Stack

  • Frontend: Next.js, React, Tailwind CSS, Leaflet
  • Backend: Express.js, Node.js
  • AI: Ollama (DeepSeek model)
  • File Processing: pdf-parse, mammoth

Theme

The platform uses a black/white/purple color scheme as specified in the design requirements.

Development

  • Frontend runs on port 3000
  • Backend runs on port 5000
  • Ollama runs on port 11434

Notes

  • No authentication or database is implemented (stateless API)
  • All data is processed in-memory
  • Update environment variables before running in production
