🐦 TweetGPT: AI-Powered Tweet Generator

TweetGPT is a modern, full-stack application that uses Groq or Ollama LLMs to generate, evaluate, and refine tweets automatically, with no manual intervention. It integrates with X (Twitter) OAuth for secure login and posting, features a sleek, mobile-friendly UI built with React + Vite, and deploys via Docker with Nginx for production-ready frontend hosting.

📦 GitHub Repo: https://github.com/ArchitJ6/TweetGPT


✨ Features

  • 🤖 AI-Powered Tweet Generation: Generates witty, snappy tweets using Groq or Ollama.
  • 🔄 Iterative Optimization: Auto-refines tweets until quality and engagement criteria are met.
  • 🔐 X OAuth Integration: Secure sign-in and direct posting to X.
  • 🎯 Automatic Evaluation: AI checks humor, originality, brevity, and style without manual review.
  • ⚡ Fast Model Switching: Toggle between Groq and Ollama seamlessly.
  • 🎨 Modern UI: Responsive, Twitter-inspired interface.
  • 📱 Mobile Friendly: Optimized for all devices.
  • 🐳 Dockerized Deployment: Backend, frontend, and Nginx all containerized.
  • 🌐 Nginx Hosting: Production-grade static file serving.
  • 📊 Autonomous AI Workflow: Fully automated generation-evaluation-refinement loop.

🛠 Tech Stack

| Layer | Technology |
| --- | --- |
| Frontend | React, Vite, JavaScript, CSS |
| Backend | Python, Flask, LangChain, LangGraph |
| AI Models | Groq LLM, Ollama LLM |
| Auth | X (Twitter) OAuth 2.0 |
| Deployment | Docker, Docker Compose, Nginx |
| Version Control | Git, GitHub |

⚙️ Prerequisites

  • 🐍 Python 3.12
  • 🐦 X Developer Account with API access
  • 🔑 Groq API Key
  • 💻 Ollama installed (if using Ollama mode)
  • 🐳 Docker & Docker Compose

🧠 AI Workflow

```mermaid
graph TD
    A[Start] --> B[Generate Tweet]
    B --> C[Evaluate Tweet]
    C -->|Approved| E[End]
    C -->|Needs Improvement| D[Refine Tweet]
    D --> C
```

Steps:

  1. Generate → AI crafts a humorous tweet based on the topic.
  2. Evaluate → AI reviews for creativity, humor, and shareability.
  3. Refine → AI improves the tweet if it doesn't pass.
  4. Loop until approved or the maximum number of iterations is reached.
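
In this repository the loop lives in the backend (see backend/llm_workflow.py). As a rough illustration of how such a graph can be wired with LangGraph, here is a minimal sketch; the state fields, node names, prompts, and iteration cap are illustrative assumptions rather than the project's exact code.

```python
# Minimal sketch of a generate -> evaluate -> refine loop with LangGraph.
# Node names, state fields, prompts, and the iteration cap are illustrative;
# backend/llm_workflow.py holds the project's actual implementation.
from typing import TypedDict

from langchain_groq import ChatGroq        # assumes langchain-groq is installed
from langgraph.graph import StateGraph, END

llm = ChatGroq(model="llama-3.1-8b-instant")  # hypothetical model name


class TweetState(TypedDict):
    topic: str
    tweet: str
    feedback: str
    iterations: int


MAX_ITERATIONS = 3


def generate(state: TweetState) -> dict:
    msg = llm.invoke(f"Write a witty, snappy tweet about: {state['topic']}")
    return {"tweet": msg.content}


def evaluate(state: TweetState) -> dict:
    msg = llm.invoke(
        "Review this tweet for humor, originality, brevity, and style. "
        f"Reply APPROVED if it passes, otherwise give one line of feedback.\n\n{state['tweet']}"
    )
    return {"feedback": msg.content, "iterations": state["iterations"] + 1}


def refine(state: TweetState) -> dict:
    msg = llm.invoke(
        f"Improve this tweet using the feedback.\nTweet: {state['tweet']}\nFeedback: {state['feedback']}"
    )
    return {"tweet": msg.content}


def route(state: TweetState) -> str:
    # End when the evaluator approves or the iteration budget is spent.
    if "APPROVED" in state["feedback"] or state["iterations"] >= MAX_ITERATIONS:
        return "done"
    return "refine"


graph = StateGraph(TweetState)
graph.add_node("generate", generate)
graph.add_node("evaluate", evaluate)
graph.add_node("refine", refine)
graph.set_entry_point("generate")
graph.add_edge("generate", "evaluate")
graph.add_edge("refine", "evaluate")
graph.add_conditional_edges("evaluate", route, {"done": END, "refine": "refine"})

workflow = graph.compile()
result = workflow.invoke({"topic": "Monday mornings", "tweet": "", "feedback": "", "iterations": 0})
print(result["tweet"])
```

The conditional edge is what makes the workflow autonomous: the evaluator's verdict, not a human, decides whether the graph ends or loops back through the refiner.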

📂 Project Structure

TweetGPT/
├── backend/
│   ├── .env.example
│   ├── app.py
│   ├── config.py
│   ├── Dockerfile
│   ├── llm_workflow.py
│   └── requirements.txt
├── frontend/
│   ├── .env.example
│   ├── Dockerfile
│   ├── eslint.config.js
│   ├── index.html
│   ├── package-lock.json
│   ├── package.json
│   ├── vite.config.js
│   ├── public/
│   │   └── bot.svg
│   └── src/
│       ├── App.jsx
│       ├── index.css
│       ├── main.jsx
│       ├── components/
│       │   ├── index.js
│       │   ├── NavigationBar.jsx
│       │   └── NotificationBar.jsx
│       ├── pages/
│       │   ├── GeneratorPage.jsx
│       │   ├── HomePage.jsx
│       │   └── index.js
│       └── services/
│           ├── api.js
│           └── index.js
├── .dockerignore
├── .gitignore
└── docker-compose.yml

🚀 Setup & Run

1️⃣ Clone Repository

git clone https://github.com/ArchitJ6/TweetGPT.git
cd TweetGPT

2️⃣ Configure Environment Variables

📝 Backend (backend/.env)

cd backend
cp .env.example .env

Then fill in backend/.env:

X_CLIENT_ID=""
X_CLIENT_SECRET=""
X_REDIRECT_URI="http://localhost:5000/callback"
X_AUTH_URL="https://twitter.com/i/oauth2/authorize"
X_TOKEN_URL="https://api.twitter.com/2/oauth2/token"
GROQ_API_KEY="your_groq_api_key"
USE_OLLAMA="true_or_false"
GROQ_LLM_MODEL_NAME="your_groq_llm_model_name"
OLLAMA_MODEL_NAME="your_ollama_model_name"
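
The backend reads these values at startup. Below is a minimal sketch of how the USE_OLLAMA flag can switch between the two providers, assuming python-dotenv together with the langchain-groq and langchain-ollama integrations; the function and defaults shown are illustrative rather than copied from config.py.

```python
# Illustrative sketch of env-driven model selection; the project's actual
# config.py / llm_workflow.py may organize this differently.
import os

from dotenv import load_dotenv            # assumes python-dotenv is available
from langchain_groq import ChatGroq       # assumes langchain-groq is available
from langchain_ollama import ChatOllama   # assumes langchain-ollama is available

load_dotenv()  # pulls backend/.env into the process environment


def build_llm():
    """Return an Ollama- or Groq-backed chat model based on USE_OLLAMA."""
    if os.getenv("USE_OLLAMA", "false").lower() == "true":
        return ChatOllama(model=os.environ["OLLAMA_MODEL_NAME"])
    return ChatGroq(
        model=os.environ["GROQ_LLM_MODEL_NAME"],
        api_key=os.environ["GROQ_API_KEY"],
    )


llm = build_llm()
```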

📝 Frontend (frontend/.env)

cd frontend
cp .env.example .env

Then fill in frontend/.env:

VITE_API_BASE_URL="http://localhost:5000"

3️⃣ Run Without Docker (Optional)

🔹 Backend

cd backend
pip install -r requirements.txt
python app.py

🔹 Frontend

cd frontend
npm install
npm run dev

4️⃣ Run With Docker (Recommended)

docker-compose up --build

The frontend will be served by 🌐 Nginx at http://localhost:5173, and the backend will run at http://localhost:5000.
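
Once a user has signed in, posting an approved tweet comes down to one call against the X API v2 tweets endpoint with the OAuth 2.0 user access token obtained through the callback flow. The snippet below is a minimal sketch using requests; the token variable name is hypothetical and the backend's actual posting code in app.py may differ.

```python
# Minimal sketch: post a tweet via the X API v2 with an OAuth 2.0 user token.
# X_ACCESS_TOKEN is a hypothetical variable name for the token obtained
# through the /callback flow; app.py may handle this differently.
import os

import requests

access_token = os.environ["X_ACCESS_TOKEN"]

response = requests.post(
    "https://api.twitter.com/2/tweets",
    headers={"Authorization": f"Bearer {access_token}"},
    json={"text": "Hello from TweetGPT!"},
    timeout=30,
)
response.raise_for_status()
print(response.json())  # contains the new tweet's id and text
```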


📸 Screenshots

Home | Tweet Generated | Tweet Optimization History


📜 License

MIT License: free to use and modify.


🀝 Contributing

PRs welcome! Please follow the existing code style.


Made with ❤️ by Archit Jain