Blog Writing Agent

An AI-powered blog generation agent that creates structured technical posts from a user-provided topic. The project uses a LangGraph workflow orchestrating planning, optional web research, and section-level writing via LLMs (OpenAI and HuggingFace are supported).

This repository includes:

  • A compiled graph workflow implemented in main.ipynb that orchestrates routing, planning, optional research, section generation, and final assembly.
  • A Streamlit frontend at frontend.py that provides a simple UI for entering a topic, running the workflow, and viewing/downloading results.

(workflow diagram)

Features

  • Generates multi-section, structured blog posts with section goals, bullets, and target word counts.
  • Supports three research modes: closed_book, hybrid, and open_book.
  • Uses Pydantic models for structured outputs (Plan, Task, EvidenceItem).
  • Exports final post as Markdown and (optionally) PDF.
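The structured outputs mentioned above can be sketched as Pydantic models. The field names below are illustrative assumptions, not the exact schema from main.ipynb:

```python
from typing import List
from pydantic import BaseModel


class Task(BaseModel):
    """One planned section of the post (field names are assumptions)."""
    title: str
    goal: str
    bullets: List[str]
    target_words: int


class EvidenceItem(BaseModel):
    """A research snippet gathered for the post (hypothetical shape)."""
    url: str
    snippet: str


class Plan(BaseModel):
    """Overall blog plan: a topic plus an ordered list of section tasks."""
    topic: str
    sections: List[Task]


plan = Plan(
    topic="Vector databases",
    sections=[
        Task(
            title="Introduction",
            goal="Motivate the topic",
            bullets=["what it is", "why it matters"],
            target_words=150,
        )
    ],
)
print(plan.sections[0].target_words)  # 150
```

Using `model_config`/validators on these classes is how LLM outputs can be forced into a predictable shape before the writing step consumes them.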

Quick Start

  1. Install Python 3.12+ and clone this repo.
  2. Create a virtual environment and activate it:
python -m venv .venv
# Windows
.venv\Scripts\Activate.ps1
# macOS / Linux
source .venv/bin/activate
  3. Install dependencies (choose one):
# If you have a requirements.txt
pip install -r requirements.txt

# Or use `uv` (lockfile present as `uv.lock`):
uv sync

# To add an optional package, e.g. reportlab for PDF export:
uv add reportlab
  4. Add credentials in a .env file at the project root (only the keys you need):
OPENAI_API_KEY=sk-...
HUGGINGFACEHUB_API_TOKEN=hf_...
TAVILY_API_KEY=...
  5. Run the Streamlit UI:
streamlit run frontend.py

Open the UI and enter a topic in the sidebar, then click Generate.
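The .env format above is simple enough to parse by hand if you prefer not to add a dotenv dependency. A minimal sketch (the project itself may rely on python-dotenv instead):

```python
import os


def load_env(path: str = ".env") -> None:
    """Read KEY=VALUE lines from a .env file into os.environ.

    Minimal sketch: skips blank lines and '#' comments, and does not
    handle quoting or multi-line values.
    """
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # setdefault so real environment variables win over file values
            os.environ.setdefault(key.strip(), value.strip())
```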

Usage (programmatic)

If you prefer to call the workflow directly from Python (the notebook compiles a workflow object), you can do so from a script or REPL after importing or executing the notebook code. The workflow expects a state dict like:

state = {
    "topic": "Your topic here",
    "mode": "",
    "needs_research": False,
    "queries": [],
    "evidence": [],
    "plan": None,
    "sections": [],
    "final": "",
}

# then call
result = workflow.invoke(state)
# result["final"] contains the generated markdown

See main.ipynb for the full implementation and helper functions like run(topic).
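The state contract above can be exercised without the notebook by stubbing the compiled graph. The stub below only mimics the final-assembly step and is an illustration, not the real workflow:

```python
class StubWorkflow:
    """Stand-in for the compiled LangGraph workflow (illustrative only)."""

    def invoke(self, state: dict) -> dict:
        # Pretend each section was already written, then assemble them
        # into the final markdown the way the real graph might.
        sections = state.get("sections") or ["## Placeholder\n\nNo sections yet."]
        out = dict(state)
        out["final"] = f"# {state['topic']}\n\n" + "\n\n".join(sections)
        return out


workflow = StubWorkflow()
result = workflow.invoke({"topic": "Demo topic", "sections": ["## Intro\n\nHello."]})
print(result["final"].splitlines()[0])  # "# Demo topic"
```

A stub like this is also handy for developing frontend.py without burning LLM API credits.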

Project Layout

  • main.ipynb — primary workflow notebook (defines nodes, compiles workflow).
  • frontend.py — Streamlit UI that loads and invokes the notebook workflow.
  • pyproject.toml — project metadata and dependencies.
  • State of Multimodal Large Language Models (LLMs) in 2026.md — an example generated post.

PDF Export / Styling

  • The Streamlit UI offers a PDF download; PDF generation uses reportlab. To enable it:
pip install reportlab
  • The generated PDF is a plain rendering of the markdown. For richer PDF styling (fonts, headings, code blocks), consider a markdown-to-PDF toolchain such as weasyprint or pandoc + wkhtmltopdf.
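Whichever renderer you use, the usual first step is splitting the markdown into typed blocks that can be mapped to PDF styles. A minimal sketch of that preprocessing (the actual frontend.py logic may differ):

```python
import re


def split_markdown_blocks(md: str) -> list:
    """Split markdown into (kind, text) blocks: 'h1'..'h6', or 'para'.

    Minimal sketch: ignores code fences, lists, and tables.
    """
    blocks = []
    for chunk in re.split(r"\n\s*\n", md.strip()):
        m = re.match(r"^(#{1,6})\s+(.*)", chunk)
        if m:
            blocks.append((f"h{len(m.group(1))}", m.group(2).strip()))
        else:
            blocks.append(("para", chunk.strip()))
    return blocks


print(split_markdown_blocks("# Title\n\nBody text."))
# [('h1', 'Title'), ('para', 'Body text.')]
```

Each (kind, text) pair can then be fed to a reportlab Paragraph with a matching style, or handed to another toolchain entirely.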

Tech Stack

  • Python 3.12+
  • LangChain + LangGraph for orchestration and LLM interactions
  • Pydantic for structured outputs
  • Streamlit for the frontend UI
  • Optional: HuggingFace / OpenAI model endpoints

Known Limitations

  • PDF rendering is basic; complex layouts (code blocks, tables) may not render perfectly.
  • Per-node live progress is not enabled by default; implementing a progress callback inside the notebook graph is recommended for detailed UI updates.
  • Workflow execution calls external LLM APIs; make sure API keys are configured and network access is available.
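One way to implement the progress callback mentioned above is to run nodes through a small wrapper that reports each step. A generic sketch, independent of LangGraph (the real graph has conditional routing and its own streaming API, so this is only the shape of the idea):

```python
from typing import Callable, Dict, List, Tuple


def run_with_progress(
    nodes: List[Tuple[str, Callable[[Dict], Dict]]],
    state: Dict,
    on_node: Callable[[str], None],
) -> Dict:
    """Run (name, fn) node pairs in order, reporting each name.

    Illustrative only: the compiled LangGraph workflow is not a fixed
    linear sequence like this.
    """
    for name, fn in nodes:
        on_node(name)          # e.g. update a Streamlit status widget
        state = fn(state)
    return state


seen = []
result = run_with_progress(
    [
        ("plan", lambda s: {**s, "plan": "ok"}),
        ("write", lambda s: {**s, "final": "done"}),
    ],
    {"topic": "demo"},
    seen.append,
)
print(seen)             # ['plan', 'write']
print(result["final"])  # 'done'
```

In the Streamlit UI, `on_node` could write to `st.status` or a progress bar so the user sees which stage is running.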

Support

For questions or issues, open an issue or contact the author (Abbas Ali). Include logs and a description of steps to reproduce.


Author: Abbas Ali
