ArgMap

open-ended argument mapping

License: MIT · Python 3.13+

Features

  • Open-ended types: LLM chooses appropriate node and edge types for each argument
  • Provenance tracking: Links claims back to exact positions in source text
  • Interactive visualization: Force-directed graph with draggable nodes
  • Flexible API: Supports Gemini API key or Google Cloud Project authentication

Quick Start

Backend

# Install dependencies
pip install -r requirements.txt

# Set your API key
export GEMINI_API_KEY="your-key"
# OR
export GOOGLE_CLOUD_PROJECT="your-project"
# OR leave both unset and supply an API key from the frontend

# Run the server
python server.py

Server runs at http://localhost:8000

Frontend

cd frontend
npm install
VITE_API_URL=http://localhost:8000 npm run dev

Frontend runs at http://localhost:5173

Usage

  1. Enter philosophical or argumentative text
  2. (Optional) Enter your Gemini API key
  3. Click "Extract Argument Map"
  4. Explore the graph, summary, and JSON output
  5. Click on nodes/edges to see details

API

POST /api/extract

Extracts an argument map from the given text.

Request:

{
  "text": "Your philosophical text here...",
  "api_key": "optional-api-key",
  "temperature": 0.0,
  "model": "gemini-2.5-flash"
}

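A request can be issued from Python with only the standard library. A minimal client sketch — the URL and defaults mirror the request schema above; `build_payload` and `extract` are illustrative names, not part of the project:

```python
import json
from urllib import request

def build_payload(text, api_key=None, temperature=0.0, model="gemini-2.5-flash"):
    # Assemble the request body; api_key is omitted when not provided,
    # letting the server fall back to its environment variables.
    payload = {"text": text, "temperature": temperature, "model": model}
    if api_key:
        payload["api_key"] = api_key
    return payload

def extract(text, url="http://localhost:8000/api/extract", **kwargs):
    # POST the payload as JSON and decode the JSON response.
    body = json.dumps(build_payload(text, **kwargs)).encode("utf-8")
    req = request.Request(url, data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.load(resp)
```

Calling `extract("All men are mortal. Socrates is a man...")` against a running server should return the response shape documented below.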
Response:

{
  "success": true,
  "result": {
    "version": "1.0",
    "source_text": "...",
    "nodes": [
      {
        "id": "n1",
        "content": "the claim",
        "type": "premise",
        "rhetorical_force": "asserts",
        "span": {"start": 0, "end": 50}
      }
    ],
    "edges": [
      {
        "source": "n1",
        "target": "n2",
        "type": "supports",
        "explanation": "provides evidence for"
      }
    ],
    "summary": "Overview of the argument",
    "key_tensions": ["list of gaps or issues"]
  }
}

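As a sketch of consuming a response shaped like the sample above, the following resolves edge endpoints to node content and uses a node's `span` to recover its source passage (the helper names are ours, not part of the API):

```python
def readable_edges(result):
    # Map node ids to their content, then render each edge as
    # "source --type--> target" for a quick textual overview.
    nodes = {n["id"]: n["content"] for n in result["nodes"]}
    return [
        f'"{nodes.get(e["source"], e["source"])}" --{e["type"]}--> '
        f'"{nodes.get(e["target"], e["target"])}"'
        for e in result["edges"]
    ]

def node_passage(result, node_id):
    # Provenance tracking: slice source_text with the node's span
    # to recover the exact passage the claim was extracted from.
    node = next(n for n in result["nodes"] if n["id"] == node_id)
    return result["source_text"][node["span"]["start"]:node["span"]["end"]]
```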
Environment Variables

  • GEMINI_API_KEY: Gemini API key
  • GOOGLE_CLOUD_PROJECT: Google Cloud project (for Vertex AI)
  • GOOGLE_CLOUD_LOCATION: Cloud location (default: us-central1)
  • LLM_MODEL: Model to use (default: gemini-2.5-flash)
  • CACHE_LLM: Enable LLM response caching
  • LLM_CACHE_DIR: Cache directory (default: .cache/llm)
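A sketch of reading these settings with their documented defaults — the variable names and defaults come from the list above, but `load_settings` and the truthy-value convention for CACHE_LLM are assumptions, not the server's actual code:

```python
import os

def load_settings(env=None):
    # Collect the documented variables, applying the documented defaults.
    env = os.environ if env is None else env
    return {
        "api_key": env.get("GEMINI_API_KEY"),
        "project": env.get("GOOGLE_CLOUD_PROJECT"),
        "location": env.get("GOOGLE_CLOUD_LOCATION", "us-central1"),
        "model": env.get("LLM_MODEL", "gemini-2.5-flash"),
        # Assumed convention: any of "1"/"true"/"yes" enables caching.
        "cache_llm": env.get("CACHE_LLM", "").lower() in ("1", "true", "yes"),
        "cache_dir": env.get("LLM_CACHE_DIR", ".cache/llm"),
    }
```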

Project Structure

argmap/
├── argmap/
│   ├── llm.py           # LLM client with dual auth
│   ├── schema.py        # Pydantic models
│   ├── prompts.py       # Extraction prompts
│   └── extract.py       # Core extraction logic
├── frontend/
│   ├── src/
│   │   ├── App.tsx      # Main application
│   │   ├── types/       # TypeScript types
│   │   └── components/
│   │       ├── ArgumentGraph.tsx  # Graph visualization
│   │       └── NodeDetails.tsx    # Detail panel
├── server.py            # FastAPI server
└── requirements.txt

License

MIT
