Merge CSV logs into one timeline. Review. Export to PDF. Share with teammates.
Timeline refreshes every 15 seconds. Share the workspace link + password so others can add logs too.
- Python 3.10+
- PostgreSQL
```shell
cd DIR-TimelineBot
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```

(Windows: use `venv\Scripts\activate` instead of `source venv/bin/activate`.)
macOS (Homebrew):
```shell
brew install postgresql@16
brew services start postgresql@16
createdb timelinebot
```

Or use the setup script (macOS only):

```shell
chmod +x setup_postgres.sh
./setup_postgres.sh
```

Other systems: install PostgreSQL, start it, then run `createdb timelinebot` (or create a database named `timelinebot`).
```shell
cp .env.example .env
```

Edit `.env` and set `DATABASE_URL`. For local PostgreSQL:

```
DATABASE_URL=postgresql://localhost/timelinebot
```
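A minimal sketch of how a setting like this is typically read at startup (assuming a plain `os.environ` lookup; the project may load `.env` through a helper such as python-dotenv instead):

```python
import os

# Fall back to the local default shown above when DATABASE_URL is unset.
db_url = os.environ.get("DATABASE_URL", "postgresql://localhost/timelinebot")
```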
Then:

```shell
source venv/bin/activate
python run.py
```

To stop the app, press Ctrl+C in the terminal.
- Create a workspace – Pick a password, get a link
- Upload CSVs – Add log files (contributor name required)
- Add text notes – Manual entries with timestamps
- Review – Check/uncheck what goes in the report
- Export PDF – Queue a report, download when ready
| Feature | Description |
|---|---|
| Create workspace | Set a password, get a shareable link (e.g. http://localhost:5001/w/abc12345) |
| Join workspace | Enter workspace ID + password to collaborate |
| Workspace passwords | Required for create and join; share link + password with teammates |
| Reset workspace | Delete a single workspace and all its data (ID + password, 3 confirmations) |
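The example link above ends in an 8-character ID. A hypothetical sketch of generating such an ID with the standard `secrets` module — the app's actual ID scheme is not documented here:

```python
import secrets
import string

# Illustrative only: 8 lowercase alphanumeric characters, as in .../w/abc12345.
alphabet = string.ascii_lowercase + string.digits
workspace_id = "".join(secrets.choice(alphabet) for _ in range(8))
link = f"http://localhost:5001/w/{workspace_id}"
```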
| Feature | Description |
|---|---|
| Upload CSV logs | Add multiple CSVs; each gets a color and optional comment |
| Contributor name | Required when uploading (tracks who added each log) |
| Auto-detect delimiter | Comma, tab, semicolon, pipe |
| Auto-detect timestamp | Or specify the timestamp column manually |
| Background processing | Upload returns immediately; queue shows status |
| Duplicate detection | Identical rows (same full content) across any CSV are deduplicated |
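Delimiter auto-detection and full-row deduplication can be sketched with the standard library (`csv.Sniffer` plus a hash of the row's full content); this illustrates the technique, not the app's actual implementation:

```python
import csv
import hashlib

def parse_unique_rows(f, seen):
    """Detect the delimiter (comma, tab, semicolon, or pipe) and yield rows
    whose full content has not been seen in any CSV so far."""
    sample = f.read(4096)
    f.seek(0)
    dialect = csv.Sniffer().sniff(sample, delimiters=",\t;|")
    for row in csv.reader(f, dialect):
        key = hashlib.sha256("\x1f".join(row).encode()).hexdigest()
        if key not in seen:
            seen.add(key)
            yield row
```

The shared `seen` set is what makes deduplication work *across* uploads, not just within one file.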
| Feature | Description |
|---|---|
| Add manual notes | With current UTC time |
| Contributor name | Required for text entries |
| No deduplication | Every text entry is kept |
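A text entry stamped with the current UTC time might look like this (field names are illustrative, not the app's schema):

```python
from datetime import datetime, timezone

note = {
    "contributor": "alice",  # contributor name is required for text entries
    "text": "restarted the ingest worker",
    "ts": datetime.now(timezone.utc).isoformat(),  # current UTC time
}
```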
| Feature | Description |
|---|---|
| Merged timeline | All entries sorted chronologically, color-coded by source |
| Select/deselect | Per entry or bulk (Select All, Deselect All, Deselect by source) |
| Fast bulk actions | Select All / Deselect All update instantly (no full reload) |
| PDF export | Exports exactly the checked entries in the Review panel |
| Report queue | Reports queued and generated in background; download when ready |
| Original CSV filename | Shown in PDF alongside comment |
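Merging several time-sorted sources into one chronological timeline is a classic k-way merge; a sketch with `heapq.merge`, using a simplified `(timestamp, source, message)` entry shape:

```python
from datetime import datetime
from heapq import merge

# Each source's entries are already sorted by timestamp.
app_log = [(datetime(2024, 1, 1, 9, 0), "app.csv", "service started"),
           (datetime(2024, 1, 1, 11, 0), "app.csv", "service stopped")]
db_log = [(datetime(2024, 1, 1, 10, 0), "db.csv", "slow query")]

timeline = list(merge(app_log, db_log, key=lambda entry: entry[0]))
```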
| Feature | Description |
|---|---|
| Color scheme | Matches app colors (accent, muted, source colors) |
| White page | Standard print-friendly background |
| Source colors | Each log source has its own color in the report |
| Original filename + comment | Both shown for each log source |
| Feature | Description |
|---|---|
| Chunked CSV parsing | 50k rows per chunk |
| Bulk inserts | 15k rows per batch |
| Connection pooling | 2–20 connections |
| Paginated timeline | 500 entries per page, "Load more" |
| Streaming PDF export | Up to 100k selected entries |
| Configurable workers | WORKER_THREADS in .env (default: 4) |
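Chunked parsing and batched inserts follow the same pattern: read a bounded slice, process it, repeat. A stdlib sketch of the chunking side (the 50k figure mirrors the table; the app's real parser may differ):

```python
import csv
from itertools import islice

def chunked_rows(f, size=50_000):
    """Yield lists of up to `size` parsed rows from an open CSV file,
    so memory use stays bounded regardless of file size."""
    reader = csv.reader(f)
    while True:
        batch = list(islice(reader, size))
        if not batch:
            return
        yield batch
```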
| Type | How |
|---|---|
| One workspace | Go to http://localhost:5001/reset → enter ID + password → confirm 3x → reset. Workspace is deleted. |
| Everything | Run `python reset_all.py` – wipes DB, uploads, reports, restarts app. |
```shell
git remote add origin <your-url>
git push -u origin main
```

| Variable | What it does | Example |
|---|---|---|
| `DATABASE_URL` | Where the database lives (required) | `postgresql://localhost/timelinebot` |
| `WORKER_THREADS` | How many CSVs to process at once | `4` = normal, `2` = slow PC, `8` = fast PC |
| `UPLOADS_DIR` | Folder for uploaded files | `./uploads` (default) |
| `REPORTS_DIR` | Folder for PDF reports | `./reports` (default) |
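Putting the table together, a complete `.env` using the defaults above:

```
DATABASE_URL=postgresql://localhost/timelinebot
WORKER_THREADS=4
UPLOADS_DIR=./uploads
REPORTS_DIR=./reports
```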