UN-3454 [FIX] Add CSV/JSON export to workflow execution logs modal#1949
Conversation
Adds an Export dropdown next to the existing log modal title with "Download as JSON" and "Download as CSV" options. Scope is the same as the existing eye-icon view: one workflow execution, optionally one file execution within it. Backend reuses the existing ExecutionLogFilter so the modal's filters (file_execution_id, log_level) apply identically to the export. The endpoint is buffered with a 50,000-row hard cap that returns 413 with a "narrow your filter" message above the limit. Out of scope (deferred): workflow-wide and deployment-wide log export. Those need date-range UI, async job processing, signed URL delivery, and a retention policy — a separate effort. Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Summary by CodeRabbit
Walkthrough
Adds a backend export endpoint for execution logs (CSV/JSON) with a 50,000-row synchronous cap and a frontend UI/button to download filtered execution logs; oversized exports return HTTP 413.
Changes
Execution Log Export
sequenceDiagram
participant User
participant Frontend
participant Backend
participant Database
User->>Frontend: Click "Export" (JSON/CSV)
Frontend->>Backend: GET /execution/{id}/logs/export/?file_format=...
Backend->>Database: Query execution logs (filters, order_by, .only)
Database-->>Backend: Rows (up to MAX_SYNC_EXPORT_ROWS+1)
alt rows > MAX_SYNC_EXPORT_ROWS
Backend-->>Frontend: HTTP 413 with JSON error
Frontend-->>User: Show alert with error message
else rows ≤ MAX_SYNC_EXPORT_ROWS
Backend-->>Frontend: 200 with attachment (application/json or text/csv)
Frontend->>User: Trigger browser download (filename from content-disposition)
end
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes
🚥 Pre-merge checks: ✅ 4 | ❌ 1
❌ Failed checks (1 warning)
✅ Passed checks (4 passed)
| Filename | Overview |
|---|---|
| backend/workflow_manager/workflow_v2/execution_log_view.py | New export action with cap-sentinel pattern, non-dict data guard, charset-correct CSV Content-Type, and mixed HttpResponse/DRF Response return types (valid in DRF); no logic bugs found. |
| frontend/src/components/logging/log-modal/LogModal.jsx | Adds handleExport, exporting state guard, span-wrapped Tooltip fix for disabled-button hover, and Dropdown export menu; 413 blob-decode error path is correctly implemented. |
| backend/workflow_manager/execution/urls.py | Adds execution-log-export URL pattern mapping GET to the new export action; straightforward and consistent with existing url patterns. |
| backend/workflow_manager/workflow_v2/urls/workflow.py | Mirrors the new export URL in the legacy workflow url file for consistency; no issues. |
| frontend/src/components/logging/log-modal/LogModal.css | Adds minimal styling (margin-left, height: auto) for the export button; no issues. |
Sequence Diagram
sequenceDiagram
participant U as User
participant FE as LogModal React
participant BE as WorkflowExecutionLogViewSet
participant DB as Database
U->>FE: Click Export then Download as CSV or JSON
FE->>FE: setExporting true, build params
FE->>BE: "GET /execution/pk/logs/export/?file_format=csv"
BE->>BE: filter_queryset get_queryset
BE->>DB: SELECT rows LIMIT 50001
DB-->>BE: rows up to 50001
alt rows greater than 50000
BE-->>FE: 413 error Too many logs
FE->>FE: decode blob parse JSON setAlertDetails
else rows within limit
BE->>BE: _build_csv or _build_json
BE-->>FE: 200 response with Content-Disposition attachment
FE->>FE: createObjectURL anchor click revokeObjectURL
FE->>U: File downloaded
end
FE->>FE: setExporting false
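The `LIMIT 50001` step in the diagram above is a cap-sentinel: fetching one row beyond the cap lets a single query both materialize the export rows and detect overflow, with no separate `COUNT` query. A minimal sketch of the pattern, with function and parameter names that are illustrative rather than the PR's actual code:

```python
# Illustrative cap-sentinel sketch; names are assumptions, not the view code.
MAX_SYNC_EXPORT_ROWS = 50_000

def fetch_capped(query_rows):
    """query_rows(limit) returns up to `limit` rows (e.g. a qs[:limit] slice)."""
    rows = query_rows(MAX_SYNC_EXPORT_ROWS + 1)  # one sentinel row past the cap
    if len(rows) > MAX_SYNC_EXPORT_ROWS:
        # Overflow detected: caller responds 413 with a "narrow your filter" hint
        return None, 413
    return rows, 200
```

The same list of rows is then reused to build the CSV/JSON body, which is what removes the second DB scan mentioned in the review below.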
Prompt To Fix All With AI
Fix the following code review issue, proposing a concise fix.
---
### Issue 1 of 1
backend/workflow_manager/workflow_v2/execution_log_view.py:157
The `json.dumps` call uses `ensure_ascii=True` (the Python default), so any non-ASCII characters in log messages — file paths, error text in Japanese/Arabic/Cyrillic, etc. — will be escaped as `\uXXXX` sequences in the exported file. Since the exported file is intended for human debugging (paste into a ticket, post-mortem), reading `\u30a8\u30e9\u30fc` instead of `エラー` in a text editor makes the file significantly harder to use. Passing `ensure_ascii=False` produces valid UTF-8 JSON that is much more readable, and the `Content-Type: application/json` header already declares the encoding correctly.
```suggestion
return json.dumps(entries, ensure_ascii=False)
```
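The difference is easy to demonstrate with the standard library alone; the sample log entry below is made up for illustration:

```python
import json

# Hypothetical log entry containing non-ASCII text (Japanese error message)
entry = {"level": "ERROR", "message": "エラー: ファイルが見つかりません"}

ascii_json = json.dumps(entry)                     # default: ensure_ascii=True
utf8_json = json.dumps(entry, ensure_ascii=False)  # readable UTF-8 output

# The default escapes the Japanese text into \uXXXX sequences, while
# ensure_ascii=False keeps it human-readable; both parse back identically.
```

Since the response already declares the encoding via its Content-Type header, the readable form loses nothing.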
Reviews (2): Last reviewed commit: "UN-3454 [FIX] Use globalThis.URL instead..."
Actionable comments posted: 2
🤖 Prompt for all review comments with AI agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.
Inline comments:
In `@backend/workflow_manager/workflow_v2/execution_log_view.py`:
- Around line 84-89: The Content-Type for CSV responses currently uses
"text/csv" without an explicit charset; update the branch in
execution_log_view.py that builds CSV responses (the block calling
self._build_csv(queryset) in the method that sets export_format) to set
content_type to "text/csv; charset=utf-8" instead of "text/csv" so non-ASCII
characters are interpreted as UTF-8 by browsers; leave the JSON branch using
"application/json" unchanged.
In `@frontend/src/components/logging/log-modal/LogModal.jsx`:
- Around line 241-258: The Tooltip does not show when the Button is disabled
because Chrome blocks pointer events on disabled buttons; wrap the Button in a
non-disabled wrapper element (e.g., a <span>) inside the Tooltip so events reach
Tooltip even when pagination.total is falsy or exporting is true; update the JSX
around Tooltip -> Button (the element with className "export-btn-outlined", icon
DownloadOutlined, loading={exporting}, disabled={!pagination.total ||
exporting}) to place the Button inside a <span> child and keep the Tooltip title
logic unchanged so the "No logs to export" hint appears when pagination.total is
zero.
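The CSV charset comment above amounts to a one-branch content-type decision; a hedged sketch, with a helper name that is hypothetical rather than taken from the PR:

```python
# Hypothetical helper illustrating the content-type fix described above.
def export_content_type(file_format: str) -> str:
    if file_format == "csv":
        # Explicit charset so browsers (and Excel) decode non-ASCII as UTF-8
        return "text/csv; charset=utf-8"
    # JSON is UTF-8 by definition (RFC 8259); no charset parameter needed
    return "application/json"
```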
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 0ad25167-6904-42e5-a77d-bb3c6b8bb961
📒 Files selected for processing (5)
backend/workflow_manager/execution/urls.py
backend/workflow_manager/workflow_v2/execution_log_view.py
backend/workflow_manager/workflow_v2/urls/workflow.py
frontend/src/components/logging/log-modal/LogModal.css
frontend/src/components/logging/log-modal/LogModal.jsx
muhammad-ali-e
left a comment
Automated multi-agent review (Code Reviewer, Comment Analyzer, PR Test Analyzer, Silent Failure Hunter, Code Simplifier, Type Design Analyzer). Findings below in priority order. Greptile's prior comments on LogModal.jsx:258 (Tooltip-on-disabled-Button) and execution_log_view.py:89 (CSV charset=utf-8) already cover those issues — not re-raised.
Bot findings:
- Add `; charset=utf-8` to the text/csv content type so non-ASCII log content (file names, non-Latin error text) renders correctly in Excel.
- Wrap the disabled Export button in a span so the antd Tooltip still fires on mouseenter (disabled buttons swallow pointer events).

Self-review findings:
- Replace `count()` + iterate with a single `list(qs[:CAP+1])` materialization to drop the second DB scan.
- Extract `_normalize()` so CSV and JSON share one source of truth for null/dict-guard handling, with a logger.warning when `data` is non-dict so silent blank rows leave a diagnostic trail.
- Always surface a 413-specific user message, even when the JSON blob body fails to decode — preserves the narrow-your-filter UX instead of falling through to a generic alert.
- Skip the `log_level` query param when no level is selected; axios serializes JS null as the literal string `"null"`, which falls through the backend filter's INFO-min default and silently drops DEBUG rows.
- Lift the nested ternary tooltip text into a named local for readability.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
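The `log_level` finding above comes down to building the query params conditionally. A sketch in Python for clarity (the real code is the React modal, and this helper name is hypothetical):

```python
# Hypothetical sketch of the log_level fix: omit the param entirely when no
# level is selected, instead of serializing null into the literal "null".
def build_export_params(file_format, file_execution_id=None, log_level=None):
    params = {"file_format": file_format}
    if file_execution_id:
        params["file_execution_id"] = file_execution_id
    if log_level:  # skip when unset so the backend doesn't drop DEBUG rows
        params["log_level"] = log_level
    return params
```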
Review pass — summary
Pushed / fixed
Declined / deferred (with rationale on each thread)
Pre-commit clean on the changed files.
🧹 Nitpick comments (1)
frontend/src/components/logging/log-modal/LogModal.jsx (1)
116-116: ⚡ Quick win
`response.data` is already a `Blob` — avoid the redundant wrapper. When `responseType: "blob"` is set, axios delivers `response.data` as a native `Blob`. Passing it into `new Blob([…])` copies the entire content into a second `Blob` before creating the object URL, doubling peak memory for large exports.
🛠️ Proposed fix
- const blobUrl = window.URL.createObjectURL(new Blob([response.data]));
+ const blobUrl = window.URL.createObjectURL(response.data);
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@frontend/src/components/logging/log-modal/LogModal.jsx` at line 116, The code is creating a new Blob from response.data (const blobUrl = window.URL.createObjectURL(new Blob([response.data]));) even though axios was requested with responseType: "blob" so response.data is already a Blob; remove the redundant wrapping and call window.URL.createObjectURL(response.data) instead (or, if you want to be defensive, check instanceof Blob and use response.data directly when it is a Blob) to avoid copying and doubling memory in the LogModal blob URL creation.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: d5a4a5b6-8fd3-4d7c-9cf2-b18cc89b50ef
📒 Files selected for processing (2)
backend/workflow_manager/workflow_v2/execution_log_view.py
frontend/src/components/logging/log-modal/LogModal.jsx
🚧 Files skipped from review as they are similar to previous changes (1)
- backend/workflow_manager/workflow_v2/execution_log_view.py
SonarCloud rule javascript:S7764 — `globalThis` works in workers and non-browser JS environments where `window` is undefined. Matches the existing convention in SideNavBar.jsx. Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
The PR description image confirms the Export dropdown UI is in place with both "Download as JSON" and "Download as CSV" options wired correctly. A few notes on the declined items — the rationale for each holds up:
Triggering a fresh review pass on the updated commit.
🧹 Nitpick comments (1)
frontend/src/components/logging/log-modal/LogModal.jsx (1)
116-116: 💤 Low value
`new Blob([response.data])` is redundant when `responseType: "blob"` is set. When axios is configured with `responseType: "blob"`, `response.data` is already a `Blob` instance. Wrapping it in `new Blob([...])` creates an unnecessary copy.
♻️ Proposed simplification
- const blobUrl = globalThis.URL.createObjectURL(new Blob([response.data]));
+ const blobUrl = globalThis.URL.createObjectURL(response.data);
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@frontend/src/components/logging/log-modal/LogModal.jsx` at line 116, The call creates an unnecessary Blob copy; in LogModal.jsx where you build the download URL (the line using globalThis.URL.createObjectURL(new Blob([response.data]))), use the response.data Blob directly: pass response.data to URL.createObjectURL instead of wrapping it in new Blob([...]); also verify the axios request that produced response has responseType: "blob" so response.data is already a Blob.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 74694528-13b9-458a-ae10-dd32120c7ceb
📒 Files selected for processing (1)
frontend/src/components/logging/log-modal/LogModal.jsx
@greptileai start review
Re-review request — the two issues flagged in the original 3/5 review (CSV charset and disabled-button tooltip span wrapper) have been fixed in 1cfe52b. SonarCloud findings (window → globalThis) fixed in 335d5a2.
Frontend Lint Report (Biome): ✅ All checks passed! No linting or formatting issues found.
Test Results — Summary
Runner Tests - Full Report
SDK1 Tests - Full Report
What
- `export` action on `WorkflowExecutionLogViewSet` at `GET /execution/<pk>/logs/export/?file_format=csv|json` (also accepts `file_execution_id` and `log_level` — same filters as the list endpoint).
- Export dropdown in the `LogModal` header. Disabled when there are no logs to export, and while an export is in flight (prevents double-submit).

Why
How
- Reuses `ExecutionLogFilter` so the active filters in the modal (file scope, log level) apply identically to the export. No new buttons sprinkled around the table.
- Buffered `HttpResponse`, not streaming. Real-world export size observed ~17 KB; streaming would lose `Content-Length` (browser progress %) and risk corrupted partial files on mid-stream errors. Industry standard for capped sync exports (Datadog/Sentry/CloudWatch/Kibana/Grafana) is buffered.
- `OrgAwareManager` and the existing `OrganizationFilterBackend` enforce org scoping for the new action exactly like they do for `list`.
- Query param is `file_format`, not `format`. DRF reserves `?format=` for content negotiation and 404s when the value doesn't match a registered renderer. There's a comment on the backend variable to prevent re-introduction.

Out of scope (deferred)
These need date-range UI, async Celery job, signed URL delivery, retention policy, and rate-limiting. Will be tracked separately if/when users explicitly ask. The 50k row cap exists precisely because async deployment-wide export is a different feature with different ergonomics.
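A buffered export of the kind described under "How" can be sketched with the standard library alone; the column names below are illustrative assumptions, not the PR's actual fields:

```python
import csv
import io

# Minimal in-memory (buffered, non-streaming) CSV build as described above.
# Field names are assumptions for illustration.
def build_csv(entries):
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["event_time", "log_level", "data"])
    writer.writeheader()
    for entry in entries:
        writer.writerow(entry)
    return buf.getvalue()  # whole body in memory, so Content-Length is known
```

Because the full body is materialized before the response is sent, the browser gets an accurate `Content-Length` and a failed query never produces a truncated file.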
Can this PR break any existing features?
No regressions expected. Reasoning:
- The `list` endpoint (`GET /execution/<pk>/logs/`) is unchanged. `get_queryset()` was not modified; the `.order_by("event_time").only(...)` and `.count()` calls are scoped inside `export()` only.
- `/execution/<pk>/logs/export/` is added in both the active route file (`backend/workflow_manager/execution/urls.py`) and the legacy one (`backend/workflow_manager/workflow_v2/urls/workflow.py`) so behavior is consistent regardless of which prefix routes the request.
- Frontend changes are additive in `LogModal.jsx`. The recent UN-3431 `CustomMarkdown` integration in the same file is preserved (verified after rebase).
Env Config
Notes on Testing
Smoke test plan:
- Lower `MAX_SYNC_EXPORT_ROWS` locally → confirm the alert renders the "narrow your filter" message instead of a generic error.
- Guard against a non-dict `data` field in the CSV path (defensive against historical/manually-inserted log records that aren't dicts).
- `exporting` state guard to prevent concurrent submits.
Related Issues or PRs
- Touches `LogModal.jsx`, which was last modified by UN-3431 (UN-3431 [FIX] Stream tool-run logs to workflow execution UI with markdown rendering #1927). No conflict — UN-3431's `CustomMarkdown` integration is preserved.
To be added on the PR after manual verification — shows the new "Export" button in the modal header.
Checklist
- PR title follows the convention: UN-3454 [FIX] ...
- No existing features broken (`list` endpoint untouched)