
Harden MCP server and fix HTTP integration tests#195

Closed
RafaelPo wants to merge 43 commits into main from fix/pr-review-hardening

Conversation

@RafaelPo
Contributor

Summary

  • Harden MCP server for public deployment (input validation, rate limiting fixes, auth hardening)
  • Fix auth rate limiter crash when client IP is None
  • Fix HTTP results tool re-registration to properly replace schema
  • Test real Redis results flow in HTTP integration tests — remove mocked try_cached_result/try_store_result and exercise the full pipeline: submit → poll → results → CSV download with pagination verification

Test plan

  • Unit tests pass (215 passed, 15 skipped)
  • HTTP integration tests pass with real Redis + API key (3/3 passed)
  • Screen pipeline: submit → poll → results → CSV download verified
  • Agent pipeline: pagination (page_size=1), cache hit on page 2, full CSV download with column verification
  • Dual polling mode (REST + MCP tool) verified

🤖 Generated with Claude Code

RafaelPo and others added 30 commits February 21, 2026 20:15
- http_config.py: HTTP mode setup (auth + no-auth) with OAuth, CSP, middleware
- tool_helpers.py: transport-aware helpers (_with_ui, _submission_ui_json,
  _get_client with per-request JWT, _fetch_task_result)
- tool_descriptions.py: transport-specific tool descriptions patched at startup
- routes.py: REST endpoints for progress polling and CSV download
- result_store.py: Redis-backed result caching with pagination
- middleware.py: rate limiting middleware
- tools.py: all tools use _with_ui for widget JSON, token storage in Redis
- models.py: add input_data/input_json alternatives to input_csv
- app.py: separate lifespans for stdio/http/no-auth-http
- server.py: CLI args for --http, --no-auth, --host, --port
- deploy/: Dockerfile, docker-compose, env example

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
… http_config

- Remove `settings` field from ServerState; add `everyrow_api_url` and
  `preview_size` as direct fields with defaults
- Add lazy `settings` property that returns the right settings via
  `_get_http_settings()` / `_get_dev_http_settings()` / `_get_stdio_settings()`
- Extract `_BaseHttpSettings` base class for shared Redis and HTTP fields
- Remove eager module-level settings singletons from config.py
- Inline Redis creation in `configure_http_mode`, extract `_patch_tool_csp`
- Update auth.py to use `_get_http_settings()` instead of `http_settings`

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
…ering

- Fix EveryRowAuthProvider constructor to pass (redis, token_verifier)
  instead of raw settings fields
- Use Transport.HTTP/STDIO enum values instead of string literals
- Remove duplicate results.html/session.html resource registrations
  from app.py (only registered in http_config.py for HTTP mode)
- Swap middleware order so logging wraps rate limiting (sees all requests)
- Merge identical progress descriptions in tool_descriptions.py
- Add override notes above everyrow_progress and everyrow_results
- Restore manifest.json everyrow_results description for stdio accuracy

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Each tool return site now builds the base TextContent list and
conditionally prepends the widget JSON only in HTTP mode. This
avoids eagerly computing _submission_ui_json (which stores tokens
in Redis) in stdio mode where the result was discarded.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
All 6 submission tools now call create_tool_response() instead of
repeating the stdio/http branching inline. The helper lives in
tool_helpers.py alongside _submission_text and _submission_ui_json.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
- Remove dead fields: auth_provider (write-only), dev_mode (only read
  by dead settings property), settings property (never called)
- Remove dead _build_csv_url from result_store.py
- Add validate_assignment=True to catch invalid transport values
- Fix is_http to use == Transport.HTTP (closed-form check)
- Fix tests to use Transport.HTTP instead of "streamable-http" string
- Use spec=AuthenticatedClient on mock clients for Pydantic compat
- Add docstring note about singleton client in no-auth HTTP mode (C1)
- Restore field descriptions shortened in models.py (equivalence_relation,
  merge fields) and fix misleading output_path description for stdio

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
The everyrow_client fixture was setting the old app._client attribute
which no longer exists after the refactor. Use state.client instead.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Export assert_stdio_clean from test_stdio_content (rename private helpers
to public) and apply it at every submit/progress/results step in
test_integration.py. Also assert len(result)==1 and normalize to
result[0].text, fixing a stale comment about widget JSON.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Replace the global state.client singleton with a ClientFactory yielded
from lifespans and accessed via ctx.request_context.lifespan_context.
Tools now receive the client through FastMCP's Context parameter
injection instead of reading mutable global state, eliminating 22
patch.object(state, "client", ...) patterns in tests.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Replace bare Context with Context[ServerSession, SessionContext] alias,
giving type checkers visibility into lifespan_context fields and making
the context contract explicit via a dataclass.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Replace make_singleton_client_factory with lambda: client and inline
make_http_auth_client_factory into _http_lifespan. Removes indirection
that added no value over the inline forms.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
The TOKEN_BUDGET setting (default 20k) was defined but unused. This wires
it in: after slicing a result page, clamp_page_to_budget binary-searches
for the largest subset that fits within the budget (~4 chars/token
heuristic). Applied in all three result paths (Redis cache hit, Redis
store, inline fallback). The progress completed message now recommends
page_size=preview_size and notes the auto-adjustment.
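A minimal sketch of the clamping described above, assuming the ~4 chars/token heuristic from the commit message (function and constant names here are illustrative, not copied from result_store.py):

```python
import json

CHARS_PER_TOKEN = 4  # rough heuristic from the commit message

def estimate_tokens(records: list[dict]) -> int:
    """Estimate the token cost of a page by its serialized length."""
    return len(json.dumps(records)) // CHARS_PER_TOKEN

def clamp_page_to_budget(records: list[dict], budget: int) -> list[dict]:
    """Binary-search the largest prefix of `records` that fits the budget."""
    if estimate_tokens(records) <= budget:
        return records
    # Invariant: records[:lo] fits the budget, records[:hi+1] does not.
    lo, hi = 0, len(records)
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if estimate_tokens(records[:mid]) <= budget:
            lo = mid
        else:
            hi = mid - 1
    return records[:lo]
```

Each probe re-serializes a slice, which is what the later review comment flags as an O(n log n) cost.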

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
…actories

ServerState no longer stores everyrow_api_url, preview_size, or
token_budget as flat fields.  A new `settings` property dispatches to
the correct @lru_cache-d factory (_get_stdio_settings,
_get_dev_http_settings, or _get_http_settings) based on transport mode.

Extracted _CommonSettings base class in config.py so preview_size,
token_budget, and everyrow_api_url are defined once and inherited by
both StdioSettings and _BaseHttpSettings.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
- Convert TaskState from stdlib dataclass to Pydantic BaseModel with
  PrivateAttr, computed_field properties, and model_dump serialisation
- Add write_initial_task_state for submission tools, consolidate file
  writing into _write_task_state_file
- Move progress message building into TaskState.progress_message
- Replace ui_dict with model_dump(exclude=_UI_EXCLUDE) at call sites
- Remove dead inline-results fallback from everyrow_results (Redis is
  always available in HTTP mode)
- Extract _register_widgets and _register_routes in http_config
- Clean up dead re-exports in server.py
- Use dedent for multiline tool response strings

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
… harden edges

- Move Redis client creation and state.mcp_server_url assignment to server.py
  (before configure_http_mode) so state is fully initialised before consumers read it
- Pass redis_client explicitly to configure_http_mode, fixing RedisStore/Redis
  type mismatch in EveryRowAuthProvider and RateLimitMiddleware
- Add missing lifespan assignment in auth branch of configure_http_mode
- Add ctx argument to all tool calls in test_http_real.py (was missing after
  the lifespan context refactor)
- Add everyrow_single_agent to expected tools in test_http_transport.py
- Add meta={"ui": {"resourceUri": ...}} to everyrow_progress tool decorator
- Guard state.store.ping() with None check in both HTTP lifespans
- Reject FAILED/REVOKED tasks in _fetch_task_result before fetching results
- Use get_tool() instead of _tools.get() in _patch_tool_csp
- Use `is not None` in load_csv source counting
- Extract override_state() context manager in conftest.py, deduplicate 4 test fixtures
- Inline _cors_preflight, add return type annotations to route handlers
- Remove dead _validate_redis model validator (lifespan ping catches misconfigured Redis)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
- Remove `assert progress["total"] > 0` from screen integration test
  (screen tasks don't report row-level totals)
- Rename _get_http_settings → get_http_settings, _get_dev_http_settings →
  get_dev_http_settings, _get_stdio_settings → get_stdio_settings (drop
  leading underscore — these are used across modules)
- Extract local `transport` variable in server.py main()

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
… external URLs

- Fix state.transport never being set (local var instead of state assignment),
  which caused HTTP mode to run as stdio and exit immediately
- Remove progress bar MCP App widget (PROGRESS_HTML, ui://everyrow/progress.html
  resource, widget JSON from everyrow_progress responses)
- Replace "Copy link" button with "Download CSV" button using app.openLink()
- Use app.openLink() for all session/download links in results and session widgets
  so they trigger the host's "Open external link" confirmation dialog
- Fix research popover hover: mouseenter → mouseover (mouseenter doesn't bubble)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
…collapse

- Add linkify() helper to auto-link URLs in cell text and research popovers
- Move Download CSV link next to session link as styled anchor
- Fix expand/collapse using innerHTML with linkify instead of textContent

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
…s class

Eliminate the ServerState singleton and RedisStore class in favor of:
- A single merged Settings class with transport, is_http, is_stdio properties
- Standalone async functions in redis_store.py using get_redis_client()
- settings.transport set at startup from CLI args

Updates all source modules and 10 test files to use the new architecture.
96 tests pass.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
…de_settings

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
request.client.host returns the proxy IP (e.g. Cloudflare tunnel),
causing all users to share a single rate limit bucket. Now checks
CF-Connecting-IP and X-Forwarded-For before falling back.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
…_size in hints

- try_cached_result returns None (triggering API re-fetch) when CSV is
  expired or unreadable, instead of returning an empty preview
- Next-page hints use the user's original page_size, not the clamped
  effective size, so the server re-clamps independently per page
- Bump default preview_size from 50 to 100 (token budget auto-clamps)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Hide fields that don't apply in each mode:
- HTTP: remove input_csv/left_csv/right_csv (server can't read client filesystem)
- HTTP: remove output_path from everyrow_results
- Stdio: remove offset/page_size from everyrow_results, make output_path required

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
…ata field

Collapse 3 mutually exclusive input fields (input_csv, input_data, input_json)
down to 2 (input_csv, data) where data accepts str|list[dict] with auto-detection.
MergeInput goes from 6 input fields to 4 (left_data/right_data replace
left_input_data/left_input_json/right_input_data/right_input_json).

SingleAgentInput.input_data (dict context, not tabular) is unchanged.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
…riptions.py

Replace the single ResultsInput model with StdioResultsInput (output_path
required) and HttpResultsInput (output_path optional, pagination fields).
Each mode gets its own function — everyrow_results_stdio and
everyrow_results_http — eliminating runtime schema patching and description
overrides. The stdio variant is registered by default; server.py re-registers
the HTTP variant when --http is used.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Remove ARG002 from ruff ignore list. Fix violations: add noqa for
auth.py protocol override, remove redundant everyrow_client fixture
params from test_integration.py (already pulled in via real_ctx).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
When the poll token TTL (24h) elapses, get_poll_token() returns None
and _get_csv_url() silently interpolated it into the URL, producing a
broken download link that 403s. Now _get_csv_url() returns None on
expiry, and both callers fall back to a fresh API fetch.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
RafaelPo and others added 10 commits February 24, 2026 09:32
Resolve conflicts to integrate forecast and cancel tools from main with
the split results functions on this branch.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Use Redis pipeline with EXPIRE NX to avoid race where TTL is reset on
every request, and increase default preview_size to 1000 rows.
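The EXPIRE NX pattern can be sketched as follows, assuming redis-py >= 4.2 and Redis >= 7.0 (where EXPIRE accepts the NX flag); the key layout is illustrative:

```python
async def cache_result_csv(redis, task_id: str, csv_text: str, ttl: int) -> None:
    """Cache a result CSV without resetting its TTL on repeat requests."""
    key = f"result:{task_id}:csv"
    async with redis.pipeline(transaction=True) as pipe:
        pipe.set(key, csv_text)
        # NX: only set the TTL if the key has none yet, so repeated requests
        # don't keep pushing the expiry forward and pin the entry alive.
        pipe.expire(key, ttl, nx=True)
        await pipe.execute()
```

Without NX, every cache write restarts the TTL clock, so a frequently polled task's CSV could live in Redis indefinitely.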

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
ForecastInput now inherits from _SingleSourceInput, gaining the data
field so forecast works in HTTP mode where file paths are unavailable.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Instead of returning a misleading error when try_store_result fails,
return the results inline as a preview with a summary message.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
ToolManager.add_tool() is a no-op for existing names, so the HTTP
override of everyrow_results was silently ignored, leaving the stdio
schema (with required output_path) exposed in HTTP mode.
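The no-op can be reproduced with any first-write-wins registry; this stand-in mimics the behavior described above (it is not FastMCP's actual ToolManager):

```python
class ToolRegistry:
    """Stand-in for a first-write-wins tool manager: add_tool() keeps the
    existing entry when the name is already registered."""

    def __init__(self) -> None:
        self._tools: dict[str, object] = {}

    def add_tool(self, name: str, fn: object) -> None:
        self._tools.setdefault(name, fn)  # silently a no-op if name exists

    def remove_tool(self, name: str) -> None:
        self._tools.pop(name, None)

    def get_tool(self, name: str):
        return self._tools.get(name)

registry = ToolRegistry()
registry.add_tool("everyrow_results", "stdio_variant")
registry.add_tool("everyrow_results", "http_variant")  # ignored: the bug
assert registry.get_tool("everyrow_results") == "stdio_variant"

# The fix: explicitly remove before re-registering the HTTP schema.
registry.remove_tool("everyrow_results")
registry.add_tool("everyrow_results", "http_variant")
assert registry.get_tool("everyrow_results") == "http_variant"
```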

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Verifies that remove_tool + re-register replaces the stdio schema
with HttpResultsInput, preventing regression of the silent no-op bug.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
- get_client_ip returns None instead of "unknown" when IP cannot be
  determined; rate limiter skips these requests instead of grouping
  them into a shared bucket.
- HttpResultsInput no longer validates that output_path's parent
  directory exists on the server, since in HTTP mode the path comes
  from a remote client's filesystem.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
get_client_ip now returns None instead of "unknown", but auth's
_client_ip passed it directly to build_key which calls .replace().
Fall back to "unknown" in auth — we want to rate limit unidentified
OAuth clients, unlike the middleware where we skip them.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Security and reliability fixes identified during PR review of the
HTTP transport branch, covering auth, rate limiting, input validation,
container security, CORS, XSS, and error handling.

Auth:
- Fix auth code/refresh token race conditions (get+verify+delete)
- Re-store refresh token on Supabase failure to prevent token loss
- Pin JWT algorithms to RS256 only
- Block empty redirect_uris bypass
- Fix auth rate limit pipeline (nx=True, async context manager)

Rate limiting:
- Only trust proxy headers when TRUST_PROXY_HEADERS=true
- Route unknown IPs to shared bucket instead of bypassing
- Narrow exception catch to (RedisError, OSError)
- Exempt /health from rate limiting

Input validation:
- Add inline data size limits (10MB / 50k rows)
- Cap response_schema properties at 50
- Validate task_id as UUID in all inputs
- Sanitize redis_store.build_key with strict regex
- Cap CSV cache at 50MB
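The task_id check above can be sketched as a plain validator; the real models presumably wire this into Pydantic, but the core idea is just stdlib UUID parsing:

```python
import uuid

def validate_task_id(value: str) -> str:
    """Reject anything that is not a UUID before it reaches Redis key names."""
    try:
        parsed = uuid.UUID(value)
    except (ValueError, TypeError, AttributeError) as exc:
        raise ValueError(f"invalid task_id: {value!r}") from exc
    return str(parsed)  # normalized lowercase canonical form
```

Validating at the model boundary means downstream code (key building, route handlers) can treat task_id as a safe token rather than arbitrary user input.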

Container security:
- Run as non-root user in Dockerfile
- Remove public Redis port, add requirepass
- Add resource limits and restart policies
- Add .dockerignore to prevent secret leakage

Error handling:
- Strip internal details from user-facing error messages
- Log full exceptions server-side
- Don't clear task state on cancel failure
- Skip file writes and task state clearing in HTTP mode
- Validate required env vars at HTTP startup

CORS / XSS:
- Replace wildcard CORS with origin-locked headers
- Escape dynamic values in HTML widget templates
- Validate task_id in route handlers
- Sanitize Content-Disposition filename
- Add Referrer-Policy header to downloads
- Health endpoint now checks Redis connectivity

Tests updated: 215 passed, 18 skipped, 0 failures.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Remove mocked try_cached_result/try_store_result from integration tests
and exercise the full pipeline: submit → poll → results → CSV download.
Adds download route to ASGI app, pagination assertions, and CSV content
verification.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@RafaelPo
Contributor Author

@claude code review

@github-actions
Contributor

github-actions bot commented Feb 24, 2026

Claude finished @RafaelPo's task in 2m 5s


Code Review: Harden MCP server and fix HTTP integration tests

  • Read repo guidelines
  • Analyze PR diff and changed files (41 files, +6418/-1112)
  • Review security hardening changes
  • Review auth and rate limiting changes
  • Review new modules (middleware, routes, redis_store, result_store, tool_helpers)
  • Review test coverage
  • Post review summary

Overall this is a well-structured PR that significantly improves the codebase: the monolithic state.py and redis_utils.py are decomposed into focused modules (redis_store.py, result_store.py, routes.py, middleware.py, tool_helpers.py), input validation is much more thorough, and there's a solid deployment story with Docker. The security hardening (rate limiting, key sanitization, token validation, proxy header trust gating) is sound.

Below are the issues I found, ordered by severity.


Issues

1. _UNSAFE_decode_server_jwt skips signature verification
auth.py:186-196 — The _UNSAFE_decode_server_jwt method decodes JWTs with verify_signature=False. The comment says it's safe because the token comes from Supabase's token endpoint over HTTPS, which is a reasonable trust model for a server-to-server flow. However, the method is @staticmethod and reachable from any code that has a reference to the auth provider. Consider making it a module-level private function (harder to discover) or adding runtime assertions (e.g., that it's only called from _issue_token_response).


2. CORS origin defaults to "*" when mcp_server_url is empty
routes.py:23-24 — _cors_headers() falls back to "*" when settings.mcp_server_url is empty (which happens in no-auth dev mode). This means the download and progress endpoints would accept cross-origin requests from any domain in that mode. While no-auth mode is dev-only and gated by ALLOW_NO_AUTH, consider explicitly setting the origin to the local URL (e.g., http://localhost:{port}) instead of "*".


3. Singleton Redis client via @lru_cache is not test-safe
redis_store.py:97-106 — get_redis_client() uses @lru_cache to create a module-level singleton. Tests patch it per-test in conftest, but if any import path bypasses the patch (e.g., calling the function before the patch is applied), it silently uses the production-configured client. Consider using a module-level variable with an explicit set_redis_client() for testing, or clear the cache in the fixture.

4. Missing Access-Control-Allow-Headers on CORS preflight
routes.py:52-56 — The OPTIONS handler returns 204 with Allow-Origin and Allow-Methods, but does not include Access-Control-Allow-Headers. Browsers that send preflight with Access-Control-Request-Headers: Authorization would be rejected. This affects the download endpoint which requires a ?token= param (so it works), but if any future change moves the token to a header, CORS will break.

5. load_authorization_code — GET-then-DELETE is not atomic
auth.py:408-426 — The pattern of GET followed by client_id check followed by DELETE is subject to a race: two concurrent requests could both GET the code successfully before either DELETEs it. Under load, this means a code could be used twice. Consider using a Lua script or Redis transaction with WATCH to make this atomic, or accept the risk given the short TTL and one-time nature of auth codes.
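Besides a Lua script or WATCH transaction, GETDEL (Redis >= 6.2) closes this race in a single command, which is the route a later commit on this branch takes. A simplified sketch, with the key layout and payload shape assumed (the real fix also re-stores the code on client mismatch, which is omitted here):

```python
async def load_authorization_code(redis, code: str, client_id: str):
    """Atomically consume a one-time auth code.

    GETDEL fetches and deletes in one command, so two concurrent requests
    can never both read the code before either deletes it.
    """
    raw = await redis.getdel(f"authcode:{code}")
    if raw is None:
        return None  # unknown, expired, or already consumed
    stored_client_id, _, payload = raw.partition(":")
    if stored_client_id != client_id:
        return None  # code was issued to a different client
    return payload
```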

6. clamp_page_to_budget binary search uses O(n log n) JSON serialization
result_store.py:41-61 — Each iteration of the binary search calls json.dumps() on a slice of records to estimate tokens. For large pages (up to 10k rows), this means multiple large JSON serializations. Consider caching the per-row sizes or using a running total instead.
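The suggested fix, serializing each row once and bisecting over a prefix sum, could look like this (a hypothetical O(n) replacement, not the code in result_store.py):

```python
import json
from bisect import bisect_right
from itertools import accumulate

def clamp_page_to_budget(records: list[dict], budget: int,
                         chars_per_token: int = 4) -> list[dict]:
    """Clamp a page to a token budget without repeated full serializations:
    serialize each row once, build a prefix sum, bisect for the cutoff."""
    # +1 per row roughly accounts for list separators in the final payload.
    row_tokens = [len(json.dumps(r)) // chars_per_token + 1 for r in records]
    prefix = list(accumulate(row_tokens))   # prefix[i] = tokens of rows 0..i
    cutoff = bisect_right(prefix, budget)   # largest prefix within budget
    return records[:cutoff]
```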



Minor / Nits

7. _validate_task_id raises bare ValueError from except ValueError
models.py:379-385 — The re-raise without from loses the original traceback. Use raise ValueError(...) from exc.

8. Docker STOPSIGNAL SIGTERM is the default
deploy/Dockerfile:30 — STOPSIGNAL SIGTERM is the Docker default. Not harmful but redundant.

9. MCP_SERVER_URL required by compose but missing from .env.example
deploy/.env.example — The compose file requires MCP_SERVER_URL but the example env file doesn't include it. Users will get a confusing error on first run.


10. esc() function in session widget HTML is undefined
templates.py (SESSION_HTML, line ~737) — The session widget uses esc(cls) and esc(d.status) but never defines the esc() helper (unlike the results widget which defines it). This will throw a runtime error when the session status renders.



Looks Good

  • Redis key sanitization (redis_store.py:37-43): _KEY_UNSAFE regex + build_key() properly prevents key injection.
  • Rate limiting fail-open (middleware.py:85-86): Redis outage doesn't block legitimate traffic.
  • Proxy header trust gating (middleware.py:20-34): trust_proxy_headers must be explicitly enabled.
  • Token length limits (auth.py:413,476): Authorization code and refresh token lengths are bounded at 256 chars before Redis lookup, preventing abuse.
  • No-auth deployment guard (server.py:58-64): Requires explicit ALLOW_NO_AUTH=1 env var.
  • Input validation (models.py): Pydantic models with extra="forbid", min_length, ge/le, and custom validators for schemas and paths.
  • Poll token auth on REST endpoints (routes.py:41-49): Uses secrets.compare_digest for timing-safe comparison.
  • Test infrastructure (conftest.py): Real Redis instance per test with proper flush isolation.
  • Decomposition: Breaking state.py into tool_helpers.py + redis_store.py + result_store.py + routes.py is a clear improvement in modularity.
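For reference, the key-sanitization pattern praised above can be sketched as follows (the regex and join convention are assumed from the description, not copied from redis_store.py):

```python
import re

# Anything outside a conservative allow-list is replaced, so user-supplied
# parts can't smuggle ":" separators, wildcards, or whitespace into key names.
_KEY_UNSAFE = re.compile(r"[^a-zA-Z0-9_-]")

def build_key(*parts: str) -> str:
    """Join key parts with ':' after stripping unsafe characters from each."""
    return ":".join(_KEY_UNSAFE.sub("_", p) for p in parts)
```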


RafaelPo and others added 3 commits February 24, 2026 11:15
# Conflicts:
#	everyrow-mcp/deploy/.env.example
#	everyrow-mcp/deploy/Dockerfile
#	everyrow-mcp/deploy/docker-compose.yaml
#	everyrow-mcp/src/everyrow_mcp/auth.py
#	everyrow-mcp/src/everyrow_mcp/config.py
#	everyrow-mcp/src/everyrow_mcp/http_config.py
#	everyrow-mcp/src/everyrow_mcp/middleware.py
#	everyrow-mcp/src/everyrow_mcp/models.py
#	everyrow-mcp/src/everyrow_mcp/redis_store.py
#	everyrow-mcp/src/everyrow_mcp/result_store.py
#	everyrow-mcp/src/everyrow_mcp/routes.py
#	everyrow-mcp/src/everyrow_mcp/templates.py
#	everyrow-mcp/src/everyrow_mcp/tools.py
#	everyrow-mcp/tests/conftest.py
#	everyrow-mcp/tests/test_http_integration.py
#	everyrow-mcp/tests/test_http_real.py
#	everyrow-mcp/tests/test_middleware.py
#	everyrow-mcp/tests/test_routes.py
#	everyrow-mcp/tests/test_server.py
- Add defense-in-depth expiration check on auth codes after deserialization
- Derive revocation TTL from token remaining lifetime + 60s buffer
- Remove exc_info=True from JWT failure debug log to avoid leaking internals
- Tighten SameSite cookie policy from lax to strict on auth state cookies
- Validate Supabase token response via Pydantic model_validate with clear error
- Add tests for expired auth codes, revocation TTL, and malformed Supabase responses

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
- CORS: replace wildcard "*" fallback with localhost, add Allow-Headers
- Redis: replace @lru_cache singleton with explicit getter/setter for test safety
- Auth: make load_authorization_code atomic via GETDEL, re-store on client mismatch
- Results: optimize clamp_page_to_budget with prefix sums instead of repeated json.dumps
- Models: chain ValueError properly in _validate_task_id (raise from exc)
- Deploy: add MCP_SERVER_URL to .env.example
- Templates: add missing esc() XSS helper to SESSION_HTML widget

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Comment on lines +154 to +160
if len(csv_text) > MAX_CSV_CACHE_BYTES:
    logger.warning(
        "Skipping Redis cache for task %s: CSV is %d bytes (limit %d)",
        task_id,
        len(csv_text),
        MAX_CSV_CACHE_BYTES,
    )


Bug: For CSV results over 50MB, store_result_csv fails silently, leading to a response with a download link that results in a 404 error upon access.
Severity: MEDIUM

Suggested Fix

Modify store_result_csv to either raise an exception or return a status indicator (e.g., a boolean) when it skips storing a large CSV. The calling function, try_store_result, should then check for this failure and return None to trigger the fallback mechanism, which correctly serves inline results without a broken download link.
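A sketch of that suggested fix, with a plain dict standing in for Redis (the function names mirror those in the comment, but the bodies are hypothetical):

```python
MAX_CSV_CACHE_BYTES = 50 * 1024 * 1024  # 50MB cap from the PR

def store_result_csv(cache: dict, task_id: str, csv_text: str) -> bool:
    """Store the CSV and report success, instead of skipping silently."""
    if len(csv_text) > MAX_CSV_CACHE_BYTES:
        return False  # too large to cache
    cache[f"result:{task_id}:csv"] = csv_text
    return True

def try_store_result(cache: dict, task_id: str, csv_text: str):
    """Return a download key only when the CSV was actually cached."""
    if not store_result_csv(cache, task_id, csv_text):
        return None  # triggers the inline-results fallback, no broken link
    return f"/download/{task_id}"
```

Returning a status from the store call lets the caller choose the inline-preview path rather than advertising a download URL that would 404.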

Prompt for AI Agent
Review the code at the location below. A potential bug has been identified by an AI
agent.
Verify if this is a real issue. If it is, propose a fix; if not, explain why it's not
valid.

Location: everyrow-mcp/src/everyrow_mcp/redis_store.py#L154-L160

Potential issue: When a task's CSV result exceeds 50MB, the `store_result_csv` function
silently skips storing the file in Redis without raising an exception. The calling
function, `try_store_result`, proceeds as if storage was successful, generating a
response that includes a download URL for the CSV. However, when the user attempts to
use this URL, the download endpoint cannot find the CSV in Redis and returns a 404
error. The user is presented with a broken download link while the inline preview works
correctly.
