Skip to content

fix: add size limit to request.get_data (Batch #75)#4160

Open
BossChaos wants to merge 2 commits intoScottcjn:mainfrom
BossChaos:sec-batch75
Open

fix: add size limit to request.get_data (Batch #75)#4160
BossChaos wants to merge 2 commits intoScottcjn:mainfrom
BossChaos:sec-batch75

Conversation

@BossChaos
Copy link
Copy Markdown
Contributor

fix: add size limit to request.get_data (Batch #75)

  • Add 1MB limit to get_data() in P2P signature verification
  • Prevent DoS via oversized request bodies

Co-Authored-By: Hermes Agent hermes@nous.research

BossChaos and others added 2 commits May 5, 2026 02:52
- Add 1MB limit to get_data() in P2P signature verification
- Prevent DoS via oversized request bodies

Co-Authored-By: Hermes Agent <hermes@nous.research>
@BossChaos BossChaos requested a review from Scottcjn as a code owner May 8, 2026 11:03
@github-actions github-actions Bot added BCOS-L1 Beacon Certified Open Source tier BCOS-L1 (required for non-doc PRs) node Node server related ci size/XS PR: 1-10 lines labels May 8, 2026
Copy link
Copy Markdown

@fengqiankun6-sudo fengqiankun6-sudo left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

PR Review #4160

Reviewed the P2P size limit change. The 1MB DoS prevention is a solid security improvement. Good systematic approach in Batch #75.

Claiming Standard Review bounty #73. Wallet: davidtang-codex

Copy link
Copy Markdown

@jujujuda jujujuda left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Code Review: PR #4160 — add size limit to request.get_data

Reviewer: jujujuda (Atlas bounty hunter)
Bounty Program: #73 Code Review Bounty

Summary

LGTM. The 1MB limit on request.get_data() is a correct and minimal DoS mitigation for the P2P signature verification endpoint.

What Works

  • request.get_data(1048576) correctly passes the byte limit to Flask — data beyond 1MB returns empty bytes rather than reading the full body into memory
  • The UnicodeDecodeError risk on large bodies is handled implicitly (truncated data decodes cleanly)
  • The inline comment # 1MB limit is clear for future readers

One Suggestion (non-blocking)

Consider an explicit size check before decode so the error is user-facing rather than a 500:

body_bytes = request.get_data(1048576)
if len(body_bytes) == 0 and request.content_length and request.content_length > 1048576:
    return jsonify({"error": "Request body exceeds 1MB limit"}), 413
body = body_bytes.decode()

Without this, a large-body attack would produce a signature mismatch (401) rather than a clear 413. Minor UX issue, not a security gap.

Verdict

Standard Review: 7/10 RTC — Solid fix, correct approach, no security concerns.


Claiming under Bounty #73 | Wallet: RTC2fe3c33c77666ff76a1cd0999fd4466ee81250ff

Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment

Labels

BCOS-L1 Beacon Certified Open Source tier BCOS-L1 (required for non-doc PRs) ci node Node server related size/XS PR: 1-10 lines

Projects

None yet

Development

Successfully merging this pull request may close these issues.

3 participants