builtin: add offset and line_count pagination to read_file and read_multiple_files#1828

Merged
trungutt merged 6 commits into docker:main from trungutt:fix/filesystem-tool-output-limit
Feb 24, 2026
Conversation

@trungutt (Contributor) commented Feb 23, 2026

  • read_file and read_multiple_files had no output size limit, so a single call reading a large repository could return 150K+ characters in one tool result, pushing the session context over 100K tokens.
  • Replace the blunt limitOutput() cap with proper pagination: both tools now accept offset (1-based line number) and line_count parameters so the model can read large files incrementally.
  • When a subset is returned, a header is prepended (e.g. [Showing lines 1-150 of 250 from AGENTS.md]) so the model knows how much content remains and can request the next chunk.
  • ReadFileMeta.LineCount renamed to TotalLines — always reflects the full file regardless of the window requested.
  • Tool descriptions and Instructions() updated to guide the model to paginate large files.

Large file reads could produce tool results exceeding 150K characters,
causing the total session context to exceed 100K tokens and trigger
504 Gateway Timeout errors from the production proxy infrastructure.

Apply the existing limitOutput() (30,000 char cap) to read_file and
read_multiple_files, consistent with shell, sandbox, and API tools.
For read_multiple_files the limit is applied per-file so the model
knows which specific file was truncated.
@trungutt trungutt requested a review from a team as a code owner February 23, 2026 17:10
@docker-agent (bot) left a comment

Review Summary

The implementation correctly applies limitOutput() to prevent oversized tool results. However, there's one issue with metadata accuracy: when files are truncated, the LineCount metadata reflects only the truncated content rather than the actual file line count, which could be misleading to users.

LineCount was being calculated on the already-truncated string, so a
10,000-line file truncated to ~1,000 lines would report LineCount as
~1,000. Calculate it on the original content before applying limitOutput.
@rumpl (Member) left a comment

I think we should not limit read files; instead we should add offset and size parameters and instruct the LLM to use them. If we limit as bluntly as this, there is no way an agent could read a whole file, which would severely limit the agent.

…ultiple_files

Replace the blunt limitOutput() cap with proper pagination: both tools
now accept offset (1-based line number) and line_count parameters so
the model can read large files incrementally rather than receiving a
truncated blob.

When a subset is returned a header is prepended with the line range and
total line count so the model knows how much content remains. The
TotalLines metadata field (renamed from LineCount) always reflects the
full file regardless of the window requested.

Tool descriptions and instructions are updated to guide the model to
paginate large files.
@trungutt trungutt requested review from a team and rumpl February 23, 2026 17:40
…ange

The new offset and line_count parameters and updated descriptions are
now reflected in all VCR cassettes for the affected providers
(OpenAI, Anthropic, Gemini, Mistral).
@trungutt trungutt changed the title builtin: limit read_file and read_multiple_files output size builtin: add offset and line_count pagination to read_file and read_multiple_files Feb 23, 2026
@krissetto (Contributor) left a comment

some minor things

Move usage guidance to Instructions() only, keeping tool descriptions
concise as suggested in review feedback.
@trungutt trungutt merged commit f39705d into docker:main Feb 24, 2026
5 checks passed
trungutt added a commit to trungutt/cagent that referenced this pull request Feb 24, 2026
…ool-output-limit"

This reverts commit f39705d, reversing
changes made to deec6a2.
