improvements: cap read_file at 10 MiB to avoid multi-hundred-MB string round-trips #167
Merged
jmaxdev merged 1 commit into TrixtyAI:main on Apr 21, 2026
Conversation
Thanks for the contribution! I'll review it as soon as possible. If you still have changes to make, please mark this PR as a draft and all reviews will be cancelled. Test reviews will be re-run only when the PR is marked as ready for review.
Pull request overview
This PR adds a size limit to the Tauri read_file command to prevent large file reads from being fully buffered into a String and sent over IPC, which can lead to renderer freezes/OOMs in the desktop app.
Changes:
- Introduce a `READ_FILE_MAX_BYTES` constant (10 MiB) for `read_file`.
- `read_file` now stats the target first and returns a clear error when the file exceeds the limit.
Force-pushed 23730d1 → 5d19f7c
Force-pushed 5d19f7c → 789d846
…g round-trips

`read_file` returned the entire file body as a `String` with no size gate. Opening a 200 MB log pipes ~500+ MB through: disk read → owned `String` allocation → serde_json encode → IPC → JS string decode → Monaco buffer, which is enough to OOM the renderer on low-memory machines and causes a visible freeze on anything bigger.

Fix: stat the file first and reject up front when the length exceeds 10 MiB (`READ_FILE_MAX_BYTES`). 10 MiB is well above anything the Monaco editor, AI chat context window, or git-explorer diff viewer actually handle well, and every in-tree caller reads small text (package.json, skill/index MD, workspace files opened from the tree), so legitimate usage is unchanged. Consumers that genuinely need large files (future streaming viewer, log tailer) are expected to use a separate chunked Tauri command rather than lifting this cap.

The error message includes the actual file size and the limit so the frontend can surface an actionable message to the user.
Force-pushed 789d846 → fabcd4e
jmaxdev approved these changes on Apr 21, 2026
[Improvement]: Cap `read_file` at 10 MiB to avoid multi-hundred-MB string round-trips

Description
`read_file` returned the entire file body as a `String` with no size gate. Opening a 200 MB log pipes ~500 MB+ through the pipeline: disk read → owned `String` allocation → serde_json encode → IPC → JS string decode → Monaco buffer. That is enough to OOM the renderer on low-memory machines and causes a visible freeze on anything substantially larger than the available RAM.
Change

`apps/desktop/src-tauri/src/lib.rs`: `read_file` now stats the file first and rejects up front when `metadata.len()` exceeds `READ_FILE_MAX_BYTES = 10 * 1024 * 1024` (10 MiB). The error message includes the actual file size and the limit so the frontend can surface an actionable message to the user instead of a generic I/O error.
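The stat-first guard described above can be sketched as follows. This is a minimal illustration, not the PR's actual code: the `#[tauri::command]` wrapper and the command's real signature are omitted, and the exact error wording may differ.

```rust
use std::fs;

// Constant name taken from the PR description; 10 MiB.
const READ_FILE_MAX_BYTES: u64 = 10 * 1024 * 1024;

// Sketch of the guarded read: stat first, reject oversized files before
// any file contents are buffered into memory.
fn read_file(path: &str) -> Result<String, String> {
    // Stat the file; a missing file surfaces here as a stat error.
    let metadata = fs::metadata(path)
        .map_err(|e| format!("Failed to stat file {path}: {e}"))?;

    // Reject up front when the on-disk length exceeds the cap, including
    // the actual size and the limit so the frontend can show both.
    if metadata.len() > READ_FILE_MAX_BYTES {
        return Err(format!(
            "File {} is {} bytes, which exceeds the {}-byte read_file limit; use a streaming reader for large files",
            path,
            metadata.len(),
            READ_FILE_MAX_BYTES
        ));
    }

    // Only now buffer the (known-small) file into a String.
    fs::read_to_string(path).map_err(|e| format!("Failed to read file {path}: {e}"))
}
```

Note the ordering: because the size check runs before any read, an oversized file never touches the disk-read → `String` → serde_json → IPC pipeline at all.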
Trade-offs

- The Monaco editor, AI chat context window, and git-explorer diff viewer are all the realistic consumers of `read_file` today; none of them handle files larger than a few MiB well in practice. 10 MiB is comfortably above every legitimate source file / `package.json` / skill-MD / project file in the tree and still an order of magnitude below the "renderer falls over" region.
- Streaming (e.g. via Tauri v2 channels) requires a new command, a new frontend consumer UI, and teaching Monaco to render chunks incrementally, all bigger than this PR. Consumers that genuinely need large files (log tailer, binary viewer) should use a dedicated command rather than lift this cap.
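Such a dedicated chunked command does not exist yet; as a hypothetical sketch (the name `read_file_chunk` and its signature are invented here, not part of this PR), it might look like:

```rust
use std::fs::File;
use std::io::{Read, Seek, SeekFrom};

// Hypothetical chunked reader for a future large-file consumer.
// Returns up to `len` bytes starting at `offset`, so the frontend can
// page through a large file without ever materialising the whole body
// as one String on either side of the IPC boundary.
fn read_file_chunk(path: &str, offset: u64, len: usize) -> Result<Vec<u8>, String> {
    let mut file = File::open(path).map_err(|e| format!("Failed to open {path}: {e}"))?;
    file.seek(SeekFrom::Start(offset))
        .map_err(|e| format!("Failed to seek in {path}: {e}"))?;

    // Read at most `len` bytes; a short read near EOF is fine.
    let mut buf = vec![0u8; len];
    let n = file
        .read(&mut buf)
        .map_err(|e| format!("Failed to read {path}: {e}"))?;
    buf.truncate(n);
    Ok(buf)
}
```

Peak memory is then bounded by the chunk size regardless of file size, which is exactly the property the capped `read_file` cannot offer.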
- Making the cap configurable (e.g. a user setting) needs a settings-schema addition and a docs update. The current behaviour (fail fast with a clear number in the error) is the part that matters for the memory-safety fix; making it tunable can land separately without reopening the safety question.
- `get_recursive_file_list` and `get_git_file_diff` are left alone, even though the issue mentions both as carrying similar patterns. They are kept out of this PR: the first returns an in-memory `Vec<String>` of paths (the risk is cardinality, not per-file size, so it needs a different shape of fix), and the second shells out to `git diff`, where the right bound is on the diff output rather than any single source file.
Verification
- `cargo check` and `cargo clippy -- -D warnings` on `src-tauri` → clean.
- In-tree callers (`awareness.ts`, `AgentContext.tsx` skill/index reads, `AiChatComponent.tsx` tool path, `GitExplorerComponent.tsx` diff/search file open) all pass small text paths → unaffected.
- A missing file now reports `Failed to stat file` instead of `Failed to read file`. Same `Result::Err` branch for the frontend, friendlier message.
- Oversized files report: `File … is N bytes, which exceeds the 10485760-byte read_file limit; use a streaming reader…`.

Related Issue
Closes #87
Checklist