Write large strings in bounded memory #30
Comments
Hmmm. I'm just thinking out loud, but what if jsonstreams used json's iterencode?
Yea, that sounds promising, as long as jsonstreams only holds a bounded number of iterencode-output chunks at a time (probably just one chunk). This would mostly affect the pretty printer, which is the only part of jsonstreams that looks at the encoded data before writing it.
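For reference, the mechanism the chunk-at-a-time idea above relies on is the built-in `json.JSONEncoder.iterencode`, which yields the encoded document as a series of small string chunks rather than one big string, so a writer can emit each chunk as it arrives. A quick sketch (not jsonstreams code):

```python
import json

doc = {"items": [1, 2, 3], "name": "example"}

# iterencode yields the encoding incrementally, chunk by chunk.
chunks = list(json.JSONEncoder().iterencode(doc))

# The chunks concatenate to exactly what json.dumps would produce.
assert "".join(chunks) == json.dumps(doc)
assert len(chunks) > 1
```

One caveat relevant to this issue: iterencode chunks at element boundaries, so a single large string element still arrives as one large chunk. It bounds memory for many-element documents, not for one-large-element documents.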
Thanks for your help with this! #32 serves my use case, so I'm going to close this now. I feel kinda bad leaving the pretty-printing code still using the old approach, though. Looking forward to the next release!
jsonstreams is a big win over the built-in json for bounding memory usage when encoding JSON documents that are large because they contain many elements, but it doesn't help for JSON documents that are large because they contain one large element -- the current implementation requires that each element be entirely loaded into memory for encoding.

I sketched a method of overcoming this limitation in this string-streams branch. The key thing there is the test_memory_usage test, which verifies that memory usage does not scale with element size. The changes currently in that branch to make that test pass are inelegant.

Thoughts?
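The branch's actual implementation isn't shown here, but the property that such a test checks can be sketched. Below is a hypothetical chunked string writer (`write_json_string`, `NullWriter`, and `CHUNK_SIZE` are my own illustrative names, not jsonstreams API) plus a tracemalloc check in the spirit of `test_memory_usage`: peak allocation while encoding a large string element should stay near the chunk size, not scale with the element.

```python
import io
import json
import tracemalloc

CHUNK_SIZE = 8192

class NullWriter:
    """A sink that discards everything written to it."""
    def write(self, s):
        return len(s)

def write_json_string(out, reader):
    """Stream one JSON string literal from a file-like `reader` to `out`,
    holding at most one CHUNK_SIZE-sized chunk in memory at a time."""
    out.write('"')
    while True:
        chunk = reader.read(CHUNK_SIZE)
        if not chunk:
            break
        # json.dumps escapes each chunk independently; strip its quotes.
        # (JSON string escaping is per-code-point, so text chunk
        # boundaries are safe to escape separately.)
        out.write(json.dumps(chunk)[1:-1])
    out.write('"')

# Correctness: a small string with characters that need escaping
# round-trips through the chunked writer.
buf = io.StringIO()
write_json_string(buf, io.StringIO('he said "hi"\n'))
encoded = buf.getvalue()

# Memory: encode a 10 MB element (allocated before tracing starts) and
# record the peak allocation during encoding.
big = io.StringIO("x" * (10 * 1024 * 1024))
tracemalloc.start()
write_json_string(NullWriter(), big)
peak = tracemalloc.get_traced_memory()[1]
tracemalloc.stop()
```

With this shape, the memory assertion is simply that `peak` is far below the 10 MB element size (e.g. under 1 MB), which fails if the writer ever materializes the whole string.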