
Fix #14. Also make it faster #15

Closed
wants to merge 2 commits into from
Conversation

lilactown
Copy link
Owner

@lilactown lilactown commented Dec 15, 2021

Fixes #14, and also makes normalizing and adding data to a map twice as fast as before, according to my basic benchmarks. 😄

The tradeoff is that normalizing now uses function recursion, which means we allocate closures and push a stack frame on every iteration and are limited by the stack size, whereas before we used loop/recur, which allowed "unlimited" depth.

I tried using fast-zip to do this in a tail-recursive, "stackless" fashion; however, it was about 15% to 30% slower in my basic benchmarking and less consistent run-to-run (probably due to GC).

I would like to verify this against our production usage to see whether we run into any cases where we blow the stack. If it seems fine for reasonable data sizes, I may in the future provide additional normalizing functions that aren't limited by the stack size, to accommodate huge trees of data being added at once.
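To illustrate the tradeoff (this is a hypothetical sketch, not this library's actual normalization code): walking nested maps with plain function recursion consumes a JVM stack frame per level of nesting, while loop/recur over an explicit worklist keeps the "stack" on the heap. Both functions below (`count-maps-recursive` and `count-maps-iterative` are made-up names) count the maps in a nested structure.

```clojure
;; Plain function recursion: simple and fast, but each nested level adds a
;; call-stack frame, so sufficiently deep data throws StackOverflowError.
(defn count-maps-recursive [x]
  (if (map? x)
    (reduce + 1 (map count-maps-recursive (vals x)))
    0))

;; loop/recur over an explicit worklist: tail recursion with a heap-allocated
;; "stack", so depth is bounded by memory rather than the JVM call stack.
(defn count-maps-iterative [x]
  (loop [stack [x]
         n     0]
    (if (empty? stack)
      n
      (let [[head & tail] stack]
        (if (map? head)
          (recur (into (vec tail) (vals head)) (inc n))
          (recur (vec tail) n))))))

(count-maps-recursive {:a {:b {:c 1}} :d 2}) ;; => 3
(count-maps-iterative {:a {:b {:c 1}} :d 2}) ;; => 3
```

The recursive version tends to benchmark faster because it avoids building and tearing down the intermediate worklist collections, which matches the tradeoff described above.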

go with depth-first with mutable

use volatile for entities
@lilactown lilactown mentioned this pull request Dec 17, 2021
@lilactown
Copy link
Owner Author

Closing in favor of #16

@lilactown lilactown closed this Dec 29, 2021
Development

Successfully merging this pull request may close these issues.

Data loss in p/add with nested maps