Merkle DAG reconstruction on write ops #20

Closed
richardhundt opened this issue Jul 31, 2014 · 0 comments

Comments

@richardhundt

Hi guys, awesome project! I have a question though.

If the nodes in the DAG are stored as immutable content-addressable blobs, then surely if I write a single byte to a (logical) file somewhere down in the hierarchy, its content hash changes, so anything that references it (a tree node, for example) must also be updated. That node in turn gets a new hash, and so on all the way up to the root key under the mutable namespace?

In other words, every change anywhere in a tree forces you to recursively recompute a new DAG for all of its ancestors. Isn't this kind of expensive? Think of mounting a VM disk image where lots of writes are happening. I believe this is why Ivy chose a log-based approach. Or have I missed something?

[EDIT] of course, I'm assuming "commit on file close after writing" semantics
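The recomputation described in the question can be sketched with a toy content-addressed store. This is a minimal illustration, not the actual go-ipfs implementation; `put`, `get`, and `write_file` are hypothetical helper names. The point it demonstrates: a leaf write re-hashes only the nodes on the path from the leaf to the root, and every subtree off that path is shared, by hash, between the old and new roots.

```python
import hashlib
import json

def put(store, obj):
    """Store an immutable blob; its address is the hash of its content."""
    data = json.dumps(obj, sort_keys=True).encode()
    h = hashlib.sha256(data).hexdigest()
    store[h] = data
    return h

def get(store, h):
    return json.loads(store[h])

def write_file(store, root_hash, path, content):
    """Replace the leaf at `path`, re-hashing every ancestor up to the root.

    Returns the new root hash. Only len(path) + 1 new blobs are created;
    everything off the write path is reused unchanged.
    """
    if not path:
        return put(store, {"type": "file", "data": content})
    node = get(store, root_hash)
    children = dict(node["children"])
    children[path[0]] = write_file(store, children[path[0]], path[1:], content)
    return put(store, {"type": "dir", "children": children})

# Build a small tree: root -> sub -> a.txt
store = {}
leaf = put(store, {"type": "file", "data": "hello"})
subdir = put(store, {"type": "dir", "children": {"a.txt": leaf}})
root = put(store, {"type": "dir", "children": {"sub": subdir}})

# A one-leaf write yields a new root hash, but the old tree is still
# fully readable from the old root (the blobs are immutable).
new_root = write_file(store, root, ["sub", "a.txt"], "hello!")
```

So the per-commit cost is proportional to the depth of the modified path, not the size of the tree, though as the question notes, write-heavy workloads with commit-on-close semantics would still pay that path rehash on every commit.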
