pass around metadata #765
Chunk is a namedtuple of (meta, data); create chunks using mkchunk(data, **meta). This does not yet add any visible functionality: meta is always an empty dict right now.
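A minimal sketch of the Chunk type and mkchunk helper as described above (illustrative only, not borg's exact source):

```python
from collections import namedtuple

# A chunk pairs a metadata dict with its payload bytes.
Chunk = namedtuple("Chunk", ("meta", "data"))

def mkchunk(data, **meta):
    """Create a Chunk; any keyword arguments become metadata entries."""
    return Chunk(meta, data)

plain = mkchunk(b"some bytes")             # meta defaults to {}
sparse = mkchunk(b"\0" * 4096, hole=True)  # metadata travels with the data
```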
pass meta-data around, fixes #765
related to #14
I can't find out why. I guess we need this for sparse handling (and maybe for other metadata later). @enkore, do you remember?
Anyway, in #5620 I re-added a little bit of it to support communicating between the chunker and the hasher. After the hasher, everything is still as it was: the compressor does not get any metadata yet.
Considering the metadata (about sparseness) produced by the chunker (mostly by the fixed chunker, a bit less by the buzhash chunker):
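A hypothetical illustration (not borg's actual implementation) of how a fixed-size chunker could produce sparseness metadata alongside the data it yields:

```python
import io

def fixed_chunker(fileobj, block_size=4096):
    """Yield (meta, data) pairs; all-zero blocks are tagged as holes."""
    while True:
        data = fileobj.read(block_size)
        if not data:
            break
        # A real implementation would use SEEK_DATA/SEEK_HOLE on sparse
        # files; here we just detect all-zero blocks in the byte stream.
        yield {"hole": data == b"\0" * len(data)}, data

chunks = list(fixed_chunker(io.BytesIO(b"\0" * 4096 + b"payload"), 4096))
```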
In borg, data flows through the different stages / layers, but metadata about files / content is mostly missing. We could use a metadata dict and pass it around together with the data, e.g. as a tuple (meta, data).
E.g., currently the compression component is set up statically; you choose it via a command-line parameter.
An entry in the meta dict could instead determine the compression that will be used.
The entry could default to whatever the command line says, but it could also be changed dynamically (e.g. if the file reader knows a file is .mp3 and can't be compressed well, it sets meta["compression"] = 'none' for that data).
E.g., it could also be used for sparse files, so hole=True/False can be passed around.
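The per-chunk compression idea could be sketched like this (hypothetical names; the default would come from the command line in a real setup):

```python
import zlib
from collections import namedtuple

Chunk = namedtuple("Chunk", ("meta", "data"))

DEFAULT_COMPRESSION = "zlib"  # stand-in for the command-line setting

def compress(chunk):
    """Pick the compressor per chunk from its metadata, else the default."""
    algo = chunk.meta.get("compression", DEFAULT_COMPRESSION)
    if algo == "none":
        return chunk.data  # e.g. .mp3 content, already compressed
    if algo == "zlib":
        return zlib.compress(chunk.data)
    raise ValueError("unknown compression: %r" % algo)

mp3 = Chunk({"compression": "none"}, b"ID3...")
text = Chunk({}, b"aaaa" * 100)
```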