Splitting large files into chunks #534
Comments
odarriba
commented
Feb 18, 2017
|
But, theoretically, that would make those files difficult to use on computers that don't use acd_cli for syncing, wouldn't it? I'm not saying it is a bad idea, but if done, maybe it should be configurable |
manwe-pl
commented
Feb 18, 2017
|
There's nothing wrong with it being an option. |
bgemmill
commented
Feb 18, 2017
|
Have a look at duplicity if this is a concern; it may be more straightforward to layer a chunking system on top of acd_cli than to build it in. |
manwe-pl
commented
Feb 17, 2017 (edited)
Just a wild thought. If files >10GB are problematic, and >50GB are impossible to upload, maybe acd_cli could split them into 8GB chunks? So for example the file `movie.mkv` (30GB) could be sliced into `##part000##movie.mkv` up to `##part003##movie.mkv`. Of course in FUSE those files would be visible as one. Finding the proper part and offset is quite easy: if parts are 8GB in size (8,589,934,592 bytes) and we want, for example, byte number 10,000,000,000, the part number is `10000000000 // 8589934592 = 1` and the position within that part is `10000000000 % 8589934592 = 1410065408`. I think there are people (including me) that would consider a donation/bounty for such a feature :)
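The offset arithmetic described above can be sketched in a few lines of Python. The function and file-name scheme here are hypothetical illustrations of the proposal, not part of acd_cli:

```python
# Hypothetical sketch of the chunk-lookup idea from the comment above.
PART_SIZE = 8 * 1024 ** 3  # 8 GiB = 8,589,934,592 bytes per chunk

def locate(offset):
    """Map an absolute byte offset in the virtual file to
    (part index, offset within that part)."""
    return offset // PART_SIZE, offset % PART_SIZE

def part_name(base_name, index):
    """Build a chunk file name using the ##partNNN## scheme
    suggested in the comment (hypothetical naming convention)."""
    return "##part%03d##%s" % (index, base_name)

# The worked example from the comment:
part, pos = locate(10_000_000_000)
print(part, pos)                     # 1 1410065408
print(part_name("movie.mkv", part))  # ##part001##movie.mkv
```

A FUSE read at a given offset would then open the chunk named by `part_name`, seek to `pos`, and continue into the next chunk if the read crosses a part boundary.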