Implementing the entire processing pipeline in a streaming fashion would be very nice, but it requires quite a bit of refactoring, since currently almost all of the code expects files to be passed around. If you were to implement this I'd be happy to help out, but I imagine it's quite a complex undertaking.
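To illustrate what streaming processing could look like, here is a minimal sketch, assuming a Python codebase and the python-zstandard package; the function and parameter names are hypothetical, not the project's actual API:

```python
import zstandard as zstd

def compress_stream(src_path, dst_path, level=3, chunk_size=1024 * 1024):
    """Compress src_path into dst_path chunk by chunk.

    Data is fed to the compressor incrementally, so no intermediate
    uncompressed copy is written and memory use stays bounded.
    """
    cctx = zstd.ZstdCompressor(level=level)
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        with cctx.stream_writer(dst) as compressor:
            while True:
                chunk = src.read(chunk_size)
                if not chunk:
                    break
                compressor.write(chunk)
```

The same pattern works with any file-like object (a pipe, a socket, a tar stream), which is what would make the refactoring worthwhile for very large inputs.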
Hello,
This week or next week I am going to start working on adding a compression backend.
For example, to support Zstd, which is more efficient in some respects. A rough sketch of the kind of backend I have in mind is shown below.
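For illustration only, here is a minimal sketch of such a pluggable backend, assuming Python and the python-zstandard package; all class and function names are hypothetical, not taken from the existing code:

```python
import gzip
import shutil
import zstandard as zstd

class GzipBackend:
    """Assumed current behaviour: gzip via the standard library."""
    suffix = ".gz"

    def compress(self, src_path, dst_path):
        with open(src_path, "rb") as src, gzip.open(dst_path, "wb") as dst:
            shutil.copyfileobj(src, dst)

class ZstdBackend:
    """Proposed backend: Zstandard, typically faster at comparable ratios."""
    suffix = ".zst"

    def __init__(self, level=3):
        self._cctx = zstd.ZstdCompressor(level=level)

    def compress(self, src_path, dst_path):
        with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
            self._cctx.copy_stream(src, dst)

def get_backend(name):
    """Hypothetical factory the tool could call based on its configuration."""
    return {"gzip": GzipBackend(), "zstd": ZstdBackend()}[name]
```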
But first I would like to ask a question.
I have files ranging from 200 GB up to 2 TB that I want to back up. The process currently takes extremely long because:
Am I right?
Is there a way to pass files directly to the compressor?
Maybe I'll find the answer by reading the code in depth, but perhaps there is some logic I should understand beforehand.
Thanks!