If I give Sag::bulk() an absurdly large JSON array full of docs, I'd love for it to split them into 1,000-doc bulk sends and/or otherwise optimize the bulk request for me.
Otherwise, this bulk doc optimizing code gets written in "developer land" and unnecessarily slows people down.
Maybe it can throw a warning, if you'd like, for educational purposes: "Sag is optimizing your absurdly large bulk docs request. You're welcome. :hugs:"
As long as it's configurable.
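The batching being requested here can be sketched roughly as follows. This is only an illustration of the idea, not Sag's actual API (Sag is PHP; the function name `chunked_bulk`, the `send` callback, and the 1,000-doc default are all hypothetical):

```python
BATCH_SIZE = 1000  # hypothetical default chunk size from the request above

def chunked_bulk(docs, batch_size=BATCH_SIZE, send=None):
    """Split a large list of docs into batch_size chunks and send each chunk
    as its own bulk request, returning the per-chunk responses."""
    responses = []
    for i in range(0, len(docs), batch_size):
        chunk = docs[i:i + batch_size]
        responses.append(send(chunk))
    return responses
```

So a 2,500-doc array would go out as three requests of 1,000, 1,000, and 500 docs instead of one giant payload.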
How exactly would you optimize this? Isn't "absurdly large" the same as "YMMV"? =)
Provide the ability to chunk bulk docs into batches (closes #67)
@BigBlueHat @till Implemented in the 1.0 branch.
To @till's point, I decided to make the default batch size 0 so that we don't run into YMMV disagreements.
Let me know what you think.
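A quick sketch of the default-0 semantics described above (again illustrative Python, not Sag's PHP code; `chunked_bulk` and `send` are hypothetical names): a batch size of 0 disables chunking entirely, so existing callers see the old single-request behavior unless they opt in.

```python
def chunked_bulk(docs, batch_size=0, send=None):
    # batch_size 0 (the default) disables chunking: one request, as before
    if batch_size <= 0:
        return [send(docs)]
    # otherwise split into batch_size chunks and send each one
    return [send(docs[i:i + batch_size])
            for i in range(0, len(docs), batch_size)]
```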