fix(preview): chunk document availability ids on id length
The previous approach tried to avoid requesting too many IDs at once by chunking the IDs into groups of 300. For short IDs this was acceptable, but with long document IDs the backend-enforced limit can be reached much sooner. This commit changes the algorithm to account for the length of each ID, attempting to stay within a budget of ~11 kB per chunk. This is the same limit @sanity/client uses for its query string, and it should work with older browsers and the backend while leaving room for a potentially large set of headers.
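The length-aware chunking described above could be sketched roughly as below. This is a hypothetical illustration, not the actual implementation: the helper name, the exact byte budget, and the `+1` separator accounting are assumptions.

```typescript
// Hypothetical budget per chunk (~11 kB, the same order of magnitude as the
// query-string limit mentioned in the commit message; exact value assumed).
const MAX_CHUNK_CHARS = 11_000

// Split IDs into chunks whose joined length stays within the budget,
// instead of chunking by a fixed count of 300 IDs.
function chunkDocumentIds(ids: string[], maxChars: number = MAX_CHUNK_CHARS): string[][] {
  const chunks: string[][] = []
  let current: string[] = []
  let currentLength = 0

  for (const id of ids) {
    // +1 accounts for the separator joining IDs in the request.
    const added = id.length + 1
    // Start a new chunk when adding this ID would exceed the budget.
    if (current.length > 0 && currentLength + added > maxChars) {
      chunks.push(current)
      current = []
      currentLength = 0
    }
    current.push(id)
    currentLength += added
  }
  if (current.length > 0) chunks.push(current)
  return chunks
}
```

With a fixed group size of 300, a batch of very long IDs could blow the request-size limit; chunking on cumulative length keeps each request within budget regardless of how long individual IDs are.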