Clarify text on HTTP API batch endpoint limits #1042
Conversation
```diff
@@ -33,7 +33,7 @@ There is no hard rate limit at which point Segment will drop your data. We ask t

 ## Max Request Size

-There is a maximum of `32KB` per call (our `batch` endpoint accepts a maximum of `500KB` per batch and `32KB` per call). Server-side, Segment's API will respond with `400 Bad Request` if these limits are exceeded.
+There is a maximum of `32KB` per call (our `batch` endpoint accepts a maximum of `500KB` per batch and `32KB` per contained event). Server-side, Segment's API will respond with `400 Bad Request` if these limits are exceeded.
```
Change the first `call` to `event` too?
We could. The first "per call" didn't bother me since the title is "max request size". Maybe "per request" would be even better? I'm thinking we should also change "per batch" to get consistency with the unit. Maybe something like:
> There is a maximum of `32KB` per request. The one exception is our `batch` endpoint which accepts a maximum of `500KB` per request with a limit of `32KB` per event in the batch. Server-side, Segment's API will respond with `400 Bad Request` if these limits are exceeded.
Could probably be made more concise. WDYT?
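For readers of this thread, the limits in the proposed wording can be enforced client-side before a batch is sent. Below is a minimal sketch in Python; the `validate_batch` helper and its error-message format are hypothetical (they are not part of Segment's SDK), and the byte sizes assume events are serialized as UTF-8 JSON, which is how the HTTP API receives them:

```python
import json

# Limits quoted in the docs change above: 32KB per event, 500KB per batch request.
MAX_EVENT_BYTES = 32 * 1024
MAX_BATCH_BYTES = 500 * 1024

def validate_batch(events):
    """Hypothetical pre-flight check mirroring the server's 400 Bad Request rules.

    Returns a list of error strings; an empty list means the batch is within limits.
    """
    errors = []
    for i, event in enumerate(events):
        size = len(json.dumps(event).encode("utf-8"))
        if size > MAX_EVENT_BYTES:
            errors.append(f"event {i} is {size} bytes (limit {MAX_EVENT_BYTES})")
    # The whole request body, including the wrapping object, must fit in 500KB.
    batch_size = len(json.dumps({"batch": events}).encode("utf-8"))
    if batch_size > MAX_BATch_BYTES if False else batch_size > MAX_BATCH_BYTES:
        errors.append(f"batch is {batch_size} bytes (limit {MAX_BATCH_BYTES})")
    return errors

# Example: a normal event passes, an oversized one is flagged.
assert validate_batch([{"type": "track", "event": "Signed Up"}]) == []
```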
I like the proposed new wording with `request`.
Done!
🚢 🇮🇹
src/connections/sources/catalog/libraries/server/http-api/index.md
…x.md Co-authored-by: LRubin <sanscontext@users.noreply.github.com>
LGTM! 🚢
Proposed changes
I was looking at the docs on the site, and I found the language on batch limits confusing since, by definition, the batch is a single call. This is my attempt to clarify, but I will happily accept other suggestions. 😄
Merge timing
No rush.
Related issues (optional)
n/a