BatchMaxQueries #1659
Conversation
…ms and recursively starts new HTTP calls
@happylinks: Thank you for submitting a pull request! Before we can merge it, you'll need to sign the Meteor Contributor Agreement here: https://contribute.meteor.com/
@helfer This is my first try. I'm having some issues with creating a correct test for this. Any hints would be great. I'll also keep trying myself, of course.
Hey @happylinks, thanks for the PR! I think first you should fix the tests by removing the console.log statements. Then you can make the changes based on my comment, and finally look at the current tests for the batch network interface and try to write a couple of tests that have a similar style and use the batchMax option. One thing you could check is that if batchMax is 2 and you make 5 requests in a short interval (< batchInterval), you get 3 requests with 2, 2 and 1 query respectively.
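The 2/2/1 split @helfer describes can be sketched as follows. This is a standalone illustration, not the actual QueryBatcher code; `chunkQueue` is a hypothetical helper.

```typescript
// Hypothetical sketch: chunking a queue of 5 queued requests with
// batchMax = 2 should yield batches of sizes 2, 2 and 1.
function chunkQueue<T>(queue: T[], batchMax: number): T[][] {
  const batches: T[][] = [];
  while (queue.length > 0) {
    // splice removes up to batchMax items from the front of the queue
    batches.push(queue.splice(0, batchMax));
  }
  return batches;
}

const sizes = chunkQueue([1, 2, 3, 4, 5], 2).map(b => b.length);
// sizes is [2, 2, 1]
```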
src/transport/batching.ts
Outdated
@@ -62,20 +66,20 @@ export class QueryBatcher {
   // Consumes the queue.
   // Returns a list of promises (one for each query).
   public consumeQueue(): (Promise<ExecutionResult> | undefined)[] | undefined {
-    const requests: Request[] = this.queuedRequests.map(
+    const queueSlice = this.queuedRequests.splice(0, this.batchMax);
Instead of slicing up the queue when it is consumed, it would be better to consume the queue immediately once it reaches a length of batchMax. Otherwise you're still going to send all of those queries at the same time, just in chunks.
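The suggestion above could look roughly like this. It is a simplified standalone sketch; `MiniBatcher` and its members only mirror the PR's QueryBatcher in spirit, they are not the real implementation.

```typescript
// Hypothetical sketch: fire a batch as soon as the queue reaches
// batchMax, rather than waiting for the batchInterval timer and then
// sending everything in chunks at once.
class MiniBatcher<T> {
  public batches: T[][] = [];
  private queue: T[] = [];

  constructor(private batchMax: number) {}

  enqueueRequest(request: T): void {
    this.queue.push(request);
    // Consume immediately once the batch is full.
    if (this.batchMax > 0 && this.queue.length >= this.batchMax) {
      this.consumeQueue();
    }
  }

  // In the real transport this would be driven by the interval timer
  // as well; here it just flushes whatever is queued.
  consumeQueue(): void {
    if (this.queue.length === 0) return;
    this.batches.push(this.queue);
    this.queue = [];
  }
}
```

With batchMax = 2, enqueueing 5 requests and then letting the interval fire one last consumeQueue leaves three batches of sizes 2, 2 and 1.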
Sorry for the slow response. I actually got it working now, thanks for the hints. Do you think this test covers the full use-case or should I make more?
Hi @helfer, I see you tried to merge master again but now the tests fail. Is there something I can do to help speed this PR up :)? P.S. I'm also available to talk about this on the apollo slack.
Thanks @happylinks, I think the PR is almost ready to be merged, but we should make sure the new batchMax argument is optional. If it's not provided or set to 0, the behavior should be the same as before. Once the code is finished, you'll also have to make a PR to the docs so people can find out about it!
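The optional-batchMax behavior asked for here can be sketched like this. `consumeQueue` below is a hypothetical free function, not the PR's method; the point is only that 0 (or an omitted value) drains the whole queue, preserving the pre-PR behavior.

```typescript
// Hypothetical sketch: batchMax defaults to 0, which means "no cap" --
// the whole queue is consumed in one go, as before this PR. A positive
// batchMax caps how many queued requests go into one batch.
function consumeQueue<T>(queue: T[], batchMax: number = 0): T[] {
  const max = batchMax > 0 ? batchMax : queue.length;
  return queue.splice(0, max);
}
```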
src/transport/batching.ts
Outdated
  // This function is called to send the queries in the queue to the server.
  private batchFetchFunction: (request: Request[]) => Promise<ExecutionResult[]>;

  constructor({
    batchInterval,
    batchMax,
Let's make batchMax optional, and set it to 0 by default, in which case it should behave the same way it did before this PR.
test/batching.ts
Outdated
@@ -16,6 +16,7 @@ describe('QueryBatcher', () => {
     assert.doesNotThrow(() => {
       const querySched = new QueryBatcher({
         batchInterval: 10,
+        batchMax: 10,
If we make it optional, we won't have to add it to every test. This will also ensure that the change is not breaking for current users.
@@ -2201,6 +2204,109 @@ describe('client', () => {
     });
   });

+  it('should limit the amount of queries in a batch according to the batchMax value', (done) => {
This test looks good! After making batchMax optional, please also add a test that checks that no batching happens if batchMax is set to zero.
Hi @helfer, I made it optional. Also I changed the params of the HTTPBatchNetworkInterface to be an object. This made it nicer to have optional params. Let me know if that's ok.
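The object-style constructor described here might look like the following. This is an illustrative sketch; `QueryBatcherSketch` and `BatcherOptions` are made-up names, not the actual apollo-client API.

```typescript
// Hypothetical sketch: positional constructor parameters replaced by a
// single options object, which makes batchMax easy to omit.
interface BatcherOptions {
  batchInterval: number;
  batchMax?: number; // optional; 0 or undefined disables the cap
}

class QueryBatcherSketch {
  batchInterval: number;
  batchMax: number;

  constructor({ batchInterval, batchMax = 0 }: BatcherOptions) {
    this.batchInterval = batchInterval;
    this.batchMax = batchMax;
  }
}

// Callers that never heard of batchMax keep working unchanged:
const batcher = new QueryBatcherSketch({ batchInterval: 10 });
```

Note that switching from positional parameters to an options object is itself a breaking change for callers of the old signature, which is exactly the concern raised later in this thread.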
It looks great now, thanks a lot @happylinks!
Awesome! Thanks for the help :)
Just a heads up that the change in the constructor from parameters to an object is a breaking change for libraries like … I like the change to an object, it just would have been nice as a semver breaking change (…).
Hey @jaydenseric sorry about that! |
Issue reference: #1654
TODO:
Docs: https://github.com/apollographql/core-docs/pull/302