BatchMaxQueries #1659

Merged (17 commits) on Jun 20, 2017
Changes from 2 commits
10 changes: 8 additions & 2 deletions src/transport/batchedNetworkInterface.ts
@@ -46,15 +46,20 @@ export class HTTPBatchedNetworkInterface extends BaseNetworkInterface {
public _afterwares: BatchAfterwareInterface[];
private batcher: QueryBatcher;

- constructor(uri: string, batchInterval: number, fetchOpts: RequestInit) {
+ constructor(uri: string, batchInterval: number, batchMax: number, fetchOpts: RequestInit) {
super(uri, fetchOpts);

if (typeof batchInterval !== 'number') {
throw new Error(`batchInterval must be a number, got ${batchInterval}`);
}

if (typeof batchMax !== 'number') {
throw new Error(`batchMax must be a number, got ${batchMax}`);
}

this.batcher = new QueryBatcher({
batchInterval: batchInterval,
batchMax: batchMax,
batchFetchFunction: this.batchQuery.bind(this),
});
}
@@ -218,12 +223,13 @@ export class HTTPBatchedNetworkInterface extends BaseNetworkInterface {
export interface BatchingNetworkInterfaceOptions {
uri: string;
batchInterval: number;
batchMax: number;
opts?: RequestInit;
}

export function createBatchingNetworkInterface(options: BatchingNetworkInterfaceOptions): HTTPNetworkInterface {
if (! options) {
throw new Error('You must pass an options argument to createNetworkInterface.');
}
- return new HTTPBatchedNetworkInterface(options.uri, options.batchInterval, options.opts || {});
+ return new HTTPBatchedNetworkInterface(options.uri, options.batchInterval, options.batchMax, options.opts || {});
}
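The factory above now requires a numeric `batchMax` alongside `batchInterval`. A minimal self-contained sketch of the option shape and the validation it performs — `validateBatchingOptions` and the simplified `opts` type are illustrative names, not the library's exports:

```typescript
// Illustrative sketch of the option validation in the diff above.
// Names are hypothetical; `opts` is simplified so the snippet is standalone.
interface BatchingOptions {
  uri: string;
  batchInterval: number;
  batchMax: number;
  opts?: object;
}

function validateBatchingOptions(options?: BatchingOptions): BatchingOptions {
  if (!options) {
    throw new Error('You must pass an options argument to createNetworkInterface.');
  }
  if (typeof options.batchInterval !== 'number') {
    throw new Error(`batchInterval must be a number, got ${options.batchInterval}`);
  }
  if (typeof options.batchMax !== 'number') {
    throw new Error(`batchMax must be a number, got ${options.batchMax}`);
  }
  return options;
}
```

As the diff stands, omitting `batchMax` is a runtime error; the review discussion suggests making it optional instead.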
17 changes: 13 additions & 4 deletions src/transport/batching.ts
@@ -24,20 +24,24 @@ export class QueryBatcher {
// Queue on which the QueryBatcher will operate on a per-tick basis.
public queuedRequests: QueryFetchRequest[] = [];

- private batchInterval: Number;
+ private batchInterval: number;
+ private batchMax: number;

// This function is called to send the queries in the queue to the server.
private batchFetchFunction: (request: Request[]) => Promise<ExecutionResult[]>;

constructor({
batchInterval,
batchMax,
Contributor comment: Let's make batchMax optional, and set it to 0 by default, in which case it should behave the same way it did before this PR.

batchFetchFunction,
}: {
batchInterval: number,
batchMax: number,
batchFetchFunction: (request: Request[]) => Promise<ExecutionResult[]>,
}) {
this.queuedRequests = [];
this.batchInterval = batchInterval;
this.batchMax = batchMax;
this.batchFetchFunction = batchFetchFunction;
}

@@ -62,20 +66,20 @@ export class QueryBatcher {
// Consumes the queue.
// Returns a list of promises (one for each query).
public consumeQueue(): (Promise<ExecutionResult> | undefined)[] | undefined {
- const requests: Request[] = this.queuedRequests.map(
+ const queueSlice = this.queuedRequests.splice(0, this.batchMax);
Contributor comment: Instead of slicing up when the queue is consumed, it would be better to immediately consume the queue once it reaches a length of batchMax. Otherwise you're still going to send all of those queries at the same time, just in chunks.

const requests: Request[] = queueSlice.map(
(queuedRequest) => queuedRequest.request,
);

const promises: (Promise<ExecutionResult> | undefined)[] = [];
const resolvers: any[] = [];
const rejecters: any[] = [];
- this.queuedRequests.forEach((fetchRequest, index) => {
+ queueSlice.forEach((fetchRequest, index) => {
promises.push(fetchRequest.promise);
resolvers.push(fetchRequest.resolve);
rejecters.push(fetchRequest.reject);
});

- this.queuedRequests = [];
const batchedPromise = this.batchFetchFunction(requests);

batchedPromise.then((results) => {
@@ -87,6 +91,11 @@
rejecters[index](error);
});
});

if (this.queuedRequests.length) {
// concat returns a new array that is discarded here; push the recursive results instead.
promises.push(...(this.consumeQueue() || []));
}

return promises;
}
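Taken together, the reviewer's default-to-zero suggestion and the dropped `Array.prototype.concat` result (concat returns a new array rather than mutating `promises`) can be sketched in a self-contained toy batcher — hypothetical code, not the library's class:

```typescript
// Toy batcher sketch: batchMax is optional and 0 means "no limit"
// (the pre-PR behaviour); recursive results are pushed, not concat-ed away.
type Pending = { promise: Promise<number>; resolve: (v: number) => void };

class SketchBatcher {
  public queued: Pending[] = [];
  public batchSizes: number[] = []; // records the size of each dispatched batch
  private batchMax: number;

  constructor({ batchMax = 0 }: { batchMax?: number } = {}) {
    this.batchMax = batchMax;
  }

  public enqueue(): Promise<number> {
    let resolve!: (v: number) => void;
    const promise = new Promise<number>((r) => { resolve = r; });
    this.queued.push({ promise, resolve });
    return promise;
  }

  public consumeQueue(): Promise<number>[] {
    // batchMax === 0: consume the whole queue in a single batch.
    const max = this.batchMax > 0 ? this.batchMax : this.queued.length;
    const slice = this.queued.splice(0, max);
    this.batchSizes.push(slice.length);
    slice.forEach((req, i) => req.resolve(i));
    const promises = slice.map((req) => req.promise);
    if (this.queued.length) {
      // concat would return a new array and silently drop these promises.
      promises.push(...this.consumeQueue());
    }
    return promises;
  }
}
```

With `batchMax: 2` and five queued requests, this dispatches batches of 2, 2, and 1 while still returning all five promises; with the default `batchMax` of 0 everything goes out in one batch, matching the pre-PR behaviour.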

4 changes: 2 additions & 2 deletions test/batchedNetworkInterface.ts
@@ -47,7 +47,7 @@ describe('HTTPBatchedNetworkInterface', () => {
opts?: RequestInit,
}) => {
const url = 'http://fake.com/graphql';
- const batchedNetworkInterface = new HTTPBatchedNetworkInterface(url, 10, opts);
+ const batchedNetworkInterface = new HTTPBatchedNetworkInterface(url, 10, 10, opts);

batchedNetworkInterface.use(middlewares);
batchedNetworkInterface.useAfter(afterwares);
@@ -113,7 +113,7 @@ describe('HTTPBatchedNetworkInterface', () => {
it('should construct itself correctly', () => {
const url = 'http://notreal.com/graphql';
const opts = {};
- const batchedNetworkInterface = new HTTPBatchedNetworkInterface(url, 10, opts);
+ const batchedNetworkInterface = new HTTPBatchedNetworkInterface(url, 10, 10, opts);
assert(batchedNetworkInterface);
assert.equal(batchedNetworkInterface._uri, url);
assert.deepEqual(batchedNetworkInterface._opts, opts);
10 changes: 10 additions & 0 deletions test/batching.ts
@@ -16,6 +16,7 @@ describe('QueryBatcher', () => {
assert.doesNotThrow(() => {
const querySched = new QueryBatcher({
batchInterval: 10,
batchMax: 10,
Contributor comment: If we make it optional, we won't have to add it to every test. This will also ensure that the change is not breaking for current users.

batchFetchFunction: networkInterface.batchQuery.bind(networkInterface),
});
querySched.consumeQueue();
@@ -25,6 +26,7 @@
it('should not do anything when faced with an empty queue', () => {
const batcher = new QueryBatcher({
batchInterval: 10,
batchMax: 10,
batchFetchFunction: networkInterface.batchQuery.bind(networkInterface),
});

@@ -36,6 +38,7 @@
it('should be able to add to the queue', () => {
const batcher = new QueryBatcher({
batchInterval: 10,
batchMax: 10,
batchFetchFunction: networkInterface.batchQuery.bind(networkInterface),
});

Expand Down Expand Up @@ -84,6 +87,7 @@ describe('QueryBatcher', () => {
);
const batcher = new QueryBatcher({
batchInterval: 10,
batchMax: 10,
batchFetchFunction: myNetworkInterface.batchQuery.bind(myNetworkInterface),
});
const request: Request = {
@@ -93,6 +97,7 @@
it('should be able to consume from a queue containing a single query', (done) => {
const myBatcher = new QueryBatcher({
batchInterval: 10,
batchMax: 10,
batchFetchFunction: myNetworkInterface.batchQuery.bind(myNetworkInterface),
});

Expand Down Expand Up @@ -123,6 +128,7 @@ describe('QueryBatcher', () => {

const myBatcher = new QueryBatcher({
batchInterval: 10,
batchMax: 10,
batchFetchFunction: NI.batchQuery.bind(NI),
});
myBatcher.enqueueRequest(request);
@@ -148,6 +154,7 @@
);
const myBatcher = new QueryBatcher({
batchInterval: 10,
batchMax: 10,
batchFetchFunction: NI.batchQuery.bind(NI),
});
const promise = myBatcher.enqueueRequest(request);
@@ -162,6 +169,7 @@
it('should work when single query', (done) => {
const batcher = new QueryBatcher({
batchInterval: 10,
batchMax: 10,
batchFetchFunction: networkInterface.batchQuery.bind(networkInterface),
});
const query = gql`
@@ -185,6 +193,7 @@
it('should correctly batch multiple queries', (done) => {
const batcher = new QueryBatcher({
batchInterval: 10,
batchMax: 10,
batchFetchFunction: networkInterface.batchQuery.bind(networkInterface),
});
const query = gql`
@@ -233,6 +242,7 @@
);
const batcher = new QueryBatcher({
batchInterval: 10,
batchMax: 10,
batchFetchFunction: myNetworkInterface.batchQuery.bind(myNetworkInterface),
});
const promise = batcher.enqueueRequest(request);
98 changes: 98 additions & 0 deletions test/client.ts
@@ -2025,6 +2025,7 @@ describe('client', () => {
const networkInterface = createBatchingNetworkInterface({
uri: 'http://not-a-real-url.com',
batchInterval: 5,
batchMax: 5,
opts: {},
});
Promise.all([
@@ -2086,6 +2087,7 @@
const networkInterface = createBatchingNetworkInterface({
uri: 'http://not-a-real-url.com',
batchInterval: 5,
batchMax: 5,
opts: {},
});
Promise.all([
@@ -2166,6 +2168,7 @@
const networkInterface = createBatchingNetworkInterface({
uri: 'http://not-a-real-url.com',
batchInterval: 5,
batchMax: 5,
opts: {},
});
Promise.all([
@@ -2181,6 +2184,101 @@
});
});

it('should limit the amount of queries in a batch according to the batchMax value', (done) => {
Contributor comment: This test looks good! After making batchMax optional, please also add a test that checks that no batching happens if batchMax is set to zero.

const firstQuery = gql`
query {
author {
firstName
lastName
}
}`;
const firstResult = {
data: {
author: {
firstName: 'John',
lastName: 'Smith',
},
},
loading: false,
};
const secondQuery = gql`
query {
person {
name
}
}`;
const secondResult = {
data: {
person: {
name: 'Jane Smith',
},
},
};
const url = 'http://not-a-real-url.com';

const networkInterface = createBatchingNetworkInterface({
uri: url,
batchInterval: 5,
batchMax: 1,
opts: {},
});

const oldFetch = fetch;
fetch = createMockFetch({
url,
opts: {
body: JSON.stringify([
{
query: print(firstQuery),
},
]),
headers: {
Accept: '*/*',
'Content-Type': 'application/json',
},
method: 'POST',
},
result: createMockedIResponse([firstResult]),
});

const firstFetch = networkInterface.query({ query: firstQuery });
firstFetch.then((results) => {
assert.deepEqual(results, [firstResult]);
}).catch( e => {
done(e);
});

fetch = createMockFetch({
url,
opts: {
body: JSON.stringify([
{
query: print(secondQuery),
},
]),
headers: {
Accept: '*/*',
'Content-Type': 'application/json',
},
method: 'POST',
},
result: createMockedIResponse([secondResult]),
});

const secondFetch = networkInterface.query({ query: secondQuery });
secondFetch.then((results) => {
assert.deepEqual(results, [secondResult]);
fetch = oldFetch;
done();
}).catch( e => {
done(e);
});
});
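The review asks for eager dispatch: rather than chunking at the timer tick, consume the queue the moment it reaches batchMax, and treat batchMax = 0 as unlimited. A hypothetical standalone sketch of that behaviour (not the PR's code; all names are illustrative):

```typescript
// Eager-dispatch sketch: a full queue is flushed immediately instead of
// waiting for the batchInterval timer.
class EagerBatcher {
  public batches: string[][] = []; // each entry is one dispatched batch

  private queue: string[] = [];

  constructor(private batchMax: number = 0) {}

  public enqueue(query: string): void {
    this.queue.push(query);
    if (this.batchMax > 0 && this.queue.length >= this.batchMax) {
      this.flush(); // dispatch as soon as the queue is full
    }
  }

  // Normally driven by the batchInterval timer; with batchMax === 0 the
  // whole queue always goes out as a single batch.
  public flush(): void {
    if (!this.queue.length) { return; }
    this.batches.push(this.queue);
    this.queue = [];
  }
}
```

Enqueuing five queries with `batchMax` of 2 yields batches of 2, 2, and 1, each sent as soon as it fills; the default of 0 reproduces the single-batch behaviour the reviewer wants covered by a test.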

it('should enable dev tools logging', () => {
const query = gql`
query people {