Support scalable pagination #613
Setting offsets is not the most convenient API, and we need good pagination support; connections seem like a good model to follow.
Could you elaborate on the issue you're envisioning here?
In most database systems, a query with an `OFFSET` still has to scan and discard all of the skipped rows, so the cost grows with the offset.
@Arachnid I see. Though that seems to be more a concern of implementation than of graphql interface. We could do a good implementation of graphql offsets that doesn't use OFFSET, and it's also possible to do a bad implementation of cursors that does use OFFSET on the DB.
True, but I don't think it's possible (at least without low-level DB support) to do a good implementation that uses offsets - so better to fix the API early.

-Nick
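Nick's point about offsets can be sketched concretely. Below is a hypothetical in-memory model (the `Row` type, table size, and function names are illustrative, not graph-node's code): an offset page must pass over `offset` rows before it can return anything, while a keyset (cursor) page seeks directly past the last key it has seen, so its cost does not grow with depth.

```typescript
type Row = { id: number; name: string };

// A toy table standing in for an indexed DB relation.
const table: Row[] = Array.from({ length: 10_000 }, (_, i) => ({
  id: i + 1,
  name: `row-${i + 1}`,
}));

// OFFSET-style paging: a real database scans and discards `offset` rows
// before producing the page, so deep pages get progressively slower.
function offsetPage(rows: Row[], offset: number, limit: number): Row[] {
  return rows.slice(offset, offset + limit);
}

// Keyset (cursor) paging: the cursor is the last id seen; with an index on
// `id` the database seeks straight to `id > afterId`, regardless of depth.
function keysetPage(rows: Row[], afterId: number, limit: number): Row[] {
  const start = rows.findIndex(r => r.id > afterId);
  return start === -1 ? [] : rows.slice(start, start + limit);
}

// Both calls return rows 9991..9995, but only the keyset one stays cheap at depth.
const deepOffset = offsetPage(table, 9_990, 5);
const deepKeyset = keysetPage(table, 9_990, 5);
```

The two functions return identical pages; the difference a real database sees is that the offset variant does work proportional to the page's depth, which is exactly why offsets stop scaling over large datasets.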
I wrote a little utility hook that takes care of automatically scraping the endpoint for more results (using the `skip` and `limit` parameters) until it's exhausted:

```typescript
import { useQuery } from '@apollo/react-hooks';
import { useRef, useEffect } from 'react';
import { DocumentNode } from 'graphql';

type QueryPair = [DocumentNode, DocumentNode];
type ProceedOrNotFn = (result: any, expected: number) => boolean;

export function useScrapingQuery([query, more]: QueryPair, proceed: ProceedOrNotFn, props?: any) {
  const limit = (props && props.variables && props.variables.limit) || 100;
  const skip = useRef((props && props.variables && props.variables.skip) || 0);

  const result = useQuery(query, {
    ...props,
    variables: {
      ...(props && props.variables),
      limit,
      skip: skip.current, // pass the ref's current value, not the ref object
    },
  });

  useEffect(() => {
    if (!!result.loading || !!result.error || !proceed(result.data, skip.current + limit)) {
      return;
    }

    result.fetchMore({
      query: more,
      variables: {
        ...result.variables,
        skip: skip.current + limit,
      },
      updateQuery: (previous, options) => {
        skip.current = skip.current + limit;
        const moreResult = options.fetchMoreResult;

        // Merge each top-level field of the new page into the previous result.
        return Object.keys(moreResult).reduce(
          (carry, current) => ({
            ...carry,
            [current]: carry[current].concat(moreResult[current] || []),
          }),
          previous,
        );
      },
    });
  }, [result, skip.current]);

  return result;
}
```

Basically, you pass a query tuple (the first query is mandatory; the second is optional and provides a custom query for the "fetch more" logic, e.g. if the first query has other, non-paginated fields in it). Example:

```typescript
import gql from 'graphql-tag';

export const FundOverviewQuery = gql`
  query FundOverviewQuery($limit: Int!) {
    funds(orderBy: name, first: $limit) {
      id
      name
      gav
      grossSharePrice
      isShutdown
      creationTime
    }
    nonPaginatedQueryField(orderBy: timestamp) {
      ...
    }
  }
`;

export const FundOverviewContinueQuery = gql`
  query FundOverviewContinueQuery($limit: Int!, $skip: Int!) {
    funds(orderBy: name, first: $limit, skip: $skip) {
      id
      name
      gav
      grossSharePrice
      isShutdown
      creationTime
    }
  }
`;
```

It uses the `limit` and `skip` query variables, which the hook adds automatically by default. Additionally, you need to provide a callback that checks whether more data needs to be fetched after each cycle. Full usage example:

```typescript
const FundList: React.FunctionComponent<FundListProps> = props => {
  const proceed = (current: any, expected: number) => {
    if (current.funds && current.funds.length === expected) {
      return true;
    }
    return false;
  };

  const result = useScrapingQuery([FundOverviewQuery, FundOverviewContinueQuery], proceed, {
    ssr: false,
  });

  // Render the full fund list (it keeps adding items until the resource is exhausted).
  return <div>{...}</div>;
};
```
I also have the same problem in our project. I'd like to request a feature along the following lines:

// assume I have an entity like this
// then we can query like this
// in this case, can we use it like this?

I don't think it would be too difficult for your dev team to add this feature.
Just adding my thoughts here. Today, pagination is implemented on the root of every Query type and returns a plain list of entities.

We can implement cursor-based pagination (see the spec here: https://relay.dev/graphql/connections.htm). It's supported in all popular clients, and it makes pagination easy and robust: since it's cursor-based, it's easier to get a reliable response than with `skip`/`first` offsets.

We can expose a connection field per entity type, alongside the existing list fields. Here's an example:

```graphql
type Query {
  purpose(id: ID): Purpose!
  purposes(filter: PurposeFilter): [Purpose!]!
  purposeConnection(filter: PurposeFilter, paginate: PaginationFilter): PurposeConnection!
}

input PaginationFilter {
  before: String
  after: String
  first: Int
  last: Int
}

type Purpose { ... }

type PageInfo {
  hasNextPage: Boolean!
  hasPreviousPage: Boolean!
  startCursor: String
  endCursor: String
}

type PurposeEdge {
  node: Purpose
  cursor: String!
}

type PurposeConnection {
  pageInfo: PageInfo!
  edges: [PurposeEdge]!
}
```
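For completeness, here is what a client query against a `purposeConnection` field of that shape could look like. The field and argument names follow the schema sketch above, and selecting `id` assumes `Purpose` carries an `id` field; none of this exists in the API today.

```graphql
query PurposePage($first: Int, $after: String) {
  purposeConnection(paginate: { first: $first, after: $after }) {
    pageInfo {
      hasNextPage
      endCursor
    }
    edges {
      cursor
      node {
        id
      }
    }
  }
}
```

To page forward, a client re-issues the query with `after` set to the previous response's `pageInfo.endCursor`, stopping once `hasNextPage` is false.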
Is this still being worked on? Pagination with lots of historical data is a huge pain, and applying offsets really, really doesn't scale.
I think it would be useful to add a counter / cursor for pagination. Any idea if this feature will be supported?
Presently, it's possible to query entities using a `where` clause, but this uses offsets from the start or end, which likely won't scale well when paging over a large dataset. It'd be good to use the GraphQL connection pattern, or something similar, where result sets return an opaque cursor that can be passed in on subsequent calls to pick up where the previous query left off.
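As a stopgap until first-class cursors exist, keyset paging on `id` can emulate an opaque cursor today: instead of a growing `skip`, each request filters on a condition like `where: { id_gt: $lastID }` with `orderBy: id`, treating the last id of the previous page as the cursor. The sketch below assumes such a filter is available and abstracts the transport behind a caller-supplied `execute` function; all names here are illustrative, not an existing API.

```typescript
// Keyset-style paging sketch: the entity `id` acts as the cursor, so the
// server never has to skip rows. `execute` stands in for any GraphQL
// transport (fetch, Apollo, etc.) that runs one page query.

type Entity = { id: string };
type Execute = (variables: { lastID: string; first: number }) => Promise<Entity[]>;

export async function fetchAll(execute: Execute, pageSize = 1000): Promise<Entity[]> {
  const all: Entity[] = [];
  let lastID = ''; // the empty string sorts before every id
  for (;;) {
    const page = await execute({ lastID, first: pageSize });
    all.push(...page);
    if (page.length < pageSize) return all; // a short page means we're done
    lastID = page[page.length - 1].id; // advance the cursor, not an offset
  }
}
```

On the server side this would pair with a query along the lines of `funds(first: $first, orderBy: id, where: { id_gt: $lastID }) { id ... }`, so every page is an index seek rather than a scan past all previously fetched rows.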