client.get_block broken due to highest accepted "limit" value 100, but we use 100000 #304

Open
phoenixeliot opened this issue Mar 20, 2021 · 6 comments · May be fixed by #345

Comments

phoenixeliot commented Mar 20, 2021

I'm finding that using the client as it is now, with the "limit" set to 100000 when fetching a block, I get an HTTP error "invalid input".

"limit": 100000,

By testing manually, I found that 100 is the highest value the API will currently accept; using that value removes the error.

The library should be updated with the new limit here, and we may also need new logic to paginate this request when a page contains more than 100 blocks.

From a cursory glance, it looks like there are other places where the number "10000" appears that may also need to be updated.
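The pagination logic suggested above could be sketched roughly as follows. This is a hypothetical illustration, not the library's actual code: `fetch_page` stands in for whatever request the client makes (e.g. loadPageChunk), and its `(cursor, limit)` signature and `(records, next_cursor)` return shape are assumptions for the sake of the example.

```python
MAX_LIMIT = 100  # highest "limit" the endpoint currently accepts

def fetch_all(fetch_page, limit=MAX_LIMIT):
    """Collect every record by issuing repeated capped requests.

    `fetch_page(cursor, limit)` is a hypothetical callable returning
    (records, next_cursor); next_cursor is None once results are exhausted.
    """
    records, cursor = [], None
    while True:
        page, cursor = fetch_page(cursor, min(limit, MAX_LIMIT))
        records.extend(page)
        if cursor is None:
            return records
```

The key point is simply that each individual request stays at or below the server-side cap, while the loop keeps going until the server signals there is nothing left.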

phoenixeliot changed the title from "client.get_block broken due to max "limit" of 100" to "client.get_block broken due to highest accepted "limit" value 100, but we use 10000" on Mar 20, 2021
phoenixeliot changed the title from "client.get_block broken due to highest accepted "limit" value 100, but we use 10000" to "client.get_block broken due to highest accepted "limit" value 100, but we use 100000" on Mar 20, 2021
@phoenixeliot (Author)

It looks like the Notion web client uses syncRecordValues to fetch the values of all the blocks beyond the first limit-sized batch (the web client uses a limit of 30). That request returns all the data for the specified block IDs, the list of which comes back in the initial loadPageChunk request (which this package already uses).
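The approach above could be sketched like this. Everything here is hypothetical: `sync_record_values` stands in for the follow-up request the web client makes, and it is assumed to take a batch of block IDs and return a `{id: value}` mapping — the real payload shape would need to be confirmed against actual web-client traffic.

```python
def batched(ids, size=100):
    """Split a list of block IDs into request-sized batches."""
    return [ids[i:i + size] for i in range(0, len(ids), size)]

def fetch_block_values(block_ids, sync_record_values, batch_size=100):
    """Fetch values for every block ID, one capped request per batch.

    `block_ids` is the full ID list from the initial loadPageChunk call;
    `sync_record_values` is a hypothetical stand-in for the follow-up
    request, assumed to return a {block_id: value} dict per batch.
    """
    values = {}
    for batch in batched(block_ids, batch_size):
        values.update(sync_record_values(batch))
    return values
```

So the initial loadPageChunk call only needs to supply the ID list (plus the first chunk of data), and the remaining blocks get filled in by batched follow-up requests that each stay under the cap.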

@miletoda

@phoenixeliot Have you found a way to load/read more than 100 records?

@phoenixeliot (Author)

@miletoda You or I would need to write a patch/PR that implements the above suggestion. I might, but I don't have any use cases for 100+ block pages yet, so it's not especially relevant to me personally. It should be quite doable, though, if you want to take a crack at it!

iansinnott commented Aug 8, 2021

To add further confirmation, this seems to be an issue with search_pages_with_parent as well. Manually lowering the limit param in the request makes it succeed. As of this comment, client.current_space.pages seems to be throwing because of this issue.

shawwn commented Sep 8, 2021

This case is pretty relevant to me. I'd like to help, if I can.

The problem is, I'm not sure how the pagination works. Is there an example somewhere?

I'd also like to help fix monitoring, but that's unrelated to this issue. (Is there an issue open for it?)

Thanks!

@shawwn shawwn linked a pull request Sep 8, 2021 that will close this issue

shawwn commented Sep 8, 2021

Okay, I've fixed block pagination. See PR #345
