Add optional limit argument to get_block #294
base: master
Conversation
Well, overall the PR looks good. +1
notion/collection.py (Outdated)

```diff
@@ -360,6 +360,7 @@ def __init__(
         sort=[],
         calendar_by="",
         group_by="",
+        limit=100
```
Is it possible to increase the default limit for CollectionQuery? The current value doesn't fetch all rows from my database of ~200 items. I didn't run into any API errors with a much higher value (100000) during a short test.
I agree. I tried the same value for limit and it runs without an issue (https://charts.mathix.ninja/).
But it would be nice to have some sort of pagination, since the same issue will recur once a collection passes 100000 documents.
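The pagination suggested above could look something like the following sketch. The `fetch_page` and `iter_rows` names are illustrative stand-ins, not actual notion-py APIs: the idea is just to request fixed-size pages until a short page signals the end, so no single request ever needs an enormous limit.

```python
# Hypothetical client-side pagination sketch (not real notion-py code).

def fetch_page(offset, page_size):
    """Stand-in for a paginated collection-query endpoint."""
    TOTAL = 250  # pretend the collection holds 250 rows
    return list(range(offset, min(offset + page_size, TOTAL)))

def iter_rows(page_size=100):
    """Yield every row by fetching one page at a time."""
    offset = 0
    while True:
        page = fetch_page(offset, page_size)
        yield from page
        if len(page) < page_size:
            # A short (or empty) page means we've reached the end.
            break
        offset += page_size

print(len(list(iter_rows())))  # prints 250
```

With this shape, the default page size can stay small and the client still sees the whole collection.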
Have you tried limit=-1? How does it perform with 10000+ rows?
@felixII You can supply a limit argument like this: `cv.collection.get_rows(limit=200)` to fetch all your rows.
I know that approach isn't applicable to everyone, since table sizes vary, so I have added support for fetching the total size of the collection before requesting the actual data.
Now you can do `cv.collection.get_rows(limit=-1)`, and on the backend it will query using your total collection size instead of a fixed limit.
This isn't an elegant solution though, as it doubles the API calls in the background, and I'm not able to verify it for large tables with 10K+ rows 😅.
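The limit=-1 behaviour described above can be sketched roughly as follows. The function names and response shape here are assumptions for illustration, not notion-py's real internals: probe the API once for the collection's total size, then re-query using that total as the limit (hence the doubled API calls).

```python
# Illustrative sketch of the limit=-1 behaviour (hypothetical names).

def fake_query_api(limit):
    """Stand-in for the collection-query endpoint."""
    total_rows = 250  # pretend the collection holds 250 rows
    return {"total": total_rows,
            "results": list(range(min(limit, total_rows)))}

def get_rows(limit=100):
    if limit == -1:
        # Extra probe call just to learn the total collection size...
        limit = fake_query_api(limit=1)["total"]
        # ...then the real query below runs with limit == total size.
    return fake_query_api(limit=limit)["results"]

print(len(get_rows()))          # prints 100 (default fixed limit)
print(len(get_rows(limit=-1)))  # prints 250 (all rows)
```

This keeps the default behaviour unchanged while making `limit=-1` a "fetch everything" escape hatch.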
@jamalex plz take a look!
This is a pretty paralyzing issue. Does anyone know how long it will take to get merged in?
In the meantime, you can
Thanks a lot @gnestor! That's exactly what I needed :) and it's so much cleaner than fiddling with the code in site-packages, which is what I ended up doing 🙈
Thanks a lot @gnestor! However, I have a problem using setuptools to specify the merge version of the git repo as a dependency in my Python package. Referring to this link, I tried:

```python
setuptools.setup(
    ...
    install_requires=[
        "notion @ git+https://github.com/jamalex/notion-py.git@refs/pull/294/merge"
    ],
    ...
)
```

Could you help me with this problem by any chance?
My setuptools chops are rusty, but try replacing
I tried the following:

```python
setuptools.setup(
    ...
    install_requires=[
        "git+https://github.com/jamalex/notion-py.git@refs/pull/294/merge"
    ],
    ...
)
```
I have also tried the approach from this answer, but it has a problem when uploading to PyPI.
I need to find a way to declare a git repo as a direct dependency when uploading to PyPI.
It looks like your original approach was the right way to do it in setuptools, but I'm not sure how you can point to a git ref...
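For what it's worth, the `name @ url` form in the first attempt is PEP 508 "direct reference" syntax, which is the form setuptools expects in `install_requires`; a bare `git+https://...` string on its own is not a valid requirement because it carries no project name. A minimal setup.py sketch (the package name and version are placeholders), with the caveat already noted in this thread that PyPI rejects uploads whose dependencies use direct URL references:

```python
# setup.py sketch -- "my-package" and the version are placeholders.
import setuptools

setuptools.setup(
    name="my-package",
    version="0.1.0",
    install_requires=[
        # Valid PEP 508 direct reference: "name @ url".
        "notion @ git+https://github.com/jamalex/notion-py.git@refs/pull/294/merge"
    ],
    # Note: PyPI refuses an upload whose dependencies are direct URL
    # references, so this form only works for installs from source/VCS,
    # which matches the upload problem described above.
)
```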
I also couldn't find a way to upload a package that has a git repo as a dependency to PyPI. Instead, I solved the problem by importing the forked git version directly. Thank you for your reply.
Hi @jamalex, can this be merged?
Have you found a way to load/read more than 100 records at a time?
Please audit and merge if possible. This is affecting many users!
disable get_aggregate
Use the commit:
Is there a reason this hasn't been merged yet? It's breaking lots of things :)
FWIW, I have started moving all my code across to the official Notion API. Sick of depending on a PR for a fix.
I ran the command to install the commit with pip, but a setup.py error occurred. I ran the following, where fe8ac41e1c10a229c040bd7cadd9429a66fbcda3 is the hash of this commit:

```shell
pip install git+https://github.com/jamalex/notion-py.git@fe8ac41e1c10a229c040bd7cadd9429a66fbcda3
```

I get an error related to setup.py.
I closed this PR because it was included in #352, which was merged into the master branch. I'll reopen it again; I didn't realise closing this would have such a side effect.
Even better, I didn't realize the changes had been merged in elsewhere. I'll switch to head, ty!
#292
Although setting a new fixed limit works, adding an optional limit argument would be better in case Notion decides to change the query limit in the future.