Backup of likes doesn't stop at expected count #217
Comments
If it's caused by a regression, I have a suspicion commit 4961a2f (script here) would work better for you. But chances are https://github.com/aggroskater/tumblr-utils still has better support for likes. |
It's possible that this condition is failing: Line 223 in da3370c
You could try replacing that line with this more verbose code:

    if doc.get('meta', {}).get('status', 0) != 200:
        sys.stderr.write('API response has non-200 status:\n{}\n'.format(doc))
        return None
    return doc
|
I don't get what the more verbose code does differently, except for printing the document if the response code is not 200. How would this help in this situation? |
I tried this commit and it did indeed finish and created the HTML file. However, it clearly did not save all my likes, so it's still not working for me. |
This did not work either. |
@bbolli If that |
@sldx12 If the older commit worked for you then try the latest version of tumblr-utils, which has a potential fix for this issue. If neither gets all your likes then aggroskater's might -- it walks them by timestamp instead of offset. |
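For anyone unfamiliar with the difference, here is a minimal sketch of what walking likes by timestamp (the "before" parameter) rather than by offset looks like. It is an illustration only, not code from either project; the API key, blog name, use of the requests library, and the liked_timestamp field are all assumptions:

    # Hypothetical sketch of timestamp-based ("before") pagination; not the
    # actual code of either fork. API_KEY and BLOG are placeholders, and the
    # endpoint/field names follow the public Tumblr v2 API as I understand it.
    import requests

    API_KEY = 'REPLACE_ME'
    BLOG = 'example.tumblr.com'
    URL = 'https://api.tumblr.com/v2/blog/{}/likes'.format(BLOG)

    def walk_likes_by_timestamp():
        before = None  # start from the newest like
        while True:
            params = {'api_key': API_KEY, 'limit': 20}
            if before is not None:
                params['before'] = before
            posts = requests.get(URL, params=params).json()['response'].get('liked_posts', [])
            if not posts:
                break  # ran out of likes
            for post in posts:
                yield post
            # Advance by the oldest like's timestamp instead of an offset,
            # which sidesteps any cap on the offset parameter.
            before = posts[-1]['liked_timestamp']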
@cebtenzzre none of those worked. The older commit still has the same bug, and aggroskater's fork doesn't get all my likes. |
Does the latest version (download it fresh from GitHub or update your clone if you made one) still try to download more likes than expected? If that's fixed, we can close this issue and open a new one for not downloading all of the likes. |
Yes, the latest version still tries to download more likes than expected. |
I can't reproduce the issue on a test blog with ~30 likes - I thought I could at one point, but I realized I didn't have enough likes to prove my theory. Could you add print len(posts) as a debug line and post what it prints? |
@cebtenzzre I'm not sure if this is what you asked me, but here's the result I got: blogname: Getting posts 0 to 49 (of 1410 expected) 41 |
Yeah, that's what I wanted to see. I see two problems:
    try:
        print '\nnext before is {}'.format(soup['response']['_links']['next']['query_params']['before'])
    except KeyError:
        print '\nno next before, should probably stop'
    posts = _get_content(soup)
|
Sometimes, at least when backing up likes, the API can get stuck endlessly returning the same set of posts instead of returning an empty list. Inspect _links and stop if the offset/before fails to change. Fixes bbolli#217
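A minimal sketch of the stop condition that commit message describes, under the assumption that soup is the parsed JSON API response; the helper names below are illustrative, not the actual code from the PR:

    # Illustrative sketch only: stop when the pagination cursor stops advancing.
    def next_before(soup):
        """Return the next page's 'before' value, or None if there is no next link."""
        try:
            return soup['response']['_links']['next']['query_params']['before']
        except KeyError:
            return None

    def should_stop(soup, last_before):
        """True if there is no next page, or the 'before' cursor did not change."""
        before = next_before(soup)
        return before is None or before == last_before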
@cebtenzzre Oh, sorry. I don't think that the first potential fix saved all my likes. It's hard to tell because I have to stop the script manually, so it never generates the HTML file. However, I looked at the media folder and it didn't look like it had all the likes. The script from PR #219 stopped on its own, gave the following output, and did not save all my likes: https://pastebin.com/ktJPmL89 |
Leave this issue open so it will be closed when (if?) the PR is merged. The issue of not all likes being downloaded even with the PR is probably Issue #118, so discuss that there. According to that issue, the offset parameter is limited to 1,000 for likes, which explains why anything past offset=1000 returns the same posts as offset=1000; I had forgotten about this, as I use aggroskater's fork for likes anyway. If not even aggroskater's fork backs up all of your likes, feel free to open a new issue. |
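If you want to confirm the offset cap yourself, a quick check along these lines should show it (again assuming the public likes endpoint, a placeholder API key, and the requests library; this is not part of tumblr-utils):

    # Hypothetical check: a request past offset=1000 should return the same
    # first post as offset=1000 does if the API clamps the offset.
    import requests

    API_KEY = 'REPLACE_ME'  # placeholder
    URL = 'https://api.tumblr.com/v2/blog/example.tumblr.com/likes'

    def first_liked_id(offset):
        resp = requests.get(URL, params={'api_key': API_KEY, 'offset': offset}).json()
        posts = resp['response'].get('liked_posts', [])
        return posts[0]['id'] if posts else None

    print(first_liked_id(1000))
    print(first_liked_id(1200))  # expected to print the same id if the cap applies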
When backing up likes, the API repeats responses past offset=1000. Inspect _links and stop if the "before" parameter fails to change. Fixes bbolli#217
I just installed tumblr-utils and, while trying to download all my likes, it just keeps running even after passing the expected number of posts. Example: