I've noticed that when saving URLs, there is now an increased rate of failures, especially from Twitter and uchinokomato.me: 20-40% of attempts fail (before, it was 1-4%). This is not due to excluded URLs, because when I try to save the same URLs again, they are then saved. I was thinking of adding a feature where you add the following to the syntax to make it retry a failed URL (after a user-specified delay, in seconds), repeating until it either succeeds or has failed a user-specified number of times:
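Something along these lines (the command name `savetool` below is only a placeholder for the actual command, and I'm not tied to these exact argument names):

```
savetool <URL> <MaxNumberOfFails> <TimeInSecondsDelayBetweenAttempts>
```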
It would then output either `Page submission failed :-(` or `Your page submitted successfully! (failed x time(s))`.
`x` is the number of failed attempts before the URL is finally saved.

`MaxNumberOfFails` is the maximum number of failed attempts the program will make to save the URL. Once this number is reached, it quits re-attempting and outputs `Page submission failed :-(`.

`<TimeInSecondsDelayBetweenAttempts>` is the delay (in seconds) between re-attempts to save a failed URL. This is needed because, as of yesterday, the 429 error states that you are not to save more than 15 URLs in a minute (so, for example, a delay of at least 4 seconds stays within that limit).
For example: if a save fails, it will try again until it successfully saves the URL and ends (outputting `Your page submitted successfully! (failed 2 time(s))`), then continues with the following command in the batch file. If it fails 3 times, it stops attempting to save, outputs `Page submission failed :-(`, and continues to the next command in the batch file.
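A rough sketch of the behavior I have in mind, assuming a hypothetical `save_once` command that performs a single submission attempt and exits non-zero on failure:

```bash
#!/bin/bash
# Sketch of the proposed retry behavior (save_once is a placeholder
# for whatever performs one save attempt and exits non-zero on failure).
url="$1"
max_fails="${2:-3}"   # MaxNumberOfFails
delay="${3:-60}"      # TimeInSecondsDelayBetweenAttempts

fails=0
while true; do
    if save_once "$url"; then
        echo "Your page submitted successfully! (failed ${fails} time(s))"
        exit 0   # success: the batch file moves on to the next command
    fi
    fails=$((fails + 1))
    if [ "$fails" -ge "$max_fails" ]; then
        echo "Page submission failed :-("
        exit 1   # give up: the batch file still moves on to the next command
    fi
    sleep "$delay"   # wait between re-attempts to stay under the rate limit
done
```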