Just realized that our fondamentaux ZIM was drastically smaller than it should be (and previously was): 2.35 GB and 2.54 GB instead of 9.36 GB.
Those numbers were from three different runs a few days apart (Nov 4th – large one, Nov 6th and Nov 10th).
With an exit-code of 0, we had no idea those newer ZIMs were problematic.
While we understand we can't fail on every single error when scraping a generic website, we could still be a little smarter by recording and exposing the number of failed fetches, so that our QA process can evaluate whether the output is OK or not.
@ikreymer, how realistic is adding a count of succeeded/failed fetches? I think the error count in stdout only covers the webpages, right? Those runs didn't have any `1513 / 1513 (100.00%), errors: 0 (0.00%)` line.
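As a stopgap on our side, the QA step could parse the crawler's stdout for a summary line of that shape and flag runs whose error rate looks too high. A minimal sketch, assuming the `N / N (x%), errors: N (x%)` line format quoted above; the function name and the 5% threshold are illustrative choices, not anything the crawler exposes:

```python
import re

# Matches a summary line like "1513 / 1513 (100.00%), errors: 0 (0.00%)".
# The format is taken from the log excerpt in this issue; it is an
# assumption that every healthy run emits such a line.
STATS_RE = re.compile(
    r"(?P<done>\d+) / (?P<total>\d+) \([\d.]+%\), errors: (?P<errors>\d+)"
)

def crawl_looks_healthy(stdout: str, max_error_rate: float = 0.05) -> bool:
    """Return True if the last stats line seen stays under the error threshold."""
    match = None
    for line in stdout.splitlines():
        m = STATS_RE.search(line)
        if m:
            match = m  # keep the most recent stats line
    if match is None:
        return False  # no stats line at all is itself suspicious
    total = int(match.group("total"))
    errors = int(match.group("errors"))
    return total > 0 and errors / total <= max_error_rate

print(crawl_looks_healthy("1513 / 1513 (100.00%), errors: 0 (0.00%)"))  # True
```

This only sees page-level errors, though, which is exactly the limitation raised above: individual failed fetches (assets, media) never reach that counter, so a per-fetch success/failure count from the crawler itself would still be needed.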
This issue has been automatically marked as stale because it has not had recent activity. It will now be reviewed manually. Thank you for your contributions.