As @Kirkman found in #597, a scraper can stop producing output without triggering an error in the workflow.

While a few states keep WARN and non-WARN layoffs in the same database, it's unlikely that many states would ever have a reduction in the number of incidents reported in the files being scraped. So if a state moves from 283 reports to 123 reports, or to 0 reports, that should get flagged. Simple row counts of the CSVs compared against earlier snapshots would have caught the Missouri problem.
A weekly GitHub Action built into the warn-support repo, perhaps?
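The row-count comparison could be sketched roughly like this. This is a minimal illustration, not the project's actual code: the `exports/` and `snapshots/` directory names and the per-state CSV filenames are assumptions for the example.

```python
import csv
from pathlib import Path

# Assumed layout (hypothetical): latest scraper output in exports/,
# a prior copy of the same per-state CSVs in snapshots/.
CURRENT_DIR = Path("exports")
SNAPSHOT_DIR = Path("snapshots")


def row_count(path: Path) -> int:
    """Count data rows in a CSV, excluding the header."""
    with path.open(newline="", encoding="utf-8") as f:
        return max(sum(1 for _ in csv.reader(f)) - 1, 0)


def flag_shrinking_states(current_dir: Path, snapshot_dir: Path) -> list[str]:
    """Return a warning for any state whose CSV row count dropped."""
    warnings = []
    for current in sorted(current_dir.glob("*.csv")):
        snapshot = snapshot_dir / current.name
        if not snapshot.exists():
            continue  # no earlier snapshot to compare against
        before, after = row_count(snapshot), row_count(current)
        if after < before:
            warnings.append(
                f"{current.stem}: {before} -> {after} rows "
                "(possible scraper failure)"
            )
    return warnings


if __name__ == "__main__":
    for warning in flag_shrinking_states(CURRENT_DIR, SNAPSHOT_DIR):
        print(warning)
```

A scheduled GitHub Action could run a check like this weekly and fail the job (or open an issue) whenever the list of warnings is non-empty.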