
Improve handling of failed crawling jobs #169

Open
kamtschatka opened this issue May 19, 2024 · 1 comment
@kamtschatka
Contributor

kamtschatka commented May 19, 2024

I have a few bookmarks imported from Chrome that fail during crawling with 404/500 (and similar) errors. They still get added (which is fine), but since the crawler receives a response, they also go through Ollama for tagging, and there is no way to filter for them in the UI (no indication that something went "wrong" during crawling, i.e. that the status code indicated an error).
It would be great if there were a filter that shows bookmarks whose crawl failed, and if such bookmarks (status code 400 and above) were excluded from AI tagging.
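The requested behavior could be sketched roughly as below. This is only an illustrative sketch, not hoarder's actual crawler code; the names `CrawlResult`, `markCrawlOutcome`, and `shouldRunAiTagging` are hypothetical.

```typescript
// Hypothetical sketch: record whether a crawl failed based on the HTTP
// status code, so the UI can filter on it and AI tagging can skip it.
// None of these names come from hoarder's codebase.

interface CrawlResult {
  url: string;
  statusCode: number;
  crawlFailed: boolean; // persisted so a UI filter can show failed crawls
}

function markCrawlOutcome(url: string, statusCode: number): CrawlResult {
  // Treat any 4xx/5xx response as a failed crawl, per the request above.
  return { url, statusCode, crawlFailed: statusCode >= 400 };
}

function shouldRunAiTagging(result: CrawlResult): boolean {
  // Exclude bookmarks whose crawl returned 400+ from Ollama tagging.
  return !result.crawlFailed;
}

const ok = markCrawlOutcome("https://example.com", 200);
const missing = markCrawlOutcome("https://example.com/gone", 404);
console.log(shouldRunAiTagging(ok));      // true
console.log(shouldRunAiTagging(missing)); // false
```

The key point is persisting the failed/succeeded flag on the bookmark rather than only logging it, since the UI filter and the tagging pipeline both need to read it.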

@MohamedBassem
Collaborator

Hmmm, I'm not sure I understand what you mean by not found in the UI? Even if the crawler fails, the bookmark will still be shown in the UI.
