
feature: Recrawl failed links #95

Merged
merged 5 commits into from
Apr 11, 2024

Conversation

AhmadMuj
Contributor

  • Added a backoff to the Links Crawler Queue
  • Added a new field (crawlStatus) to the bookmarkLinks table
  • Added a new button on the web UI to recrawl the failed URLs based on crawlStatus
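The retry backoff mentioned above could work along these lines (an illustrative sketch only, not the PR's actual implementation; the function name and parameters are hypothetical):

```typescript
// Hypothetical sketch of an exponential backoff schedule for crawl retries.
// baseMs and maxMs are illustrative defaults, not values from the PR.
function backoffDelayMs(attempt: number, baseMs = 1000, maxMs = 60_000): number {
  // attempt 1 -> 1s, attempt 2 -> 2s, attempt 3 -> 4s, ... capped at maxMs
  return Math.min(baseMs * 2 ** (attempt - 1), maxMs);
}
```

Each failed crawl job would be re-enqueued with a growing delay, so a flaky site does not get hammered with immediate retries.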


Collaborator

@MohamedBassem left a comment


That was fast! It looks good to me, I requested only some minor changes :)

EDIT: Could you also rebase your branch to get the PDF commits out of the PR? They're not that big of a problem though :)

Comment on lines 184 to 174
<TableRow>
<TableCell className="lg:w-2/3">Failed Crawling Jobs</TableCell>
<TableCell>{serverStats.failedCrawls}</TableCell>
</TableRow>
Collaborator


How about we change the table to be a bit more two-dimensional?

As in:

                 Pending  Failed
  Crawling jobs     0       0
  Inference jobs    0       0
  Search jobs       0       -

Contributor Author


For now, the indexing and OpenAI stats are going to be based on the queue status.

@@ -59,7 +69,21 @@ export const adminAppRouter = router({
),
);
}),

recrawlFailedLinks: adminProcedure.mutation(async ({ ctx }) => {
Collaborator


It seems that we can easily add an onlyFailures flag to the existing recrawlAllLinks endpoint instead of creating a new one?
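The flag the reviewer suggests would essentially select which links to re-enqueue. A minimal sketch of that selection logic (hypothetical names and types, not the PR's actual code):

```typescript
// Hypothetical sketch: one recrawl endpoint, with a flag to restrict it to failures.
type CrawlStatus = "pending" | "success" | "failure";

interface BookmarkLink {
  id: string;
  crawlStatus: CrawlStatus;
}

// With onlyFailures=true, only links whose last crawl failed are re-enqueued;
// otherwise every link is recrawled.
function linksToRecrawl(links: BookmarkLink[], onlyFailures: boolean): BookmarkLink[] {
  return onlyFailures ? links.filter((l) => l.crawlStatus === "failure") : links;
}
```

In a tRPC router, onlyFailures would arrive as a validated mutation input and feed a WHERE clause rather than an in-memory filter, but the selection rule is the same.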

Comment on lines +2 to +3
UPDATE bookmarkLinks SET crawlStatus = 'failure' where htmlContent is null;--> statement-breakpoint
UPDATE bookmarkLinks SET crawlStatus = 'success' where htmlContent is not null;
Collaborator


I love that!
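The backfill in that migration maps every existing row onto the new status. The rule it encodes, expressed in TypeScript (illustrative only; the migration itself runs as plain SQL):

```typescript
// Illustrative restatement of the migration's backfill rule:
// links with stored HTML count as successfully crawled, the rest as failed.
function backfillCrawlStatus(htmlContent: string | null): "success" | "failure" {
  return htmlContent !== null ? "success" : "failure";
}
```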

packages/trpc/routers/admin.ts (resolved)
@MohamedBassem
Collaborator

Thanks again, I'm pretty sure a lot of people will find this to be super useful!
