feat(api/map): use new RPCs + set limit max to 100k #2065

Merged
mogery merged 1 commit into main from mogery/eng-3308-map-better-rpc
Aug 29, 2025

Conversation

@mogery (Member) commented Aug 29, 2025

Summary by cubic

Switch map index queries to the new RPCs (query_index_at_split_level_with_meta_2 and query_index_at_domain_split_level_with_meta_2) and remove the newer-than filter to match their API. Raise the map request limit max from 30k to 100k to support larger crawls; aligns with ENG-3308.

@mogery mogery merged commit b111da4 into main Aug 29, 2025
7 of 11 checks passed
@cubic-dev-ai (Contributor, bot) left a comment


1 issue found across 2 files

React with 👍 or 👎 to teach cubic. You can also tag @cubic-dev-ai to give feedback, ask questions, or re-run the review.

     search: z.string().optional(),
     sitemap: z.enum(["only", "include", "skip"]).default("include"),
-    limit: z.number().min(1).max(30000).default(5000),
+    limit: z.number().min(1).max(100000).default(5000),

The schema allows limit up to 100k, but the server caps results at 30k (MAX_MAP_LIMIT), silently truncating requests above 30k; align the schema with the backend cap, or raise the backend cap to match.

Prompt for AI agents
Address the following comment on apps/api/src/controllers/v2/types.ts at line 624:

<comment>Schema allows limit up to 100k, but server caps results at 30k (MAX_MAP_LIMIT) causing silent truncation for requests >30k; align schema with backend cap or raise the backend cap to match.</comment>

<file context>
@@ -621,7 +621,7 @@ export const mapRequestSchema = crawlerOptions
     search: z.string().optional(),
    sitemap: z.enum(["only", "include", "skip"]).default("include"),
-    limit: z.number().min(1).max(30000).default(5000),
+    limit: z.number().min(1).max(100000).default(5000),
     timeout: z.number().positive().finite().optional(),
     useMock: z.string().optional(),
</file context>
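The fix the reviewer suggests — keeping the schema's upper bound and the server-side cap from drifting apart — can be sketched by deriving both from a single shared constant. This is an illustrative sketch, not the repository's actual code: only the name `MAX_MAP_LIMIT` appears in the review, and the plain `parseMapLimit` helper below stands in for the zod schema.

```typescript
// Hypothetical sketch: one shared constant is the single source of truth
// for both request validation and the backend result cap, so raising the
// limit in one place can never silently truncate in the other.
export const MAX_MAP_LIMIT = 100_000; // named after the constant in the review
export const DEFAULT_MAP_LIMIT = 5_000;

// Stands in for `limit: z.number().min(1).max(MAX_MAP_LIMIT).default(DEFAULT_MAP_LIMIT)`.
export function parseMapLimit(raw: unknown): number {
  if (raw === undefined) return DEFAULT_MAP_LIMIT;
  if (typeof raw !== "number" || !Number.isFinite(raw)) {
    throw new Error("limit must be a finite number");
  }
  if (raw < 1 || raw > MAX_MAP_LIMIT) {
    throw new Error(`limit must be between 1 and ${MAX_MAP_LIMIT}`);
  }
  return raw;
}
```

With the zod schema and the backend cap both reading `MAX_MAP_LIMIT`, a request for more than the cap is rejected at validation time instead of being silently truncated to 30k.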
