Fix 429 rate limits — batch transport, memory cache, more endpoints#400
Merged
realproject7 merged 2 commits into main on Mar 21, 2026
Conversation
- Enable batch: true on both browserClient and publicClient transports
- Reduce timeouts from 5-10s to 2s for faster endpoint rotation
- Expand CORS and server endpoint lists from 5/8 to 12 each
- Add lib/cache.ts with singleton MemoryCache (60s TTL + in-flight dedup)
- Wrap getTokenPrice, get24hPriceChange, getTokenTVL, getBatchTokenData with priceCache to eliminate duplicate concurrent RPC calls

Fixes #399

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
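A minimal sketch of the MemoryCache described in this commit, assuming a `get(key, fetcher)` API with a TTL store and in-flight promise deduplication; the internals are illustrative, not the project's actual lib/cache.ts:

```typescript
// Illustrative sketch (assumed internals, not the real lib/cache.ts):
// an in-memory cache with a TTL plus in-flight promise deduplication.
type Entry<T> = { value: T; expires: number };

class MemoryCache {
  private values = new Map<string, Entry<unknown>>();
  private inFlight = new Map<string, Promise<unknown>>();

  constructor(private ttlMs: number) {}

  get<T>(key: string, fetcher: () => Promise<T>): Promise<T> {
    const hit = this.values.get(key);
    if (hit && hit.expires > Date.now()) {
      return Promise.resolve(hit.value as T); // fresh TTL hit
    }

    // Dedup: concurrent callers for the same key share one pending fetch.
    const pending = this.inFlight.get(key);
    if (pending) return pending as Promise<T>;

    const promise = fetcher()
      .then((value) => {
        this.values.set(key, { value, expires: Date.now() + this.ttlMs });
        return value;
      })
      .catch((err: unknown) => {
        // Failures are never cached; the rejection reaches every waiter.
        throw err;
      })
      .finally(() => this.inFlight.delete(key));

    this.inFlight.set(key, promise);
    return promise;
  }
}

const priceCache = new MemoryCache(60_000); // 60s TTL, as in the commit
```

Because concurrent callers receive the same pending promise, a burst of identical price lookups costs one RPC call instead of many.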
Collaborator project7-interns approved these changes and left a comment on Mar 21, 2026:
Verdict: APPROVE
Summary
The PR matches issue #399 with the intended narrow scope: batched RPC transports, expanded endpoint pools, and shared in-memory dedup caching for the targeted price helpers. I did not find a correctness or design regression in the changed paths, and the required lint/typecheck job is passing.
Findings
- None.
Decision
Approve: the change satisfies the acceptance criteria and keeps the fix isolated to the requested lib/ surface area.
- Remove try/catch from inside cache fetchers so transient RPC failures propagate as errors (cache.get's .catch path correctly skips caching)
- Outer try/catch at function level still returns null to callers
- When a custom client is passed, bypass the cache entirely to avoid returning stale results from a different RPC client

Addresses T2b review feedback on PR #400.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
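The two fixes in this commit can be sketched as a hypothetical wrapper in the shape of a getTokenPrice-style helper; the cache and client types here are assumptions for illustration, not the project's actual lib/price.ts:

```typescript
// Hypothetical client/cache shapes, assumed for illustration only.
type Client = { fetchPrice: (token: string) => Promise<number> };
type Cache = { get<T>(key: string, f: () => Promise<T>): Promise<T> };

async function getTokenPrice(
  token: string,
  cache: Cache,
  defaultClient: Client,
  customClient?: Client,
): Promise<number | null> {
  try {
    if (customClient) {
      // Custom clients bypass the cache so we never serve a value that
      // was fetched through a different RPC client.
      return await customClient.fetchPrice(token);
    }
    // The fetcher has no inner try/catch: a transient RPC failure rejects,
    // the cache skips storing it, and the error propagates to this level.
    return await cache.get(`price:${token}`, () =>
      defaultClient.fetchPrice(token),
    );
  } catch {
    // Callers still get a graceful null on failure, as before.
    return null;
  }
}
```

Bypassing the cache for custom clients trades a few extra RPC calls for the guarantee that results always come from the client the caller asked for.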
Collaborator project7-interns approved these changes and left a comment on Mar 21, 2026:
Both issues resolved cleanly. Errors now propagate through the cache (no stale null caching), and custom-client calls bypass the cache entirely. LGTM.
Summary
- `batch: true` on both `browserClient` and `publicClient` — viem combines multiple pending `readContract()` calls into a single JSON-RPC batch HTTP request
- In-memory cache (`lib/cache.ts`): 60s TTL + in-flight request deduplication — prevents duplicate concurrent RPC calls for the same token
- `getTokenPrice()`, `get24hPriceChange()`, `getTokenTVL()`, `getBatchTokenData()` all wrapped with `priceCache.get()`

Expected Impact
Files Changed
- `lib/rpc.ts`: `batch: true`, 2s timeout, 12 endpoints, new display names
- `lib/cache.ts`: new file
- `lib/price.ts`: price helpers wrapped with `priceCache.get()`

Test plan
- `npm run typecheck` passes
- `npm run lint` passes

Fixes #399
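For context, the transport changes described for `lib/rpc.ts` would look roughly like this with viem's public API. The `fallback` rotation and the endpoint URLs are assumptions; only `batch: true` and the 2s timeout come from this PR:

```typescript
// Sketch of a viem client config (not the project's actual lib/rpc.ts).
import { createPublicClient, fallback, http } from "viem";
import { mainnet } from "viem/chains";

// Placeholder URLs; the real pool described in the PR has 12 endpoints.
const endpoints = [
  "https://rpc-1.example.com",
  "https://rpc-2.example.com",
];

export const publicClient = createPublicClient({
  chain: mainnet,
  transport: fallback(
    endpoints.map((url) =>
      http(url, {
        batch: true,    // coalesce pending calls into one JSON-RPC batch request
        timeout: 2_000, // 2s timeout so a slow endpoint rotates out quickly
      }),
    ),
  ),
});
```

With batching on, several concurrent `readContract()` calls travel as one HTTP request, which directly reduces pressure on per-request 429 rate limits.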
🤖 Generated with Claude Code