
perf: upgrade analysis cache from FIFO to LRU eviction#246

Merged
askpt merged 2 commits into main from repo-assist/perf-lru-cache-2026-04-07-75018df41e67a7cc on Apr 10, 2026
Conversation

Contributor

@github-actions github-actions Bot commented Apr 7, 2026

🤖 This PR was created by Repo Assist, an automated AI assistant.

Summary

The analysis result cache in MetricsAnalyzerFactory.analyzeFile previously used FIFO eviction: when the cache was full, the first-inserted entry was deleted, regardless of whether it had been accessed recently.

This PR switches it to LRU (least-recently-used) eviction: on every cache hit, the entry is deleted and re-inserted, which moves it to the end of the Map's insertion order. The entry at the front is then always the least recently used, so eviction removes the entry least likely to be needed again.
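The delete-and-re-insert trick relies on JavaScript's `Map` preserving insertion order. A minimal standalone sketch of the pattern (the `AnalysisResult` shape here is a stand-in, not the extension's real type; only `CACHE_MAX_SIZE = 20` comes from the PR):

```typescript
const CACHE_MAX_SIZE = 20;

type AnalysisResult = { issues: string[] };

const analysisCache = new Map<string, AnalysisResult>();

function cacheGet(key: string): AnalysisResult | undefined {
  const cached = analysisCache.get(key);
  if (cached !== undefined) {
    // Promote to most-recently-used: delete + set moves the entry
    // to the back of the Map's insertion order.
    analysisCache.delete(key);
    analysisCache.set(key, cached);
  }
  return cached;
}

function cacheSet(key: string, value: AnalysisResult): void {
  if (analysisCache.size >= CACHE_MAX_SIZE) {
    // After hit-promotion, the first key in iteration order is
    // the least recently used, so evicting it gives LRU semantics.
    const lru = analysisCache.keys().next().value;
    if (lru !== undefined) {
      analysisCache.delete(lru);
    }
  }
  analysisCache.set(key, value);
}
```

With FIFO, the same eviction line would remove the first-*inserted* key; the only behavioral change is the promotion on hit.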

Why this matters

In a typical VS Code workflow a developer cycles between a small number of open files. With LRU eviction those files stay warm in the 20-entry cache even in a large workspace, whereas with FIFO they could be evicted by infrequently-opened files that happened to be visited first.

Changes

  • src/metricsAnalyzer/metricsAnalyzerFactory.ts: on cache hit, delete + set to promote to MRU position; updated JSDoc comment.

Test Status

  • npm run compile ✅ no errors
  • npm run lint ✅ no errors
  • npm run test:unit ✅ all 35 existing tests pass

No new tests in this PR (cache-behaviour tests are covered in the companion PR repo-assist/test-coverage-improvements-2026-04-07).

Generated by 🌈 Repo Assist, see workflow run. Learn more.

To install this agentic workflow, run

gh aw add githubnext/agentics/workflows/repo-assist.md@7c7feb61a52b662eb2089aa2945588b7a200d404

When a cache entry is hit, delete and re-insert it so it moves to the
back of the Map's insertion-order. This means the entry deleted on
eviction is always the least-recently-used rather than the first
inserted, which better retains hot files (e.g. the file a developer
is currently editing) across larger workspaces.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
@askpt askpt changed the title [Repo Assist] perf: upgrade analysis cache from FIFO to LRU eviction perf: upgrade analysis cache from FIFO to LRU eviction Apr 10, 2026
@askpt askpt marked this pull request as ready for review April 10, 2026 06:39
@askpt askpt self-requested a review as a code owner April 10, 2026 06:39
Copilot AI review requested due to automatic review settings April 10, 2026 06:39
@askpt askpt enabled auto-merge (squash) April 10, 2026 06:39

codecov Bot commented Apr 10, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 68.83%. Comparing base (27d31cc) to head (805635e).
⚠️ Report is 1 commit behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #246      +/-   ##
==========================================
+ Coverage   68.80%   68.83%   +0.03%     
==========================================
  Files           8        8              
  Lines        2936     2939       +3     
  Branches      283      283              
==========================================
+ Hits         2020     2023       +3     
  Misses        914      914              
  Partials        2        2              

☔ View full report in Codecov by Sentry.

@askpt askpt merged commit 37d89c1 into main Apr 10, 2026
11 checks passed
@askpt askpt deleted the repo-assist/perf-lru-cache-2026-04-07-75018df41e67a7cc branch April 10, 2026 06:40
Contributor

Copilot AI left a comment


Pull request overview

This PR updates the in-memory analysis result cache used by MetricsAnalyzerFactory.analyzeFile to use LRU (least-recently-used) eviction semantics, improving cache hit rates in typical VS Code “switch between a few files” workflows.

Changes:

  • Promotes cache entries on hit by delete + set to move them to the most-recently-used position in Map insertion order.
  • Keeps eviction logic (when full) but makes it effectively remove the least-recently-used entry due to the new promotion behavior.
  • Updates the cache JSDoc to describe LRU eviction.

const CACHE_MAX_SIZE = 20;

- /** Cache of analysis results keyed by language + content hash. Evicts oldest entry when full. */
+ /** Cache of analysis results keyed by language + content hash. Evicts least-recently-used entry when full. */

Copilot AI Apr 10, 2026


Cache key generation includes sourceText.length in addition to language + hash, but the cache JSDoc says it’s keyed only by language + content hash. Please update the comment to match the actual key components (or remove the length component if it’s not needed).

Suggested change
/** Cache of analysis results keyed by language + content hash. Evicts least-recently-used entry when full. */
/** Cache of analysis results keyed by language + source text length + content hash. Evicts least-recently-used entry when full. */

Copilot uses AI. Check for mistakes.
Comment on lines 118 to 122
if (cached) {
// Move to end to maintain LRU order (most recently used stays at back)
analysisCache.delete(cacheKey);
analysisCache.set(cacheKey, cached);
return cached;

Copilot AI Apr 10, 2026


LRU behavior is implemented by delete+set on cache hit, but there isn’t a test that verifies LRU eviction semantics (i.e., a recently-accessed entry should survive an eviction while the true least-recently-used entry is evicted). Please add a unit test that fills the cache, re-accesses one key to promote it, then inserts one more entry and asserts the promoted key is still cached (e.g., via reference equality) while the oldest non-promoted key is recomputed.
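The test being requested could be sketched roughly as follows, against a tiny stand-in cache rather than the real MetricsAnalyzerFactory (whose internals are not shown in this PR); `analyze`, `MAX`, and the result objects are hypothetical, and a small size is used to keep the scenario readable. Fresh objects on recomputation let reference equality distinguish a hit from a miss:

```typescript
const MAX = 3; // small stand-in for CACHE_MAX_SIZE, to keep the test short
const cache = new Map<string, { id: number }>();
let nextId = 0;

// Stand-in for the cached analyzeFile path: hit promotes to MRU,
// miss evicts the LRU entry (if full) and computes a fresh result.
function analyze(key: string): { id: number } {
  const hit = cache.get(key);
  if (hit) {
    cache.delete(key);
    cache.set(key, hit); // promote to MRU
    return hit;
  }
  if (cache.size >= MAX) {
    const lru = cache.keys().next().value;
    if (lru !== undefined) cache.delete(lru); // evict least recently used
  }
  const fresh = { id: nextId++ };
  cache.set(key, fresh);
  return fresh;
}

// Fill the cache, promote "a", then insert one more entry.
const a1 = analyze("a");
const b1 = analyze("b");
analyze("c");
const a2 = analyze("a"); // hit: promoted, same object returned
analyze("d");            // cache full, so this evicts "b" (the true LRU)
const a3 = analyze("a"); // still cached
const b2 = analyze("b"); // was evicted, so this is a recomputation
```

Under FIFO the same sequence would evict "a" when "d" arrives, so `a3` would be a new object; asserting `a1 === a3` and `b1 !== b2` pins down the LRU semantics.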

