fix: preserve Python types in L1-only mode, allow cache_clear() on async#117

Merged
27Bslash6 merged 2 commits into main from fix/l1-only-type-preservation on May 16, 2026
Conversation

Contributor

@27Bslash6 27Bslash6 commented May 16, 2026

Summary

Root Cause

The L1-only code path serialized results into MessagePack bytes on a cache miss but returned the original object. On a cache hit, it deserialized those bytes and returned a different type (tuples → lists, sets → lists). As a result, type(fn()) != type(fn()) across consecutive calls.
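The bug can be reproduced with a small self-contained sketch. The serialization step below is a stand-in for MessagePack, which encodes tuples, sets, and frozensets as arrays; `_store`, `fake_msgpack_roundtrip`, and `cached_call` are hypothetical names for illustration, not cachekit internals:

```python
# Hypothetical sketch of the buggy L1-only path.
_store: dict = {}

def fake_msgpack_roundtrip(obj):
    # Stand-in for a MessagePack round trip: it has no tuple/set/frozenset
    # types, so all three come back as lists.
    if isinstance(obj, (tuple, set, frozenset)):
        return list(obj)
    return obj

def cached_call(key, fn):
    if key in _store:
        # Hit: a deserialized copy with a possibly different type.
        return fake_msgpack_roundtrip(_store[key])
    result = fn()
    _store[key] = result  # Miss: stored, but the ORIGINAL object is returned.
    return result

first = cached_call("k", lambda: (1, 2, 3))   # miss -> tuple
second = cached_call("k", lambda: (1, 2, 3))  # hit  -> list
assert type(first) is not type(second)
```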

Design

@cache(backend=None) now uses ObjectCache (same as @cache.local()) for storage. This means:

  • Raw Python objects stored by reference (like lru_cache)
  • No serialization overhead in L1-only mode
  • Types preserved perfectly

Users who need serialization consistency with L2 (for upgrade path testing) can use @cache(backend=None, serializer='standard') — but the default behavior should match user expectations.
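The fixed behaviour can be sketched in a few lines, assuming a minimal dict-backed store standing in for ObjectCache (the real ObjectCache also handles TTLs and eviction; names here are illustrative):

```python
# Minimal sketch of the fixed L1-only behaviour: store by reference,
# like functools.lru_cache, with no serialization round trip.
_object_cache: dict = {}

def cached_call(key, fn):
    if key in _object_cache:
        return _object_cache[key]   # Hit: the very same object.
    result = fn()
    _object_cache[key] = result     # Stored by reference.
    return result

first = cached_call("k", lambda: (1, 2, 3))
second = cached_call("k", lambda: (1, 2, 3))
assert first is second              # Identity, and therefore type, preserved.
```

Storing by reference means mutable cached values are shared between callers, the same trade-off lru_cache makes.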

Test plan

  • 1401 unit tests pass (0 regressions)
  • Tuple/set/frozenset preserved across miss→hit (verified manually)
  • cache_clear() works on async L1-only functions
  • cache_clear() still raises TypeError for async+backend (intentional)
  • Invalidation (per-key and bulk) works with new storage
  • Lint + type check pass
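The semantics the test plan verifies match the stdlib reference behaviour. The sketch below uses functools.lru_cache (which cachekit's L1-only mode is now compared to in the Design section) to show the expected miss→hit identity and synchronous cache_clear():

```python
from functools import lru_cache

# lru_cache is the reference behaviour: raw objects stored by reference,
# synchronous cache_clear() regardless of how results are consumed.
@lru_cache(maxsize=None)
def make_frozen():
    return frozenset({1, 2, 3})

miss = make_frozen()
hit = make_frozen()
assert type(miss) is frozenset
assert miss is hit                   # Same object across miss -> hit.

make_frozen.cache_clear()            # Sync clear, no backend I/O involved.
assert make_frozen.cache_info().currsize == 0
```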

Closes #73
Closes #76

Summary by CodeRabbit

  • Bug Fixes

    • Cache invalidation now clears entries from the additional in-memory L1 storage to avoid stale hits.
    • Clearing caches for async-decorated functions no longer raises in L1-only mode.
  • Improvements

    • L1-only mode preserves raw Python objects to skip serialization/deserialization and reduce overhead.
  • Tests

    • Updated tests cover the new async-clear semantics and L1-only behavior.


@coderabbitai

coderabbitai Bot commented May 16, 2026

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: e6bde1d6-ceeb-433e-b7a8-5c2393eed156

📥 Commits

Reviewing files that changed from the base of the PR and between ee843fa and b890434.

📒 Files selected for processing (2)
  • src/cachekit/decorators/wrapper.py
  • tests/unit/test_cache_clear_async.py
🚧 Files skipped from review as they are similar to previous changes (2)
  • src/cachekit/decorators/wrapper.py
  • tests/unit/test_cache_clear_async.py

📝 Walkthrough

Walkthrough

L1-only mode now stores raw Python objects via ObjectCache; sync and async wrappers read/write this cache and track keys; invalidate_cache/ainvalidate_cache remove ObjectCache entries; cache_clear() works synchronously for async wrappers in L1-only mode. Tests updated to reflect these behaviors.

Changes

L1-only mode with object type preservation

  • ObjectCache import and initialization (src/cachekit/decorators/wrapper.py): ObjectCache is imported and _object_cache is conditionally created for L1-only mode to store raw Python objects without serialization.
  • Sync and async L1-only wrappers (src/cachekit/decorators/wrapper.py): Both sync and async wrapper code paths retrieve and store raw objects directly in _object_cache with TTL fallback, returning cached objects on hit and tracking written keys.
  • Cache invalidation for ObjectCache (src/cachekit/decorators/wrapper.py): invalidate_cache and ainvalidate_cache delete tracked keys from _object_cache for bulk and single-key invalidation, falling back to _l1_cache when _object_cache is absent.
  • Async cache_clear() for L1-only mode (src/cachekit/decorators/wrapper.py): cache_clear() now raises TypeError for async functions only when a backend is configured; in L1-only mode, sync cache_clear() works for async-decorated functions.
  • Test validation (tests/unit/test_cache_clear_async.py): Tests assert that L1-only cache_clear() does not raise for async wrappers and clears entries, that cache_clear() raises when a backend exists, and that ainvalidate_cache() continues to work.

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

Possibly related PRs

  • cachekit-io/cachekit-py#108: Both PRs modify cache invalidation paths in wrapper.py to clear tracked keys; this PR extends that logic to also clear _object_cache when present.
  • cachekit-io/cachekit-py#96: Overlaps on introducing and using ObjectCache for local in-memory caching semantics referenced by this change.

Poem

🐰 I stash objects soft and light,
No MessagePack to change their type,
Async clears now hop in line,
Keys are tracked, entries decline,
Fresh results bloom — carrots shine!

🚥 Pre-merge checks | ✅ 4 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage (⚠️ Warning): Docstring coverage is 38.89%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.

✅ Passed checks (4 passed)

  • Title check (✅ Passed): The title 'fix: preserve Python types in L1-only mode, allow cache_clear() on async' clearly and concisely describes the two main changes: type preservation for L1-only mode and cache_clear() support for async functions.
  • Description check (✅ Passed): The PR description comprehensively covers the changes with clear sections on Summary, Root Cause, Design, and Test plan. It documents the two issues addressed (#73 and #76), explains the motivation, and confirms test results.
  • Linked Issues check (✅ Passed): The code changes fully address both linked issues: #73 is resolved by using ObjectCache for L1-only mode to preserve Python types without serialization; #76 is resolved by allowing cache_clear() on async functions in L1-only mode while retaining TypeError for async+backend.
  • Out of Scope Changes check (✅ Passed): All changes in wrapper.py and test_cache_clear_async.py are directly scoped to #73 (type preservation via ObjectCache) and #76 (cache_clear() for async L1-only functions). No unrelated modifications detected.



@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 2

🧹 Nitpick comments (1)
tests/unit/test_cache_clear_async.py (1)

98-128: ⚡ Quick win

This test never exercises the async+backend invalidation path.

The class/docstring says this is the recommended path for async functions with a backend, but @cache(backend=None) only covers the L1-only branch. Please either rename it to match the scenario under test or back it with a backend double so the async backend invalidation path is actually covered.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@tests/unit/test_cache_clear_async.py` around lines 98 - 128, The test
TestAsyncInvalidateCacheStillWorks currently uses `@cache`(backend=None) so it
only exercises the L1-only path; either update the test to reflect that by
renaming the class/docstring to indicate L1-only, or change the decorator to
supply a backend test double (e.g., a simple async-capable backend mock) so
async_func and its method ainvalidate_cache actually exercise the async+backend
invalidation path; modify the test to instantiate and pass that backend into
cache(...) and keep calling async_func.ainvalidate_cache(5) to validate backend
invalidation behavior.
🤖 Prompt for all review comments with AI agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

Inline comments:
In `@src/cachekit/decorators/wrapper.py`:
- Around line 587-593: The wrapper currently coerces ttl to 300 on cache writes,
violating create_cache_wrapper()'s contract that ttl=None means no expiration;
update the write sites (in the cache wrapper where _object_cache.put(cache_key,
result, ttl=ttl or 300) and the similar block around lines 912-917) to pass ttl
through unchanged (or translate None to the explicit sentinel expected by
ObjectCache.put) instead of using "ttl or 300", ensuring that ttl=None remains
unbounded expiry when calling ObjectCache.put with cache_key, result.
- Around line 579-585: In the L1-only fast-return path inside the wrapper (when
_l1_only_mode is true and _object_cache.get(cache_key) yields found), call
features.clear_correlation_id() before resetting function stats and returning
the cached value so the correlation context is not preserved across requests;
i.e., insert a call to features.clear_correlation_id() just prior to
reset_current_function_stats(token) / return cached_value in the branch that
currently invokes _stats.record_l1_hit().
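The ordering change described in the second inline comment can be sketched with stubs. All names below (`Features`, `reset_current_function_stats`, `l1_hit_return`) mirror the review comment for illustration only and are not cachekit's actual internals:

```python
# Sketch of the fixed L1-hit fast path: clear the correlation ID BEFORE
# resetting stats and returning, so request context does not leak into
# the next call. The `calls` list records the order of operations.
calls = []

class Features:
    def clear_correlation_id(self):
        calls.append("clear_correlation_id")

def reset_current_function_stats(token):
    calls.append("reset_stats")

features = Features()

def l1_hit_return(cached_value, token):
    features.clear_correlation_id()      # the fix: clear context first
    reset_current_function_stats(token)
    return cached_value

result = l1_hit_return(42, object())
assert result == 42
assert calls == ["clear_correlation_id", "reset_stats"]
```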

---

Nitpick comments:
In `@tests/unit/test_cache_clear_async.py`:
- Around line 98-128: The test TestAsyncInvalidateCacheStillWorks currently uses
`@cache`(backend=None) so it only exercises the L1-only path; either update the
test to reflect that by renaming the class/docstring to indicate L1-only, or
change the decorator to supply a backend test double (e.g., a simple
async-capable backend mock) so async_func and its method ainvalidate_cache
actually exercise the async+backend invalidation path; modify the test to
instantiate and pass that backend into cache(...) and keep calling
async_func.ainvalidate_cache(5) to validate backend invalidation behavior.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: a865f557-8877-4d5b-bc91-de7d3ef68c8c

📥 Commits

Reviewing files that changed from the base of the PR and between 3c3ba52 and 7b8ab0e.

📒 Files selected for processing (2)
  • src/cachekit/decorators/wrapper.py
  • tests/unit/test_cache_clear_async.py

@codecov

codecov Bot commented May 16, 2026

Codecov Report

❌ Patch coverage is 90.90909% with 3 lines in your changes missing coverage. Please review.
✅ All tests successful. No failed tests found.

Files with missing lines:
  • src/cachekit/decorators/wrapper.py: patch 90.90%, 0 missing and 3 partials ⚠️


fix: preserve Python types in L1-only mode, allow cache_clear() on async

L1-only mode (`backend=None`) now stores raw Python objects via ObjectCache
instead of serializing through MessagePack. This fixes:

- #73: tuples, sets, frozensets no longer degrade to lists on cache hit
- #76: cache_clear() works synchronously on async functions in L1-only mode
  (no backend I/O needed, so no reason to force async invalidation)

The type inconsistency was particularly insidious: first call (miss) returned
the original type, second call (hit) returned the deserialized type. Now both
return the same object.

Closes #73
Closes #76
27Bslash6 force-pushed the fix/l1-only-type-preservation branch from 7b8ab0e to ee843fa on May 16, 2026 10:41
- Fix ttl=None coercion: use 1-year sentinel (31536000s) instead of
  hardcoded 300s, preserving the contract that ttl=None means no expiry
- Clear correlation ID on L1 cache hit (both sync and async paths) to
  prevent context leaking across requests
- Rename TestAsyncInvalidateCacheStillWorks to reflect it only tests
  L1-only path
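The ttl fix in this commit can be illustrated with a minimal sketch. The helper names below are hypothetical; the constant matches the 1-year sentinel (31536000 s) named in the commit message:

```python
ONE_YEAR = 31_536_000  # sentinel for "effectively no expiry" when ttl=None

def effective_ttl_buggy(ttl):
    # The original bug: `ttl or 300` coerces both None AND 0 to 300 seconds,
    # silently capping entries that were meant to never expire.
    return ttl or 300

def effective_ttl_fixed(ttl):
    # The fix: only None maps to the sentinel; every explicit value passes through.
    return ONE_YEAR if ttl is None else ttl

assert effective_ttl_buggy(None) == 300        # contract violated
assert effective_ttl_fixed(None) == ONE_YEAR   # contract preserved
assert effective_ttl_fixed(10) == 10
```

Testing `x is None` instead of truthiness is the standard way to avoid the falsy-zero trap in Python.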
27Bslash6 merged commit 1fc506b into main on May 16, 2026
32 checks passed
27Bslash6 deleted the fix/l1-only-type-preservation branch on May 16, 2026 11:31


Development

Successfully merging this pull request may close these issues:

  • bug: async cached functions use ainvalidate_cache() not cache_clear()
  • bug: L1-only mode serializes data unnecessarily (tuples→lists)
