Deprecating cache utilities (removal in 3.6.0) #237

Closed
simonbasle opened this issue Oct 8, 2020 · 6 comments · Fixed by #265
Labels
for/user-attention This issue needs user attention (feedback, rework, etc...) warn/deprecation This issue/PR introduces deprecations
Comments


simonbasle commented Oct 8, 2020

Motivation

The CacheMono and CacheFlux helpers were an attempt at providing a sane way to adapt various caches and means of caching behind a reactive facade.

There is very little bandwidth to maintain them, and it seems they are not entirely helpful; they may even cause more problems than they solve.

See for example #162, #181, #201 (and recently closed #234...)

When using a caching solution like https://github.com/ben-manes/caffeine that has a future-based async API, we can recommend simply bridging the futures with Mono#toFuture() and Mono.fromFuture(f).
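A minimal sketch of that bridging, assuming Caffeine's AsyncCache (fetchFromRemote is a hypothetical reactive lookup standing in for the real source of values):

```java
import com.github.benmanes.caffeine.cache.AsyncCache;
import com.github.benmanes.caffeine.cache.Caffeine;
import reactor.core.publisher.Mono;

import java.time.Duration;

public class CaffeineMonoBridge {

    private final AsyncCache<String, String> cache = Caffeine.newBuilder()
            .expireAfterWrite(Duration.ofMinutes(10))
            .maximumSize(10_000)
            .buildAsync();

    // Hypothetical reactive lookup, e.g. a remote call returning a Mono.
    private Mono<String> fetchFromRemote(String key) {
        return Mono.just("value-for-" + key);
    }

    public Mono<String> lookup(String key) {
        // AsyncCache#get coalesces concurrent loads of the same key, so the
        // remote Mono is subscribed at most once per cache miss; the resulting
        // CompletableFuture is bridged back to a Mono for the caller.
        return Mono.fromFuture(cache.get(key,
                (k, executor) -> fetchFromRemote(k).toFuture()));
    }
}
```

Since the cached value is the CompletableFuture itself, Caffeine discards entries whose future completes exceptionally, so failed lookups are not cached.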

For the other CacheFlux use cases I could find in the wild, the path forward would be a bit more ambiguous.

A few relevant projects that build on CacheFlux/CacheMono that I could find on GitHub include:

Desired solution

One of the following outcomes:

  1. gather feedback indicating that although it has shortcomings (e.g. cache stampedes), it is useful for use cases that are not too advanced and should be kept as is, in low-maintenance mode
  2. get a community member to step up and spin CacheFlux/CacheMono into a separate community-supported project, and deprecate the reactor.cache classes in favor of said community project
  3. gather feedback that the drawbacks of CacheFlux/CacheMono outweigh the benefits, and deprecate the reactor.cache classes for future removal

pinging the owners of the above projects for feedback:
@deeperunderstanding, @Odysseymoon, @alex-pumpkin, @rezaarshad, @pkgonan, @Hothire, @making, @mmaggioni, @spencergibb.

also pinging users that contributed to discussions in the above issues:
@dave-fl, @cybuch, @wer-mathurin, @ben-manes, @hmble2, @dannyjiang001

@simonbasle simonbasle added the for/user-attention This issue needs user attention (feedback, rework, etc...) label Oct 8, 2020
@simonbasle simonbasle added this to the 3.5 planning milestone Oct 8, 2020

wer-mathurin commented Oct 9, 2020

My understanding from this article:

We are not able to use our beloved caching annotations when working with a custom cache manager, because serialization of the cached class is required and Mono/Flux are not serializable.

Is there any plan to support the caching annotations in the future when working with WebFlux?
If this is not the case, we just need to state it in the documentation!

Louis-Michel

@simonbasle
Member Author

My understanding from this article:

Thanks for chiming in @wer-mathurin! There is indeed a missing piece in the cache abstraction. One caveat I could gather from looking at these various projects and issues is that unless the backing cache provider supports async use cases (and especially async write-through), it is going to be difficult to come up with a reliable abstraction (like CacheMono): the async nature of the writes makes the caching even more subject to cache stampedes.

From what I can gather, there aren't a whole lot of caches that provide async APIs (basically, Caffeine). Off-memory distributed caches might face an even greater challenge in supporting this. My guess is that this is why no AsyncCache abstraction has been officially pursued in the Spring ecosystem yet.

@simonbasle simonbasle added the warn/deprecation This issue/PR introduces deprecations label Nov 10, 2021

simonbasle commented Nov 10, 2021

We've decided to phase out these utilities for the reasons mentioned above.
Both CacheFlux and CacheMono will be deprecated in the next release of 3.4.x (3.4.7) and removed in 3.6.0 at the earliest.


jayanthpatki91 commented Apr 4, 2023

Is there a plan/intent to introduce a caching feature?


chemicL commented Apr 4, 2023

At the moment we are not considering active development or maintenance of a new caching mechanism. However, here's what we can offer so as not to discourage your efforts: if you have a proof of concept, feel free to host it in your own repository and share it with us for feedback by opening a new issue. In that issue we can discuss advertising your caching mechanism from the Reactor resources. If your solution gains some traction and you are still interested, we can get back to the subject at that point and discuss whether you are willing, and whether we have the resources, to help integrate it here. What do you think @jayanthpatki91?


ben-manes commented Aug 7, 2023

Embarrassingly, I only now got around to experimenting with Reactor to understand the Reactive Streams programming model. While CacheMono/CacheFlux was a poor fit and shouldn't be used, the combination of Reactor and Caffeine is actually quite beautiful in two advanced scenarios.

In this example, a CacheLoader coalesces independent asynchronous loads to perform a single batch request. In addition to an AsyncLoadingCache, this is useful anywhere refreshAfterWrite is enabled: that setting optimistically reloads an entry if a stale request occurred prior to it expiring, so that the fetch latency is hidden from callers. Reactor makes it easy to batch this over a size / time window to reduce the impact on the source system.
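A rough sketch of that coalescing idea (not the exact example referenced above), assuming Caffeine's AsyncLoadingCache plus a hypothetical bulk fetchAll lookup:

```java
import com.github.benmanes.caffeine.cache.AsyncLoadingCache;
import com.github.benmanes.caffeine.cache.Caffeine;
import reactor.core.publisher.Mono;
import reactor.core.publisher.Sinks;

import java.time.Duration;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.CompletableFuture;
import java.util.stream.Collectors;

public class CoalescingLoader {

    record Request(String key, CompletableFuture<String> result) {}

    private final Sinks.Many<Request> requests =
            Sinks.many().unicast().onBackpressureBuffer();

    private final AsyncLoadingCache<String, String> cache = Caffeine.newBuilder()
            .expireAfterWrite(Duration.ofMinutes(5))
            .refreshAfterWrite(Duration.ofMinutes(1))
            .buildAsync((key, executor) -> {
                // Each individual (re)load only enqueues a request and returns a future.
                CompletableFuture<String> result = new CompletableFuture<>();
                requests.emitNext(new Request(key, result),
                        Sinks.EmitFailureHandler.busyLooping(Duration.ofSeconds(2)));
                return result;
            });

    public CoalescingLoader() {
        // Batch pending loads over a size / time window and issue one bulk fetch.
        requests.asFlux()
                .bufferTimeout(100, Duration.ofMillis(25))
                .flatMap(this::loadBatch)
                .subscribe();
    }

    private Mono<Void> loadBatch(List<Request> batch) {
        Set<String> keys = batch.stream().map(Request::key).collect(Collectors.toSet());
        return fetchAll(keys)
                .doOnNext(values -> batch.forEach(r ->
                        r.result().complete(values.get(r.key()))))
                .doOnError(e -> batch.forEach(r -> r.result().completeExceptionally(e)))
                .onErrorResume(e -> Mono.empty())
                .then();
    }

    // Hypothetical bulk lookup against the source system (e.g. one SQL/HTTP call).
    private Mono<Map<String, String>> fetchAll(Set<String> keys) {
        return Mono.just(keys.stream()
                .collect(Collectors.toMap(k -> k, k -> "value-for-" + k)));
    }

    public CompletableFuture<String> get(String key) {
        return cache.get(key);
    }
}
```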

Another scenario is a write-back cache, where changes are batched and flushed some time after the cache was updated. That might be useful for a replicated cache or similar setups. In this case key order is maintained by using a Map.compute that also publishes to the reactive library, for a subscriber to perform the write.
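A minimal sketch of that write-back shape (again an illustrative assumption, not the referenced example; flushBatch stands in for the real bulk write):

```java
import reactor.core.publisher.Sinks;

import java.time.Duration;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class WriteBackCache {

    record Change(String key, String value) {}

    private final Map<String, String> cache = new ConcurrentHashMap<>();
    private final Sinks.Many<Change> changes =
            Sinks.many().unicast().onBackpressureBuffer();

    public WriteBackCache() {
        // Flush the accumulated changes in batches, some time after the cache update.
        changes.asFlux()
                .bufferTimeout(500, Duration.ofSeconds(1))
                .subscribe(this::flushBatch);
    }

    public void put(String key, String value) {
        // compute() serializes updates per key, so the per-key publication order
        // matches the order in which the cache itself was updated.
        cache.compute(key, (k, old) -> {
            changes.emitNext(new Change(k, value),
                    Sinks.EmitFailureHandler.busyLooping(Duration.ofSeconds(2)));
            return value;
        });
    }

    // Placeholder for the real write to the backing / replicated store.
    private void flushBatch(List<Change> batch) {
        System.out.println("flushing " + batch.size() + " changes");
    }
}
```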

In both cases the reactive logic is very trivial and composes very neatly with the caching library. In the inverse direction, Spring Framework 6.1 adds support for AsyncLoadingCache so that Spring Cache methods can return types like CompletableFuture and Mono. So I think these libraries fit very cleanly together to solve complex problems in a seemingly trivial manner, even though exactly how and where to compose them wasn't immediately obvious.
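For the Spring side, a minimal sketch of what that might look like, assuming Spring Framework 6.1's async mode on CaffeineCacheManager (the cache name and BookService are illustrative):

```java
import org.springframework.cache.annotation.Cacheable;
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.cache.caffeine.CaffeineCacheManager;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.stereotype.Service;
import reactor.core.publisher.Mono;

@Configuration
@EnableCaching
class AsyncCacheConfig {

    @Bean
    CaffeineCacheManager cacheManager() {
        CaffeineCacheManager manager = new CaffeineCacheManager("books");
        // Backs each Spring Cache with Caffeine's AsyncCache (Spring Framework 6.1+).
        manager.setAsyncCacheMode(true);
        return manager;
    }
}

@Service
class BookService {

    // The value is stored via its CompletableFuture, so concurrent callers of
    // the same key share one in-flight lookup instead of stampeding the source.
    @Cacheable("books")
    public Mono<String> findBook(String isbn) {
        return Mono.fromSupplier(() -> "book-" + isbn); // placeholder lookup
    }
}
```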
