
Is there a way to get a list / iterate over all items in the cache #16

Closed
theadin opened this issue Oct 3, 2022 · 6 comments · Fixed by #17
Labels: enhancement (New feature or request)
theadin commented Oct 3, 2022

Hi,

Is there a way to get a list /iterate over all items in the cache?

Cheers

neon-sunset (Owner) commented Oct 4, 2022

Hi, and thank you for your interest in the library.

Listing all items or providing an enumerator was intentionally left out, for the following reasons:

  • To hide implementation details
  • To avoid having too many ways to do the same thing; even the default dictionaries have confusing method calls that kill performance, such as .AddOrUpdate(key, addFactory, updFactory), which people often misuse
  • To keep the library's scope small, as it tries to do a few things well

For bulk operations, you can instead use CachedRange<MyType>.Save and CachedRange<MyType>.Remove; for large lists and arrays they will be multithreaded.
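For illustration, a bulk save and remove might look like the sketch below. The key/value-pair shape, the namespace, and the exact overloads are assumptions here; check the package's own signatures before relying on them:

```csharp
using System;
using FastCache.Collections; // assumed namespace for CachedRange

// Sketch: save several entries at once with a shared expiration.
var pairs = new (string Key, int Value)[]
{
    ("a", 1),
    ("b", 2),
    ("c", 3),
};
CachedRange<int>.Save(pairs, TimeSpan.FromMinutes(5));

// Sketch: remove the same keys in bulk.
CachedRange<int>.Remove(new[] { "a", "b", "c" });
```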

If that's not what you need, do let me know which use case you have in mind and I will look into implementing it, if it's viable to provide a simple API for it.

theadin (Author) commented Oct 4, 2022

Thanks for replying. Your library is actually very useful for one of my projects.

I'm currently adding items to it using:

string key = $"{a}_{b}_{c}"; // for example
if (Cached<int>.TryGet(key, out var cached)) {
	int cnt = cached.Value;
	// ...some logic...
	cached.Update(cached.Value + 1);
} else {
	cached.Save(1, TimeSpan.FromMinutes(1));
}

Basically, I'm just counting how many calls are made to a certain function with a certain key within a minute of the first one.

Anyway, since this is a Web API endpoint that can receive many different keys which I can't know ahead of time, I wanted a way to know, at any given point, at least how many entries are currently in the cache. Iterating over them all perhaps isn't necessary, although it would be nice.

Thanks for your work
Cheers

@neon-sunset neon-sunset reopened this Oct 4, 2022
@neon-sunset neon-sunset self-assigned this Oct 4, 2022
@neon-sunset neon-sunset added the good first issue Good for newcomers label Oct 4, 2022
@neon-sunset neon-sunset added this to the 1.6.0 milestone Oct 4, 2022
@neon-sunset neon-sunset added enhancement New feature or request and removed good first issue Good for newcomers labels Oct 4, 2022
neon-sunset (Owner) commented Oct 4, 2022

@theadin Addressed in https://github.com/neon-sunset/fast-cache/releases/tag/1.6.0 (also fixed publishing the XML documentation, which should make using the library easier).

Usage of the new API is as follows:

using FastCache.Services;

// Returns only non-expired entries
foreach (var cached in CacheManager.EnumerateEntries<string, int>())
{
   // Do stuff
}

var totalCount = CacheManager.TotalCount<string, int>(); // includes expired entries

Small comment regarding .TotalCount:

Exposing a .Count that returns the count of only non-expired entries would require walking through all cache entries and counting them explicitly, which is anti-performance. I prefer not to provide methods that are a performance trap, because that would make the cache not fast, which would be terrible :D
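If an occasional non-expired count is acceptable despite the O(n) walk described above, it can be derived from the enumerator. A minimal sketch, assuming EnumerateEntries<K, V> yields one item per live entry:

```csharp
using System;
using System.Linq;
using FastCache.Services;

// O(n) walk over live entries -- fine for occasional diagnostics, not for hot paths.
var liveCount = CacheManager.EnumerateEntries<string, int>().Count();
Console.WriteLine(liveCount);
```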

As for your particular use case, I would like to note that such counting logic is non-atomic: it reads the last cached request count from the cache, increments it by one and then saves it back. If two concurrent requests execute at the same time, there is a small chance that one of the increments will be lost, because both requests read, increment and write back the value at precisely the same moment.

Therefore, you may want to use Interlocked.Increment. For example:

using System.Threading;

internal class Counter
{
   private int _value = 1;

   public int Value => _value;

   public void Increment() => Interlocked.Increment(ref _value);
}

if (Cached<Counter>.TryGet(a, b, c, out var cached)) {
	cached.Value.Increment();
	// some logic...
} else {
	cached.Save(new Counter(), TimeSpan.FromMinutes(1));
}

This way we ensure atomicity and precision of the counter, at the cost of allocating a separate "holder" object for it.

And one last note: the library exposes methods for "multi-argument keys", which means you don't have to concat a string like $"{a}_{b}_{c}". Instead, you can simply call Cached<Counter>.TryGet(a, b, c, out var cached). Keep in mind that in that case, in order to enumerate cache entries, you will need to specify the corresponding multi-argument key as (K1, K2, K3).

For example, if variables a, b and c are int, bool and string respectively, the signature for enumerating all cache entries will be CacheManager.EnumerateEntries<(int, bool, string), Counter>().
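Putting the two together, a hypothetical sketch of multi-argument keys plus enumeration (the concrete key values and the member names on the entry type are illustrative assumptions; consult the XML docs for the real surface):

```csharp
using System;
using FastCache;
using FastCache.Services;

// Multi-argument key lookup: the key is effectively the tuple (42, true, "endpoint").
if (!Cached<Counter>.TryGet(42, true, "endpoint", out var cached))
{
    cached.Save(new Counter(), TimeSpan.FromMinutes(1));
}

// Enumerating entries stored under (int, bool, string) keys.
foreach (var entry in CacheManager.EnumerateEntries<(int, bool, string), Counter>())
{
    // The entry exposes its key and value; the exact member names
    // depend on the library's entry type.
}
```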

theadin (Author) commented Oct 4, 2022

Thanks. I'm sure anyone who uses this library will appreciate the added functionality :)

You are right. I should be using Interlocked.Increment.

And thanks for pointing out "multi-argument keys". I did see this option prior to using the library, but since I was updating existing code I didn't use it; it is a good idea, though. If a, b or c are null sometimes, will it still work? I suppose I can test it myself :)

It is good that you put speed at the top of your priorities 👍

neon-sunset (Owner) commented

The key itself must not be null (hence the notnull constraint). However, its contents can be null, because keys composed of multiple arguments are "structurally evaluated". This means that in (string?, int, int), the string? argument can be null and will be evaluated accordingly.
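The structural evaluation mentioned above is ordinary .NET ValueTuple equality, so it can be seen without the cache at all. A self-contained sketch:

```csharp
using System;

(string?, int, int) k1 = (null, 1, 2);
(string?, int, int) k2 = (null, 1, 2);

// ValueTuple equality is structural: components are compared one by one,
// and a null component compares equal to null.
Console.WriteLine(k1.Equals(k2)); // True
Console.WriteLine(k1 == k2);      // True
```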

Also thanks, I'm happy this library was useful to you :)

theadin (Author) commented Oct 4, 2022

Excellent work
