
Improve query cache eviction #12905

Open
ajcvickers opened this issue Aug 6, 2018 · 18 comments

Comments

@ajcvickers
Member

No description provided.

@ajcvickers
Member Author

See #12902, #8223

ajcvickers added a commit that referenced this issue Aug 15, 2018
Issue #12905

I started this with the intention of leaving IMemoryCache in place and adding eviction around it. However, I could not see that IMemoryCache was providing any value other than overhead, so I removed it here.

We have #12157 open to consider changing the way we use IMemoryCache in 3.0, which means that even if we stop using it now, we may want to start using it again in the future.

Therefore, open questions:
* Is it okay to stop using IMemoryCache?
* Should we obsolete the UseMemoryCache methods? After this change they do nothing, but we may want them back in 3.0.
  * Instead of obsoleting, we could update the API doc comments to reflect the current situation
* At the moment, the query limit and the number of queries to evict are hard-coded. I think it should be possible to configure these in case this change kills perf for some app using a huge number of queries. But if the implementation changes in 3.0, any configuration we add here may not apply. So, ideas:
  * Make them protected properties in the internal service, so people can set them if needed, but only as a workaround to hitting issues.
  * Add proper config and obsolete if needed in 3.0.
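The configurable limits discussed above could look something like the following minimal sketch. `BoundedQueryCache`, `MaxEntries`, and `EvictionBatchSize` are illustrative names only, not EF Core's actual internals; the defaults mirror the kind of hard-coded values under discussion.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical size-bounded cache: when the entry count reaches MaxEntries,
// evict a whole batch of least-recently-used entries in one pass so the
// cost of eviction is amortized over many additions.
class BoundedQueryCache<TKey, TValue>
{
    public int MaxEntries { get; set; } = 10240;       // the "query limit"
    public int EvictionBatchSize { get; set; } = 1024; // "number of queries to evict"

    private readonly Dictionary<TKey, TValue> _entries = new Dictionary<TKey, TValue>();
    private readonly Dictionary<TKey, long> _lastUse = new Dictionary<TKey, long>();
    private long _clock;

    public TValue GetOrAdd(TKey key, Func<TKey, TValue> factory)
    {
        _lastUse[key] = ++_clock;
        if (_entries.TryGetValue(key, out var cached)) return cached;

        if (_entries.Count >= MaxEntries) Evict();
        return _entries[key] = factory(key);
    }

    private void Evict()
    {
        var victims = _lastUse.OrderBy(p => p.Value)
                              .Take(EvictionBatchSize)
                              .Select(p => p.Key)
                              .ToList();
        foreach (var key in victims)
        {
            _entries.Remove(key);
            _lastUse.Remove(key);
        }
    }

    public int Count => _entries.Count;
}

class Program
{
    static void Main()
    {
        var cache = new BoundedQueryCache<string, int> { MaxEntries = 4, EvictionBatchSize = 2 };
        for (var i = 0; i < 5; i++) cache.GetOrAdd("query" + i, _ => i);
        Console.WriteLine(cache.Count); // 3: adding the 5th entry evicted the 2 oldest
    }
}
```

Exposing `MaxEntries` and `EvictionBatchSize` as protected properties on the internal service would match the first option above: settable as a workaround, without committing to public configuration surface.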
@ajcvickers ajcvickers modified the milestones: 2.2.0-preview2, 2.2.0 Sep 11, 2018
@ajcvickers ajcvickers modified the milestones: 2.2.0, 3.0.0 Sep 28, 2018
@craigpopham

Just wanted to drop in and say this would be a high-value feature for us. We've been chasing persistent memory leaks in some of our applications running in Azure App Service.

We have some features which dynamically build queries, and this causes the cache to explode in size. It was made worse by the fact that we had some instance members (on objects which contained the DbContext) being captured, as in #8223, so the plan cache blew up in size really quickly.

We've been able to work around it, but improvements in this area would be welcome.

@ajcvickers
Member Author

Notes from discussion: we will use MemoryCache but fully internalized.

@AndriySvyryd
Member

AndriySvyryd commented Sep 3, 2019

We should consider using a simple heuristic to estimate the relative cache entry size (e.g. number of expressions in the tree).

See also: #27458
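A node-count heuristic like the one suggested above could be implemented with an `ExpressionVisitor`. This is a sketch, not EF Core's code; `NodeCountVisitor` and `EstimateSize` are hypothetical names.

```csharp
using System;
using System.Linq.Expressions;

// Counts every node in an expression tree -- a cheap proxy for the
// relative size of a cached query's plan.
class NodeCountVisitor : ExpressionVisitor
{
    public int Count { get; private set; }

    public override Expression Visit(Expression node)
    {
        if (node != null) Count++;
        return base.Visit(node);
    }
}

class Program
{
    static int EstimateSize(Expression expression)
    {
        var visitor = new NodeCountVisitor();
        visitor.Visit(expression);
        return visitor.Count;
    }

    static void Main()
    {
        Expression<Func<int, bool>> small = x => x > 0;
        Expression<Func<int, bool>> large = x => x > 0 && x < 100 && x % 2 == 0;
        Console.WriteLine(EstimateSize(small) < EstimateSize(large)); // True
    }
}
```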

AndriySvyryd added a commit that referenced this issue Sep 3, 2019
@AndriySvyryd AndriySvyryd changed the title Implement query cache eviction Improve query cache eviction Sep 10, 2019
@smitpatel smitpatel added this to the 3.1.0 milestone Sep 20, 2019
@smitpatel
Member

For the heuristic, consider the cost of computing the query and its usage count.
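A score combining those two signals might look like the following. `EvictionScore` and the formula are purely illustrative, not EF Core's actual policy.

```csharp
using System;

// Entries that were expensive to compile or are reused often score high
// and should be kept; cheap, rarely-used entries score low and are the
// first candidates for eviction.
static class EvictionScore
{
    // compilationCost: e.g. milliseconds spent compiling the query.
    // usageCount: how many times the cached query has been reused.
    public static double Score(double compilationCost, long usageCount)
        => compilationCost * Math.Log(usageCount + 1);
}

class Program
{
    static void Main()
    {
        var hotExpensive = EvictionScore.Score(compilationCost: 100, usageCount: 50);
        var coldCheap = EvictionScore.Score(compilationCost: 1, usageCount: 0);
        Console.WriteLine(hotExpensive > coldCheap); // True: evict the cold, cheap entry first
    }
}
```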

@NickCraver
Member

NickCraver commented Nov 24, 2019

Unfortunately, the changes in 67a200f related to this issue are affecting other libraries. If EF Core is set up first (common), the rather low size limit added to the memory cache breaks things downstream, a la MiniProfiler/dotnet#433 (breaking MiniProfiler's storage). If it's not set up first, this limit isn't even in place (given the TryAdd)...so it's not really a global solution to the issue anyway.

Has the impact of the startup race and the somewhat magical/hidden limits been considered here? I'm not sure the approach used is great for such a widely used service to be registered this way. Any thoughts on approaching this a different way?
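The first-registration-wins behavior described here follows from `TryAdd` semantics. A minimal sketch, assuming the standard Microsoft.Extensions.DependencyInjection and Microsoft.Extensions.Caching.Memory packages:

```csharp
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection.Extensions;

var services = new ServiceCollection();

// If EF's registration runs first, its size-limited cache wins...
services.TryAddSingleton<IMemoryCache>(
    _ => new MemoryCache(new MemoryCacheOptions { SizeLimit = 10240 }));

// ...and any later TryAdd for IMemoryCache is silently a no-op, so the
// app's unlimited cache below is never used. Reverse the registration
// order and EF's limit is never applied -- the startup race above.
services.TryAddSingleton<IMemoryCache>(
    _ => new MemoryCache(new MemoryCacheOptions()));
```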

@AndriySvyryd
Member

@NickCraver I thought this was fixed in cbd28ff#diff-1f53a7b867113cb4b9c61eccef38abd9L486, but that was a different change. This is definitely not the intended behavior.

@AndriySvyryd AndriySvyryd removed this from the Backlog milestone Nov 24, 2019
@ajcvickers
Member Author

@NickCraver In 3.0, by default, the memory cache used by EF is independent of other memory cache instances precisely because of this issue. Concretely, this means AddDbContext no longer uses the memory cache setup by ASP.NET Core. However, the UseMemoryCache API still exists to optionally allow use of the same cache instance. Are you calling this? Is it necessary (or desired, within the limits of what IMemoryCache allows) to use the same cache for EF and other things?

Ideally, IMemoryCache would allow much better control over these kinds of things. However, the last time we revisited this with the ASP.NET team there was really no interest in going there.
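The two configurations described above can be sketched as follows; `BlogContext`, `connectionString`, and `sharedCache` are placeholder names for illustration.

```csharp
// Default in 3.0: EF's query cache lives in its own internal service
// provider and does not touch the app's IMemoryCache.
services.AddDbContext<BlogContext>(options =>
    options.UseSqlServer(connectionString));

// Optional: share a cache instance with the rest of the app via
// UseMemoryCache.
services.AddDbContext<BlogContext>(options =>
    options.UseSqlServer(connectionString)
           .UseMemoryCache(sharedCache));
```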

@AndriySvyryd
Member

@ajcvickers I believe @NickCraver is referring to the services added by the AddEntityFramework* methods. There are still ASP.NET apps that use them instead of just calling AddDbContext.

@ajcvickers
Member Author

@NickCraver Are you calling AddEntityFramework or AddEntityFrameworkSqlServer? If so, for what reason?

@Cooksauce

@ajcvickers The startup config which causes this issue is in the MiniProfiler issue he referenced. Specifically, this comment: MiniProfiler/dotnet#433-547077949

Are the AddEntityFramework methods not recommended? Is there documentation somewhere about why you would (or wouldn't) want to use them?

@ajcvickers
Member Author

@Cooksauce Yes, that is absolutely not recommended.

@AndriySvyryd
Member

Filed #19053 to help identify this pattern.

@simeyla

simeyla commented Sep 24, 2023

I've been quite confused about EF Core's usage of IMemoryCache for literally years!

I had a call to AddEntityFrameworkSqlServer() in my startup because it seemed like something that should have been there. I've finally removed it today.

When it was there, it was calling the following (indirectly, as part of TryAddCoreServices):

TryAdd<IMemoryCache>(_ => new MemoryCache(new MemoryCacheOptions { SizeLimit = 10240 }));

This 'breaks' usage of IMemoryCache elsewhere if you don't want to use a size limit.

So what confuses me is: if the memory cache is now internal to EF Core (as mentioned by @ajcvickers in 2019), why is a SizeLimit still set via TryAdd above?

Is it just that, since AddEntityFrameworkSqlServer is obsolete, this config is being left untouched to avoid breaking changes?

@roji
Member

roji commented Sep 24, 2023

@simeyla first, the IMemoryCache in question here is in EF Core's internal service provider; this has nothing to do with any IMemoryCache that you register e.g. in your ASP.NET DI container. It only manages caching that's internal to EF, notably EF LINQ query caching. EF's internal service provider is an implementation detail and you should generally not be modifying or tweaking things there.

AddEntityFrameworkSqlServer() certainly isn't obsolete; it is used whenever you configure EF Core to use SQL Server via UseSqlServer(). You almost never need to call this method directly (though in some edge cases you do), so you can also consider it an internal implementation detail.

Finally, it's generally a bad idea to not have a SizeLimit on caches. For example, if you remove the size limit on EF's internal IMemoryCache - which caches LINQ queries - that means it will grow in an unbounded way as more and more query shapes are executed, without ever evicting anything. That could result in the application consuming too much memory and eventually crashing.

The same logic applies to caching in other layers - you should generally always have a size limit. Why specifically are you attempting to remove it?
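For reference, this is how a size limit interacts with entries on a `Microsoft.Extensions.Caching.Memory` cache (a sketch with placeholder keys and values):

```csharp
using Microsoft.Extensions.Caching.Memory;

var cache = new MemoryCache(new MemoryCacheOptions { SizeLimit = 100 });

// With SizeLimit set, each entry must declare its Size so the cache can
// enforce the limit:
cache.Set("query-key", "compiled-query", new MemoryCacheEntryOptions { Size = 1 });

// Omitting Size on a size-limited cache throws at runtime:
// cache.Set("other-key", "value");
// -> InvalidOperationException: "Cache entry must specify a value for Size..."
```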

@simeyla

simeyla commented Sep 24, 2023

@roji My caching needs are pretty simple and I'm using LazyCache for simplicity. It sits on top of IMemoryCache. Point taken about the dangers of 'size-less' caches, but this is fine for my needs 😈

If I use AddEntityFrameworkSqlServer() explicitly, I get the "Cache entry must specify a value for Size when SizeLimit is set." error.

the IMemoryCache in question here is in EF Core's internal service provider; this has nothing to do with any IMemoryCache that you register e.g. in your ASP.NET DI container.

I think this is the crux of my confusion, because it is the same one!

using Microsoft.Extensions.Caching.Memory;

Then EF does TryAdd() for the global Microsoft.Extensions.Caching.Memory.IMemoryCache with a size:

TryAdd<IMemoryCache>(_ => new MemoryCache(new MemoryCacheOptions { SizeLimit = 10240 }));

This explains why my StackOverflow question has over 8k views - because so many people have run into the issue that EF Core is still using the global IMemoryCache and has set a size on it (even if core team members think it isn't using it!).

I seem to be able to 'fix' LazyCache just by not calling AddEntityFrameworkSqlServer(), and then by some kind of deterministic fluke it works. However, it makes me wonder: if EF Core is now using a size-less IMemoryCache, couldn't that lead to unbounded memory growth?

Thanks for your reply. It seems to either be a longstanding bug, a misunderstanding or something else!

@roji
Member

roji commented Sep 24, 2023

You are simply not supposed to call AddEntityFrameworkSqlServer() on your ASP.NET container - that configures all of EF's services on that container; the right way to use EF in almost all cases is to let it configure its own internal service provider. In other words, just call UseSqlServer() as indicated in all the tutorials, and EF will set up an internal DI service provider (completely distinct from the ASP.NET one), and will configure its own internal IMemoryCache there.
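The two paths contrasted above, side by side (`BlogContext` and `connectionString` are placeholders):

```csharp
// Recommended: UseSqlServer lets EF build and manage its own internal
// service provider, including its own internal IMemoryCache.
services.AddDbContext<BlogContext>(options =>
    options.UseSqlServer(connectionString));

// Not recommended in application code: this registers all of EF's
// internal services, including the size-limited IMemoryCache, on the
// app's container, where other libraries will pick them up.
// services.AddEntityFrameworkSqlServer();
```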

8 participants