Custom cache keys or support for multiple schemas in a single database #120
Cacheops supports multiple databases with different but identically named tables. BTW, your solution with configurable cache key functions seems like overkill.
P.S. Oh, I forgot about PostgreSQL schemas. Django currently doesn't support them, so there is no definite way for cacheops to do it.
I know about it — looks like an easy way to support multiple schemas or any such use case to me. This single feature solves any possible use case where keys might clash.
A solution for multiple schemas would be most useful; using numbered Redis dbs isn't really a desirable workaround.
I also don't know why this issue would be closed. Using multiple Redis databases doesn't seem like a very good or easy solution to implement seamlessly.
This issue is closed because there is no official or de facto standard way of using PostgreSQL schemas in Django. So even asking to solve this for cacheops doesn't make sense until it's solved for Django. Also, is there anybody here who needs this? It looks to me like a theoretical issue for now.
Hi @Suor. Thanks for the quick reply. First I should say that I love django-cacheops. I've used it in a couple of projects and it has worked very well for us. I failed to give you context into my current issue, so here goes. You're right that Django core does not support schemas, but it can be made to work. My current project is using django-tenants, so that is really the source of the issue. To me, the solution @owais provided is good enough to allow me to continue using cacheops. Maybe using multiple schemas isn't popular enough for this to be a priority, which is fair, but for me it's a +1. Thanks!
The fact that you were able to use cachalot suggests that cacheops is probably overkill for you anyway. I'll reopen this and mark it as looking for more interest. We'll see if any more people need/want this.
Thanks for reopening @Suor. FWIW, I'm not using cachalot. I just liked their idea of making the key configurable.
Please review this.
I have a Django project that shares the database for many users, and a custom cache key would allow me to prevent invalidation for every user when just one user updates their data.
Any more interest here? I'd love to use cacheops, but I don't really want to have to monkey-patch or maintain my own fork.
My original use case was very much real and not theoretical at all. Lack of the ability to influence the generation of cache keys made this otherwise super library useless for me. I've since moved to a different job and don't need it now, but I know the project I left behind will still not be able to use cacheops if they wanted to until this feature is added.
Yeah. I have the use case of supporting users who share tables and don't want to invalidate the entire table cache for all users when one user updates their data.
@meizon your case should be handled automatically. If you share a table then you query it by conditions and cacheops does its thing.
@Suor Ah, I wasn't sure how cacheops invalidated caches, because when I save a "Resource", how does cacheops know to invalidate that particular resource for "Resource.objects.for_user(user)"?
@MeiZin don't know what
Yeah. That's exactly what it does. I'll take a look with some debugging. Thanks!
@grjones yes, that's what I assumed.
@grjones @meizon invalidation happens on the query level based on conditions. It is explained in detail here - http://hackflow.com/blog/2014/03/09/on-orm-cache-invalidation/
@Suor thanks for that post. Going to go and make sure I understand it all 😃
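To make the idea from that post concrete, here is a toy model of condition-based invalidation (my own sketch of the scheme, not cacheops' actual code): each cached queryset registers itself under the (table, field, value) pairs of its conditions, and saving a row only invalidates entries whose conditions match that row.

```python
# Toy sketch of condition-based invalidation (not cacheops internals).
cache = {}     # query key -> cached result
registry = {}  # (table, field, value) -> set of query keys to drop on match

def cache_query(key, table, conditions, result):
    """Cache a queryset result and register it under its conditions."""
    cache[key] = result
    for field, value in conditions.items():
        registry.setdefault((table, field, value), set()).add(key)

def invalidate(table, row):
    """On save, drop only the cached queries whose conditions match the row."""
    for field, value in row.items():
        for key in registry.pop((table, field, value), set()):
            cache.pop(key, None)

# Two users' queries over the same shared table:
cache_query("q1", "resource", {"user_id": 1}, ["r1"])
cache_query("q2", "resource", {"user_id": 2}, ["r2"])

# Saving a row belonging to user 1 invalidates q1 but leaves q2 cached.
invalidate("resource", {"id": 10, "user_id": 1})
```

This is why sharing a table does not force a full-table invalidation: only queries conditioned on the changed values are dropped.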
Schema-aware database caching would be awesome :)
Would be great to have schema-aware database caching! The project I'm working on will definitely need this feature!
This has been implemented in 29ad33f, no docs yet. Used like:

```python
import threading

CACHEOPS_PREFIX = lambda query: ...
# or
CACHEOPS_PREFIX = 'some.module.prefix_func'

def prefix_func(query):
    return _request_context.request.get_host()

# Simple middleware to store request
_request_context = threading.local()

class StoreRequestMiddleware:
    def process_request(self, request):
        _request_context.request = request

    def process_response(self, request, response):
        del _request_context.request
        return response

    def process_exception(self, request, exception):
        del _request_context.request
```

If anyone is interested to try this I will be glad to hear about your experience. Also, take a look at the CHANGELOG, it has some backwards incompatible changes.
This thread was really helpful! I have implemented the prefix function as:

```python
from django.db import connection

def cacheops_prefix(_):
    return connection.tenant.schema_name
```

This way each query will only be cached for its own tenant. I moved the function to a separate file to prevent import errors.
PostgreSQL supports multiple schemas, and different schemas can contain tables with the same names. Two rows from same-named tables in two different schemas can collide in the cache, returning a cached object from the WRONG schema.
It can be solved by making it possible to generate the cache key at runtime. We discussed this over at django-cachalot and came up with a solution. Maybe something similar could be done here as well? noripyt/django-cachalot#1
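To make the collision concrete: without a schema-aware prefix, the same SQL text issued against two schemas hashes to the same cache key, while prefixing with the schema name keeps them apart. A minimal sketch with a made-up `make_key` helper (not cacheops' real key function):

```python
import hashlib

def make_key(sql, prefix=""):
    # Hypothetical helper: cache key = runtime prefix + hash of the SQL text.
    # The SQL alone carries no schema information, hence the collision.
    return prefix + hashlib.md5(sql.encode()).hexdigest()

# The same query text runs in both tenant schemas:
sql = "SELECT * FROM users WHERE id = 1"

# Without a prefix, both tenants map to one key and can read
# each other's cached rows:
colliding = make_key(sql)

# With a schema-name prefix, each tenant gets its own key space:
key_a = make_key(sql, prefix="tenant_a:")
key_b = make_key(sql, prefix="tenant_b:")
```

Generating the prefix at runtime (e.g. from the current connection's schema) is exactly what the configurable-key proposal enables.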