Strange errors #4
Comments
Hello,
A quick update: installed more memory, 16 GB -> 32 GB. It STILL FAILS.
Thanks,
Hello. Got this bug today:

Django version: 1.5

At some point this can be fixed by starting memcached with the -I 2M option, but I think that's a bad option. Memcached thinks so too: "WARNING: Setting item max size above 1MB is not recommended! Raising this limit increases the minimum memory requirements and will decrease your memory efficiency."

The strange part was that the cached data was only 300 KB in size, and I could cache it without Johnny Cache.

UPD: After some investigation I found that the problem is not in Johnny Cache directly. JC caches all database reads (every SELECT from the database), so if the fetched data is more than 1 MB in size you will see this error: error 10 from memcached_set: SERVER ERROR. The value that JC wants to store is bigger than memcached can accept per key: JC tries to store the entire queryset, and that data can exceed 1 MB. So, as you can see, technically JC is not wrong. But what should it do when the fetched queryset is big?
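One hedged workaround, sketched below, is to measure the serialized size of a value before handing it to memcached and skip caching anything over the 1 MB default item limit. This helper is not part of Johnny Cache; the function name and the idea of a pre-set size check are illustrative assumptions:

```python
import pickle

# memcached's default maximum item size is 1 MB (changeable with -I).
MEMCACHED_ITEM_LIMIT = 1024 * 1024

def fits_in_memcached(value, limit=MEMCACHED_ITEM_LIMIT):
    """Return True if the pickled value is small enough to store in memcached."""
    return len(pickle.dumps(value, protocol=pickle.HIGHEST_PROTOCOL)) <= limit

print(fits_in_memcached("hello"))                   # True: a few bytes
print(fits_in_memcached("x" * (2 * 1024 * 1024)))   # False: ~2 MB string
```

A guard like this would let a pass-through cache silently fall back to the database for oversized querysets instead of triggering `error 10 from memcached_set: SERVER ERROR`.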
I also have this problem. Unfortunately, python-memcached is even worse in that it silently hides the error and simply does nothing for values over the size limit.
Why is this issue closed? What is the recommended solution?
This is a memcached limitation, not necessarily a Johnny Cache issue. Johnny could shard the data, but that would require an additional memcache hit. Since Johnny is a pass-through cache, I imagine the savings probably aren't worth it: the primary benefits of memcache are quick lookups and memory-based storage, and large values fit more into the database realm.

The easiest quick fix is to use -I on memcached to specify the max item size; see http://code.google.com/p/memcached/wiki/ReleaseNotes142

I haven't done any tests with Redis, but using a Redis backend could be a potential solution as well.
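For reference, raising the item size limit with -I looks like this (the 2m value matches the workaround mentioned earlier in the thread; the -m memory size here is an arbitrary example):

```shell
# Raise the per-item limit to 2 MB (default is 1 MB); -m sets total cache memory in MB.
memcached -I 2m -m 64
```

Note memcached's own warning above: raising the item limit increases minimum memory requirements and reduces memory efficiency, so this is a stopgap rather than a fix.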
We had a similar problem when Johnny Cache tried to store large querysets (e.g. during site mapping), but we didn't want to permanently blacklist the associated tables. In case it helps anyone else, I wrote this decorator to temporarily disable caching on a table for a specific view call:

    from functools import wraps

    from johnny import cache

    def add_to_blacklist(blacklisted_tables):
        cache.blacklist.update(blacklisted_tables)

    def remove_from_blacklist(blacklisted_tables):
        for table in blacklisted_tables:
            cache.blacklist.remove(table)

    def johnny_be_good(blacklisted_tables, view_func=None):
        def blacklist_decorator(view_func):
            @wraps(view_func)
            def _wrapped_view_func(*args, **kwargs):
                add_to_blacklist(blacklisted_tables)
                try:
                    # Run the view with caching disabled for these tables.
                    return view_func(*args, **kwargs)
                finally:
                    # Re-enable caching even if the view raises.
                    remove_from_blacklist(blacklisted_tables)
            return _wrapped_view_func
        if view_func:
            return blacklist_decorator(view_func)
        return blacklist_decorator
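A minimal, self-contained sketch of the same temporary-blacklist pattern, with a plain module-level set standing in for johnny's cache.blacklist and a trivial function standing in for a Django view (all names here are illustrative):

```python
from functools import wraps

# Stand-in for johnny's cache.blacklist: a module-level set of table names.
blacklist = set()

def temporarily_blacklist(tables):
    """Decorator: add `tables` to the blacklist for the duration of each call."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            blacklist.update(tables)
            try:
                return func(*args, **kwargs)
            finally:
                # Remove the entries again even if the wrapped call raises.
                blacklist.difference_update(tables)
        return wrapper
    return decorator

@temporarily_blacklist({"reports_bigtable"})
def view(request):
    # While this runs, "reports_bigtable" is in the blacklist.
    return "reports_bigtable" in blacklist

print(view(None))   # True: the table was blacklisted during the call
print(blacklist)    # set(): the entry was removed again afterwards
```

The try/finally is the important design point: without it, a view that raises would leave its tables blacklisted for every later request in the same process.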
Hi,
I'm relatively new to Johnny Cache and memcached, so maybe it's a simple problem, but I have a report app that is giving me trouble. For 2011, for example, I have 14 teams on the landing page. Clicking on most of them works fine, but one always produces this error:
I'm wondering if it has something to do with the version of memcached on my server.
I'm running Ubuntu. Can you tell me which version of memcached is best to use?
Also, I'm using pylibmc==1.2.3
Thanks.