
Strange errors #4

Closed
typeshige opened this issue Aug 10, 2012 · 7 comments


@typeshige

Hi,

I'm relatively new to Johnny Cache and memcached, so maybe it's a simple problem, but I have a report app that is giving me trouble. For 2011, for example, I have 14 teams on the landing page. Clicking on most of them works fine, but one always produces this error:

MemcachedError at /nai/reports/annual-reports/2011/gsfc/
error 37 from memcached_set: SYSTEM ERROR(Resource temporarily unavailable), host: 127.0.0.1:11211 -> libmemcached/io.cc:358

I'm wondering if it's something to do with my version of memcached on my server.

I'm running Ubuntu. Can you tell me the best version of memcached to use?

Also, I'm using pylibmc==1.2.3

Thanks.

@jself
Collaborator

jself commented Aug 10, 2012

Hello,
It sounds like your memcached is either going down or you're losing connectivity, possibly in pylibmc. My suggestions: try a different version of memcached, make sure your memcached maximum size is no more than 85-90% of your physical memory, and try a different memcache library. This is probably a question better answered by the memcached or pylibmc folks, though. I run memcached 1.4, for what it's worth.

@jself jself closed this as completed Aug 10, 2012
@typeshige
Author

A quick update:

- Installed more memory: 16 GB -> 32 GB (STILL FAILS)
- Upgraded to the latest memcached (1.4.15) (STILL FAILS)
- Installed the latest libmemcached (1.0.10) and reinstalled pylibmc (STILL FAILS)
- Installed python-memcached 1.48 and updated the johnny cache backend settings (SUCCESS!!)
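For anyone following along, the backend switch above might look roughly like this in Django settings. This is a sketch, not Shige's actual config; the backend path and key prefix follow the johnny-cache docs of that era, so verify them against your installed version:

```python
# Sketch: Django CACHES settings pointing johnny-cache at the
# python-memcached wrapper instead of the pylibmc one. Paths are from
# johnny-cache's memcached backends module; double-check for your version.

CACHES = {
    'default': {
        'BACKEND': 'johnny.backends.memcached.MemcachedCache',
        'LOCATION': '127.0.0.1:11211',
        'JOHNNY_CACHE': True,
    }
}

# Optional: namespace johnny's keys per project.
JOHNNY_MIDDLEWARE_KEY_PREFIX = 'jc_myproject'
```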

Thanks,
Shige

@vovkd

vovkd commented Oct 6, 2013

Hello. Got this bug today:

Django Version: 1.5
Exception Type: ServerError
Exception Value: error 10 from memcached_set: SERVER ERROR

This can be worked around by starting memcached with the -I 2m option, but I think that's a bad option. Memcached thinks so too: "WARNING: Setting item max size above 1MB is not recommended! Raising this limit increases the minimum memory requirements and will decrease your memory efficiency."

The strange part was that the cached data was only 300 kB in size, and I could cache it without Johnny Cache.

UPD: After some investigation I found that the problem is not in Johnny Cache directly. JC caches all database reads (SELECTs), so if the fetched data is more than 1 MB in size you will see this error: error 10 from memcached_set: SERVER ERROR. The value JC wants to store is bigger than what memcached will accept per key.

So JC was trying to store the whole queryset, and that data can be more than 1 MB in size. Technically, JC is not wrong. But what should it do when the fetched queryset is that big?

@chrisspen

I also have this problem. Unfortunately, python-memcached is even worse in that it silently hides the error and simply does nothing for set(). Took me a while to realize why my caching was having no effect until I switched to pylibmc. Unfortunately, the error message is still next to useless.
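One way to surface those silent failures is to check the return value of set() yourself; python-memcached returns a falsy value when a set fails (worth verifying for your version), while pylibmc raises. The helper and stand-in client below are illustrative names, not part of either library:

```python
def checked_set(client, key, value):
    """Set a key and raise if the client reports failure."""
    ok = client.set(key, value)
    if not ok:
        raise RuntimeError(f"memcached rejected set for key {key!r}")
    return ok

# Stand-in client for demonstration: rejects values over 10 bytes,
# mimicking python-memcached's silent falsy return on failure.
class FakeClient:
    def __init__(self):
        self.store = {}

    def set(self, key, value):
        if len(value) > 10:
            return 0  # silent failure
        self.store[key] = value
        return True

client = FakeClient()
checked_set(client, "small", b"ok")
try:
    checked_set(client, "big", b"x" * 100)
except RuntimeError as e:
    print("caught:", e)
```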

@mihow

mihow commented Mar 28, 2014

Why is this issue closed? What is the recommended solution?

@jself
Collaborator

jself commented Mar 28, 2014

This is a memcached limitation, not necessarily a johnny cache issue. Johnny could shard the data, but that would require an additional memcache hit. Since Johnny is a pass-through cache, I imagine the savings probably aren't worth it: the primary benefits of memcached are quick lookups and memory-based storage. Large files fit more into the database realm. The easiest quick fix is to use -I on memcached to specify the max item size - see http://code.google.com/p/memcached/wiki/ReleaseNotes142

I haven't done tests with redis at all, but using a redis backend could be a potential solution as well.
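A hypothetical middle ground between raising the limit with -I and switching to a redis backend would be to refuse to cache oversized values and let those queries fall through to the database. This is a sketch only; safe_cache_set and MAX_ITEM_SIZE are invented names, not johnny-cache API:

```python
import pickle

MAX_ITEM_SIZE = 1024 * 1024  # memcached's default per-item limit

def safe_cache_set(cache, key, value):
    """Cache value only if its pickled form fits under the item limit."""
    payload = pickle.dumps(value)
    if len(payload) > MAX_ITEM_SIZE:
        return False  # too big for memcached; skip caching this queryset
    cache[key] = payload
    return True

cache = {}
print(safe_cache_set(cache, "small", list(range(10))))            # True
print(safe_cache_set(cache, "huge", b"x" * (2 * MAX_ITEM_SIZE)))  # False
```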

@goconnectome

We had a similar problem when johnny cache tried to store large querysets (e.g. during site mapping), but we didn't want to permanently blacklist the associated tables. In case it helps anyone else, I wrote this decorator to temporarily disable caching on a table for a specific view call:

from functools import wraps

from johnny import cache

def add_to_blacklist(blacklisted_tables):
    cache.blacklist.update(blacklisted_tables)

def remove_from_blacklist(blacklisted_tables):
    for table in blacklisted_tables:
        cache.blacklist.remove(table)

def johnny_be_good(blacklisted_tables, view_func=None):
    def blacklist_decorator(view_func):
        @wraps(view_func)
        def _wrapped_view_func(*args, **kwargs):
            add_to_blacklist(blacklisted_tables)
            try:
                return view_func(*args, **kwargs)
            finally:
                # Re-enable caching even if the view raises.
                remove_from_blacklist(blacklisted_tables)
        return _wrapped_view_func
    if view_func:
        return blacklist_decorator(view_func)
    return blacklist_decorator
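A self-contained sketch of the same pattern, with johnny's blacklist replaced by a plain set so it runs without Django, shows why it is worth releasing the tables in a try/finally: the blacklist is restored even if the view raises. Table and view names here are made up:

```python
blacklist = set()  # stand-in for johnny.cache's runtime blacklist

def johnny_be_good(tables):
    def decorator(view_func):
        def wrapper(*args, **kwargs):
            blacklist.update(tables)
            try:
                return view_func(*args, **kwargs)
            finally:
                # Restore caching even on exceptions.
                blacklist.difference_update(tables)
        return wrapper
    return decorator

@johnny_be_good({"reports_report"})
def sitemap_view(request):
    assert "reports_report" in blacklist  # caching disabled inside the view
    return "ok"

print(sitemap_view(None))              # ok
print("reports_report" in blacklist)   # False - restored afterwards
```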
