Multi-processing aiocache #426

Closed · John-Gee opened this issue Nov 29, 2018 · 9 comments

John-Gee commented Nov 29, 2018

Hello,

This may well not be a bug but a misconfiguration on my end; I'd appreciate help if that's the case.

I'm using aiocache and aiohttp with Redis, all on the same host.
I have decorated a wrapper around aiohttp's session.get like this:

@cached(ttl=604800, cache=RedisCache, serializer=PickleSerializer(),
        port=6379, timeout=0)
async def get_page(url):
    async with session.get(url) as resp:
        dostuff()

My problem is that I call this get_page function from different processes in a process pool, each with its own event loop, and either aiocache or Redis seems not to like that, as I get:

2018-11-28 20:03:44,266 aiocache.decorators ERROR Couldn't retrieve get_page('https://www.site.com/')[], unexpected error
Traceback (most recent call last):
  File "/usr/lib/python3.7/site-packages/aiocache/decorators.py", line 124, in get_from_cache
    value = await self.cache.get(key)
  File "/usr/lib/python3.7/site-packages/aiocache/base.py", line 61, in _enabled
    return await func(*args, **kwargs)
  File "/usr/lib/python3.7/site-packages/aiocache/base.py", line 44, in _timeout
    return await func(self, *args, **kwargs)
  File "/usr/lib/python3.7/site-packages/aiocache/base.py", line 75, in _plugins
    ret = await func(self, *args, **kwargs)
  File "/usr/lib/python3.7/site-packages/aiocache/base.py", line 192, in get
    value = loads(await self._get(ns_key, encoding=self.serializer.encoding, _conn=_conn))
  File "/usr/lib/python3.7/site-packages/aiocache/backends/redis.py", line 24, in wrapper
    return await func(self, *args, _conn=_conn, **kwargs)
  File "/usr/lib/python3.7/site-packages/aiocache/backends/redis.py", line 100, in _get
    return await _conn.get(key, encoding=encoding)
RuntimeError: Task <Task pending coro=<func() running at file.py:88>> got Future attached to a different loop.

Here's how I set up each new loop in the subprocesses:

    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    session = aiohttp.ClientSession()
    tasks = []
    tasks.append(asyncio.ensure_future(dostuff2_that_calls_get_page(),
                                       loop=loop))
    loop.run_until_complete(asyncio.gather(*tasks, loop=loop))
    loop.run_until_complete(session.close())

Thank you!

John-Gee commented Nov 29, 2018

Interestingly, if I don't use noself=True, Redis cannot find the keys in its cache, since the keys incorporate the objects' addresses, which change after every run; but then I no longer get the error.
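
For reference, a minimal sketch (not from this thread) of what noself does, assuming a method-style wrapper; the Fetcher class and its body are illustrative:

from aiocache import cached, RedisCache
from aiocache.serializers import PickleSerializer

class Fetcher:
    # noself=True leaves the first argument (self) out of the generated key,
    # so the key no longer embeds an object repr whose memory address
    # changes between runs.
    @cached(ttl=604800, cache=RedisCache, serializer=PickleSerializer(),
            port=6379, noself=True)
    async def get_page(self, url):
        ...  # fetch and return the page for `url`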

argaen commented Nov 29, 2018

Hey @John-Gee, can you post a small working snippet that reproduces the issue so I can have a look?

John-Gee commented Nov 29, 2018

Hello,

Here it is: https://gist.github.com/John-Gee/f93cb05acec1624c9db6df6bbf33effd

I hope it's not too big; I tried to shorten it as much as I could without losing clarity in what I'm trying to achieve.

A simple

more mylog.log | grep -i "different loop"

is useful to check whether the error message is in the log; it does not show up on the terminal by default.

I probably should have mentioned this before. Versions:
Python 3.7.1
aiohttp 3.4.4
aiocache 0.10.1
aioredis 1.1
ujson 1.35
Redis 5.0.2

In case it matters, everything is on 64-bit Linux.

Thank you Manuel!

crisidev commented Nov 29, 2018

You need to instantiate an aiocache object per process. You cannot share a loop across a multiprocess pool. If you move the aiocache creation inside the process code, it will work fine.

argaen commented Nov 29, 2018

Yep, that's true. When you decorate a function with cached, the cache instance is created at import time, using the loop of that specific process. When the code is then executed in the process pool with a different loop, it crashes because aiocache is not using the same loop.

As @crisidev mentions, you have to move the cache creation inside the process code.
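
A minimal sketch of this fix, assuming the versions listed above; the worker function and the fetch body are illustrative, not from the gist:

import asyncio

import aiohttp
from aiocache import cached, RedisCache
from aiocache.serializers import PickleSerializer

def worker(urls):
    # Runs in each pool process: the loop, the session, and the decorated
    # function (and with it the RedisCache instance) are all created here,
    # so they are bound to this process's event loop.
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    session = aiohttp.ClientSession()

    @cached(ttl=604800, cache=RedisCache, serializer=PickleSerializer(),
            port=6379, timeout=0)
    async def get_page(url):
        async with session.get(url) as resp:
            return await resp.text()

    try:
        return loop.run_until_complete(
            asyncio.gather(*(get_page(url) for url in urls)))
    finally:
        loop.run_until_complete(session.close())
        loop.close()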

John-Gee commented Nov 30, 2018

Alright, I was wondering if that was the case, but I couldn't figure out how to do it.
(Oh, and since I'm asking about simple stuff: I couldn't find how to use a unix socket for Redis with aiocache either. How do I do that?)

Thank you!

edit: Yup, I tried something quickly and it does work. I'm still interested in the question above, though. :)

Thanks guys!

John-Gee commented Nov 30, 2018

In my wrapper I'm reusing your code like this:
https://gist.github.com/John-Gee/e55078b9eea523d33821c83bbc07e5f4

(I removed the self part, as it's not useful to me, at least for now.)

Is this OK with you? It's used in a project under the GNU GPL license, hosted here on GitHub.
I can add the license locally if needed of course.

Thank you!

argaen commented Nov 30, 2018

> Oh, and since I'm asking about simple stuff: I couldn't find how to use a unix socket for Redis with aiocache either. How do I do that?

aiocache just passes the endpoint to aioredis's create_pool call. Try setting the endpoint to the unix socket address; maybe it works. I'm not sure how aioredis handles that. https://github.com/argaen/aiocache/blob/master/aiocache/backends/redis.py#L217
If you find it's actually not supported, please open an issue for that :) (also happy to receive PRs)
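
An untested sketch of the fallback: aioredis 1.x itself accepts a unix socket address, so this shows what a plain connection over a socket looks like (the socket path is illustrative):

import asyncio
import aioredis

async def main():
    # aioredis accepts 'unix:///path/to.sock' (or a bare socket path)
    # as the address argument.
    redis = await aioredis.create_redis('unix:///var/run/redis/redis.sock')
    await redis.set('key', 'value')
    print(await redis.get('key', encoding='utf-8'))
    redis.close()
    await redis.wait_closed()

asyncio.get_event_loop().run_until_complete(main())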

> Is this OK with you? It's used in a project under the GNU GPL license, hosted here on GitHub.
> I can add the license locally if needed of course.

Yeah no worries :)

argaen commented Dec 5, 2018

Closing, as the original issue was fixed.
