
Curio-Asyncio Bridge (Discussion) #190

Closed
dabeaz opened this issue Mar 1, 2017 · 12 comments

Comments

@dabeaz (Owner) commented Mar 1, 2017

PR #188 added some initial support for a curio-asyncio bridge. I've extended it with a different approach in which you create a Curio stand-in for the asyncio event loop and submit coroutines to it. Like this:

from curio import run
from curio.bridge import AsyncioLoop
import asyncio

loop = AsyncioLoop()

# An asyncio coroutine - not curio
async def aio_coro(x, y):
    await asyncio.sleep(2)
    return x + y

# A curio coroutine - not asyncio
async def main():
    result = await loop.run_until_complete(aio_coro(2, 2))
    print(result)

run(main())

Under the covers, asyncio runs its event loop in a different thread. The event loop gets shut down automatically when the Curio kernel shuts down.
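For anyone curious about the mechanics, the pattern can be sketched with just the stdlib. `LoopInThread` and its `submit()`/`shutdown()` names are illustrative, not Curio's actual bridge API, and a real Curio task would await the result rather than blocking on `.result()`:

```python
import asyncio
import threading

class LoopInThread:
    """Illustrative stand-in for the bridge: an asyncio event loop
    running in a daemon thread, accepting coroutines from outside."""
    def __init__(self):
        self.loop = asyncio.new_event_loop()
        self.thread = threading.Thread(target=self.loop.run_forever, daemon=True)
        self.thread.start()

    def submit(self, coro):
        # Thread-safe handoff; returns a concurrent.futures.Future
        return asyncio.run_coroutine_threadsafe(coro, self.loop)

    def shutdown(self):
        self.loop.call_soon_threadsafe(self.loop.stop)
        self.thread.join()

async def aio_coro(x, y):
    await asyncio.sleep(0.01)
    return x + y

bridge = LoopInThread()
result = bridge.submit(aio_coro(2, 2)).result()  # blocks this thread, not the loop
bridge.shutdown()
```

Many callers can `submit()` to the same loop at once, which is the property the bridge relies on.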

This issue is for further discussion of that. What should curio-asyncio bridging look like in an ideal world?

@jkbbwr commented Mar 1, 2017

So the problem we're having with curio at the moment is that we don't get the massive existing asyncio ecosystem. For example, we need an HTTP endpoint for something. Right now we have a very fragile setup that uses falcon plus some thread and async magic to bridge between them, but it breaks and is not good. In an ideal world we could take something like sanic:

import asyncio

import curio
from sanic import Sanic
from sanic.response import json

queue = curio.Queue()  # some sort of queue

app = Sanic()


@app.route("/")
async def asyncio_handler(request):
    await queue.put("hello")
    msg = await queue.get()
    assert msg == "hello"
    return json({"hello": "world"})


async def curio_thingy():
    msg = await queue.get()   # was curio.get()/curio.put() - those don't exist
    await queue.put(msg)


async def main():
    await curio.spawn(curio_thingy())
    await curio.spawn(app.run("127.0.0.1"))

    while True:
        await curio.sleep(1)
        print("Main loop!")


if __name__ == "__main__":
    curio.run(main())

And boom, we have an async HTTP gateway that doesn't hurt either side. But this is all quite hard right now.

I mean, it would be killer if curio had an HTTP framework of its own, but that's beside the point here; until one comes about we're stuck with compat...

@Fuyukai (Contributor) commented Mar 1, 2017

Personally, I feel like the version you've posted isn't much different from running await curio.abide(partial(loop.run_until_complete, aio_coro())). IMO the loop semantics, similar to the rest of Curio's design, should be invisible to the user.

@dabeaz (Owner) commented Mar 1, 2017

The modified version still has the background thread, and it allows thousands of Curio tasks to submit work simultaneously to the same event loop. So, in that sense, it's quite different from calling loop.run_until_complete() in a thread using abide(). I mainly chose that interface to mirror what one normally uses to run a coroutine in asyncio.

One benefit of packaging this up into a "Loop" object is that it accomplishes the same thing as the acb function, but without modifying the Curio kernel.

@dabeaz (Owner) commented Mar 1, 2017

On asyncio-curio communication, a UniversalQueue could probably be used for this, although it might require some tweaking. I will investigate.
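The core idea behind such a queue can be illustrated with stdlib pieces only. `MiniUniversalQueue` is a toy, not Curio's UniversalQueue: sync code uses a thread-safe `queue.Queue` directly, while async code defers the blocking `get` to an executor thread so the event loop stays responsive:

```python
import asyncio
import queue
import threading

class MiniUniversalQueue:
    """Toy queue usable from plain threads (sync) and asyncio tasks (async).
    Illustrative only; not Curio's actual UniversalQueue implementation."""
    def __init__(self):
        self._q = queue.Queue()

    def put_sync(self, item):
        self._q.put(item)

    async def put_async(self, item):
        self._q.put(item)  # never blocks for an unbounded queue

    async def get_async(self):
        # Run the blocking get in a worker thread; await its completion
        return await asyncio.get_running_loop().run_in_executor(None, self._q.get)

async def consumer(q):
    return await q.get_async()

q = MiniUniversalQueue()
# A plain thread delivers an item shortly after the consumer starts waiting
threading.Timer(0.05, q.put_sync, args=("hello",)).start()
msg = asyncio.run(consumer(q))
```

The executor hop is exactly the kind of overhead a tuned implementation would try to avoid, which is why some tweaking might be needed.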

@dabeaz (Owner) commented Mar 1, 2017

I've made some changes to the AsyncioLoop idea. I've kept the special loop object, but I've given it an explicit run_asyncio() method that makes it clear what is happening (a coroutine is being run in asyncio). Behind the scenes, there is only one asyncio event loop running. The method can be called by any number of Curio tasks concurrently. Here's an example involving 10000 Curio tasks:

import random
from curio.bridge import AsyncioLoop
from curio import run, spawn
import asyncio

async def aio_task(x, y):
    await asyncio.sleep(random.random())
    return x + y

async def child(loop, x, y):
    # Run a coroutine on the asyncio event loop
    result = await loop.run_asyncio(aio_task(x, y))
    print(f'Child: {x}, {y} -> {result}')

async def main():
    loop = AsyncioLoop()
    # Spin up a large number of Curio tasks
    for n in range(10000):
        await spawn(child(loop, n, n))

if __name__ == '__main__':
    run(main())

Coroutines submitted to the loop do not line up in a queue. They all run at once. Total execution time of this code is about 2 seconds on my machine.
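The "they all run at once" claim is easy to check without Curio at all, using a bare loop in a thread and run_coroutine_threadsafe (stdlib only; AsyncioLoop itself is not used here):

```python
import asyncio
import threading
import time

# Start a bare asyncio loop in a background thread, as the bridge does
loop = asyncio.new_event_loop()
thread = threading.Thread(target=loop.run_forever, daemon=True)
thread.start()

async def napper():
    await asyncio.sleep(0.2)

start = time.monotonic()
# Submit 100 sleeping coroutines; all run concurrently on the one loop
futures = [asyncio.run_coroutine_threadsafe(napper(), loop) for _ in range(100)]
for f in futures:
    f.result()
elapsed = time.monotonic() - start

loop.call_soon_threadsafe(loop.stop)
thread.join()
# The sleeps overlap, so elapsed is close to 0.2s, not 100 * 0.2s = 20s
```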

General thoughts: I think the API should be pretty explicit about what's happening (i.e., this coroutine is running on asyncio!). I'd like to keep asyncio-specific functionality out of the Curio core kernel. Encapsulating it into an AsyncioLoop object makes that possible. Again, looking for more thoughts.

@jkbbwr commented Mar 2, 2017

So here is the code, actually working flawlessly. The only problem is that performance takes a bit of a hit going over the UniversalQueue. If the handler doesn't have to interact with the queue it can do roughly 33k requests on my machine; if the handler needs to interact with the queue (as in the example), it does only 3k. Any ideas?

import asyncio

import curio
from curio.bridge import AsyncioLoop
from sanic import Sanic
from sanic.response import json
import uvloop

queue = curio.UniversalQueue()  # some sort of queue

app = Sanic()


@app.route("/")
async def asyncio_handler(request):
    await queue.put("hello")
    msg = await queue.get()
    return json({"hello": msg})


async def curio_thingy():
    while True:
        msg = await queue.get()
        await queue.put("world")

async def run_sanic(loop):
    server = app.create_server(host="0.0.0.0", port=8001)
    await loop.run_asyncio(server)

async def main():
    asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
    loop = AsyncioLoop()
    await curio.spawn(curio_thingy())
    await curio.spawn(run_sanic(loop))

    while True:
        await curio.sleep(1)


if __name__ == "__main__":
    curio.run(main())

@dabeaz (Owner) commented Mar 2, 2017

The UniversalQueue implementation is not the fastest thing around at the moment. However, there might be some things that can be done to make it much faster (I have some ideas). In a situation where curio_thingy() does much more work than echoing right away, I'd imagine the queue overhead would wash out a bit as well.

@jkbbwr commented Mar 2, 2017

But it does work, which is grand. I'm working on a curio web framework as a personal project, but until it's done this will let us use asyncio web frameworks. Thanks for your involvement.

@dabeaz (Owner) commented Mar 3, 2017

I just pushed a re-envisioned UniversalQueue implementation. It is much faster.

One note: in the code sample above, it's not safe to use a single queue for back-and-forth interaction like that, since either side can end up receiving its own message back. Two queues should be used, one for each direction.
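The direction-per-queue pattern can be sketched with plain stdlib queues (the queue type is incidental here; one queue per direction is the point):

```python
import queue
import threading

# Two plain thread-safe queues stand in for UniversalQueue here
requests = queue.Queue()   # handler -> worker
responses = queue.Queue()  # worker -> handler

def worker():
    msg = requests.get()        # only the worker reads requests
    responses.put(msg.upper())  # only the handler reads responses

t = threading.Thread(target=worker)
t.start()
requests.put("hello")
reply = responses.get()  # cannot accidentally receive "hello" back
t.join()
```

With a single shared queue, the handler's own "hello" could be the very item it pops back off, which is the race the two-queue layout removes.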

@jkbbwr commented Mar 3, 2017

Have you used any Python 3.6 syntax in this latest set of changes? They're super useful to our project, but we're currently stuck on 3.5. Would there be a release for this on pip?

@dabeaz (Owner) commented Mar 3, 2017

Not aware of any Python 3.6 syntax per se (in the core), although there are some Python 3.6 features being tested in the test suite.

@dabeaz (Owner) commented Mar 3, 2017

I will say the speedup is substantial: more than 4x faster on a back-and-forth example like the one you showed above, and about 8-20x faster on producer-consumer queuing when I tested it.
