C#: Async support for better concurrency #93
Comments
Async functions are supported; see, for example, the new WebHook template I just added. I assume this would also work for Storage-related scenarios, right @fabiocav? @cgillum, maybe give it a quick try on your test function? If it works, we should change all the C# templates to be async.
Yes. Use of "async" is supported in all scenarios except when using ref/out parameters (which C# doesn't allow in async methods), a common pattern with WebJobs. In those cases, you can still return a Task, making sure you're not blocking in the function by using continuations where needed. Although the code isn't as simple or readable as with async/await, the performance benefits are the same.
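The continuation style described above can be sketched roughly like this (a minimal, hypothetical example; the method name, URL, and helper are illustrative, not from an actual template):

```csharp
// Sketch: returning a Task without the async keyword, chaining work with
// continuations instead of blocking. Because C# disallows ref/out parameters
// on async methods, functions with out bindings are forced into this style.
using System.Threading.Tasks;
using System.Net.Http;

public static class ContinuationExample
{
    private static readonly HttpClient client = new HttpClient();

    // No async keyword here, so an out parameter would still be permitted.
    public static Task Run(string input)
    {
        return client.GetStringAsync("https://example.com/" + input)
            .ContinueWith(t =>
            {
                // Runs only after the download completes; no thread sits blocked.
                string body = t.Result; // safe: the antecedent task has finished
                return ProcessAsync(body);
            })
            .Unwrap(); // flatten the resulting Task<Task> into a single Task
    }

    private static Task ProcessAsync(string body)
    {
        // Placeholder for further non-blocking work.
        return Task.CompletedTask;
    }
}
```

As the comment above notes, this keeps the function non-blocking but is noticeably harder to read than the equivalent async/await version.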
@fabiocav do you have a sample of a function using continuations that way? Looking at our templates, I don't see any that use it. I think it's important to have a model that easily leads users to write efficient code. If doing it is possible but really messy looking, then I think we may need to re-evaluate our signatures, e.g. the use of out parameters. I suspect the "but the SDK does it like this" argument might come up, but in this new world, it's really important to do the right thing for users even if it means a break from the old pattern.
I think the templates have been updated to remove input and output bindings, so they only show the trigger functionality; that's why none of them use out. I (really) hope there are plans for more complete templates covering more complex scenarios with input and output bindings. I'll put some examples together using continuations.
Thanks David and Fabio! I verified that changing the signature to return Task indeed does the right thing. Glad to know this is already supported. However, the concurrency was still being throttled by the batch size when using queue triggers. I'm not sure if this is intentional or desirable. In any case, that's probably a separate issue and we can close this one.
@fabiocav yes, we do need more complete templates. I opened Azure/azure-functions-templates#32 to track. |
The signature for C# functions is synchronous. If I want to do I/O intensive tasks, I have to use blocking I/O. For queue-triggered functions, this limits my function execution concurrency to the batch size (i.e. 32 concurrent operations by default). If these are long-running I/O operations, then my per-VM throughput is pretty limited.
Ideally C# functions can be written async in a way that allows me to run far more functions (hundreds?) concurrently on a single VM.
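The async shape being asked for looks roughly like the following (a hypothetical sketch; the parameter names, URL, and logging call are illustrative and not taken from an actual template):

```csharp
// Sketch: a queue-triggered function that returns Task and uses async/await.
// While the HTTP call is in flight, the thread is released back to the pool,
// so far more invocations can overlap on a single VM than with blocking I/O.
using System.Threading.Tasks;
using System.Net.Http;

public static class AsyncQueueExample
{
    private static readonly HttpClient client = new HttpClient();

    public static async Task Run(string queueMessage)
    {
        // Long-running I/O no longer pins a thread for its full duration.
        string result = await client.GetStringAsync(
            "https://example.com/api/" + queueMessage);

        // Continue processing the response here.
        await Task.Delay(0); // placeholder for further async work
    }
}
```

Contrast with the synchronous signature, where each in-flight call holds a thread and throughput is capped by the queue batch size.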