
Beginner to nim/mummy - concurrency question #89

Closed
ahungry opened this issue Aug 13, 2023 · 5 comments

ahungry commented Aug 13, 2023

If I want to build an endpoint that aggregates data from 3 different data sources (let's say, HTTP APIs) that do not depend on each other - would the concurrency model used with mummy allow me to make these requests at the same time, or would I need to perform them sequentially?

In a language like nodejs or python, I would use the equivalent of Promise.all() or asyncio.gather(), but both of these are async/await type calls in their respective languages.
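For context, here is a minimal sketch of the `asyncio.gather` pattern being referred to. The `fetch` proc and the source names are invented placeholders, and `asyncio.sleep` stands in for a real HTTP call; the point is only that the three awaitables run concurrently rather than back to back:

```python
import asyncio

async def fetch(source: str) -> str:
    # Placeholder for a real HTTP call (e.g. via aiohttp).
    await asyncio.sleep(0.1)
    return f"data from {source}"

async def aggregate() -> list:
    # All three simulated requests run concurrently, so the total
    # wall time is ~0.1s rather than ~0.3s. gather preserves order.
    return await asyncio.gather(
        fetch("users"), fetch("orders"), fetch("inventory")
    )

results = asyncio.run(aggregate())
print(results)
```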

guzba (Owner) commented Aug 14, 2023

Right now I suggest just doing them sequentially. This is not because it's impossible to make them run concurrently, but simply because I have not built any libs / tools to enable that yet.

The most probable solution I see right now is to add a proc to https://github.com/guzba/curly that would make use of libcurl multi requests. Then many requests could be done at once without any additional thread usage.

There are still some implementation details to sort out: if I use libcurl handles from the pool, I'd need to take the number needed (or some max) and not be able to return them until all requests finished, which could cause issues if any request in the multi request takes a long time to complete. I'll need to think about how I want this to work at some point.

If these are fast requests and there are only a couple, just doing them sequentially and moving on is by far the best idea.
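Since Mummy handlers already run on their own threads, the sequential-vs-concurrent trade-off described above can be illustrated with a thread-pool analogue. This is a Python sketch, not Mummy or Curly code; `fetch` is an invented placeholder for a blocking HTTP client call:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch(source: str) -> str:
    # Placeholder for a blocking HTTP call.
    return f"data from {source}"

sources = ["users", "orders", "inventory"]

# Sequential: simple, and fine when there are only a couple of fast requests.
sequential = [fetch(s) for s in sources]

# Concurrent: worker threads issue the blocking calls in parallel,
# at the cost of extra threads per in-flight aggregation.
with ThreadPoolExecutor(max_workers=len(sources)) as pool:
    concurrent = list(pool.map(fetch, sources))

# Both strategies produce the same results in the same order.
assert sequential == concurrent
```

The libcurl multi approach mentioned above would get the same concurrency without the extra threads, which is why it is the more attractive long-term design.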

ahungry (Author) commented Aug 14, 2023

Thanks @guzba (for the response here, and for the many Nim packages you've provided to the world)!

The example was a bit contrived - it could be 3, 10, 20, or even 100 aggregated API calls - the difference between sequential and concurrent execution is vast.

It sounds like there is no real way to accomplish this at the moment, at least while following mummy idioms (it would seem odd to try to pull async machinery into an explicitly non-async web server) - but I look forward to the solution you proposed!

guzba (Owner) commented Aug 15, 2023

If you are doing that many separate HTTP RPC calls in an API endpoint handler, I'd suggest that's a bit unusual, and unfortunately not a use case that aligns with what Mummy is really focused on.

I know how to improve things for making more HTTP requests; however, nothing will prevent such an endpoint from being trivially vulnerable to abuse, which would be a concern for me.

Closing this since there's no action to take in Mummy specifically; I'm creating an issue on Curly.

ahungry (Author) commented Aug 16, 2023

The actual use case (which I was hoping to POC at my work, as an alternative to Python + FastAPI) would be a heavy microservice architecture, where a middleware/backend-for-frontend serves data tailored to a page or endpoint, composed of data from many different microservices.

Rate throttling and abuse potential are really concerns unrelated to a web server/framework, I think - those things tend to be locked down via other mechanisms (authentication/authorization/WAF-level rules).

Thanks for making a follow-up on your HTTP client lib!

guzba (Owner) commented Aug 16, 2023

Ah yeah, if you're going heavy on microservices then Mummy is probably not a good fit. It's not an approach to server design I use, so I haven't put in the work to facilitate it better. There are other options out there though, so hopefully you can find something that feels cozy.
