
Concurrency inside of a given process #432

Closed
SlovakianCanon opened this issue Mar 16, 2021 · 0 comments
Comments

@SlovakianCanon

I've been trying all kinds of patterns to achieve what I need, and I've had only limited success so far.

Let's say I have a job type that's very CPU-intensive. I want to be able to process multiple jobs of that type at the same time, so I scaled horizontally and created multiple worker instances on different servers.

The one particular constraint I have is that for each user (< 50 users, let's say 30 for this example) there must not be more than one job running at any given time. So if user1, user2, user3, and user4 all start a job at the same time, that's fine: they can all be executed in parallel by the servers. But if user1 creates 4 jobs in a row, I need to wait for job1 to finish before job2 starts (because the result of job1 can influence the execution of job2).

To achieve this, the only way I know of is to use one queue per user, with a concurrency of 1. The problem with this approach is that if I register a worker for each of those queues, and I have 30 queues, there's the potential of running 30 jobs at once on a single server (one job in each queue). Since these jobs are CPU-intensive, they basically slow the server to a crawl.
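
For reference, this is roughly what my current setup looks like. It's just a sketch against BullMQ's Worker API; the queue names and heavyComputation are placeholders, not my real code:

```ts
import { Worker } from 'bullmq';

const connection = { host: 'localhost', port: 6379 };

// Placeholder for the actual CPU-bound work.
async function heavyComputation(data: unknown): Promise<unknown> {
  return data;
}

// One queue per user (names made up); concurrency 1 on each worker
// guarantees that a given user's jobs run strictly one after another.
const userIds = ['user1', 'user2', 'user3']; // ...up to ~30

const workers = userIds.map(
  (userId) =>
    new Worker(`jobs-${userId}`, async (job) => heavyComputation(job.data), {
      connection,
      concurrency: 1,
    })
);
```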

The fix seems easy in theory: limit the concurrency inside a given server to N, ideally the number of cores. So if there is only one server with two cores, and 4 users register jobs in 4 queues, the first two will be picked up and executed, and the other two will wait until a slot frees up.
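
The only workaround I can think of is wrapping every processor in a hand-rolled in-process semaphore, roughly like the sketch below (plain Node, nothing library-specific, reusing the heavyComputation placeholder from above):

```ts
import * as os from 'os';

// Hand-rolled counting semaphore: at most `free` jobs run at once in this process.
class Semaphore {
  private waiting: Array<() => void> = [];
  constructor(private free: number) {} // `free` starts at the number of slots

  async acquire(): Promise<void> {
    if (this.free > 0) {
      this.free--;
      return;
    }
    await new Promise<void>((resolve) => this.waiting.push(resolve));
  }

  release(): void {
    const next = this.waiting.shift();
    if (next) next(); // hand the slot straight to the next waiter
    else this.free++;
  }
}

// One slot per core on this server.
const cpuSlots = new Semaphore(os.cpus().length);

// Wrapping every per-user processor in the semaphore caps this server at N
// concurrent jobs, while each per-user queue still runs at most one at a time.
// This would replace the inline processor passed to each Worker above.
async function limitedProcessor(job: { data: unknown }) {
  await cpuSlots.acquire();
  try {
    return await heavyComputation(job.data);
  } finally {
    cpuSlots.release();
  }
}
```

But then jobs get pulled off their queues and just sit inside the processor waiting for a local slot, which doesn't feel right.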

Is there a way to do this? I feel like I tried everything but maybe there's a simpler solution I didn't think about.

@manast manast closed this as completed Mar 17, 2021
@taskforcesh taskforcesh locked and limited conversation to collaborators Mar 17, 2021

This issue was moved to a discussion.

You can continue the conversation there. Go to discussion →
