I am working on a heterogeneous computing setup, that is:
10 GPU servers for GPU-intensive loads; here we analyze the video part of an MP4;
2 CPU servers for CPU-intensive loads; here we analyze the audio part of an MP4;
I want to submit video tasks to the GPU servers and audio tasks to the CPU servers.
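(For context, this kind of static routing can also be expressed in the Celery configuration; a minimal sketch, where the task names `sigprocess.tasks.analyze_video` and `sigprocess.tasks.analyze_audio` are assumptions for illustration, not from my project:)

```python
from celery import Celery

app = Celery("sigprocess")  # broker/backend configuration omitted for brevity

# Route each task type to its own queue, so video work lands on the GPU
# workers and audio work on the CPU workers. The task names below are
# hypothetical; substitute the real dotted task paths.
app.conf.task_routes = {
    "sigprocess.tasks.analyze_video": {"queue": "video_analysis"},
    "sigprocess.tasks.analyze_audio": {"queue": "audio_analysis"},
}
```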
So, I use one dedicated worker just for coordinating tasks:
```
celery -A sigprocess worker -l info -P eventlet -c 1000 -Q coordinator
```
I tried `group`; I can get the final results, but it submits both kinds of tasks to the same server for a given MP4 file.
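(For reference, a `group` can still route each signature to its own queue through per-signature options; a minimal sketch, where `analyze_video` and `analyze_audio` are hypothetical stand-ins for the real tasks:)

```python
from celery import group, shared_task


@shared_task
def analyze_video(path):  # hypothetical stand-in for the real video task
    ...


@shared_task
def analyze_audio(path):  # hypothetical stand-in for the real audio task
    ...


# Per-signature .set(queue=...) overrides the routing inside the group, so
# the two halves of one MP4 can land on different worker pools.
job = group(
    analyze_video.s("clip.mp4").set(queue="video_analysis"),
    analyze_audio.s("clip.mp4").set(queue="audio_analysis"),
)
async_result = job.apply_async()
```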
I tried to submit the two kinds of tasks to two different queues:
```
celery -A sigprocess worker -l info -P prefork -c 1 -Q video_analysis
celery -A sigprocess worker -l info -P prefork -c 1 -Q audio_analysis
```
using a `multiprocessing.dummy.Pool` with `apply_async(queue='queue_name')` and waiting for the results:
```python
import multiprocessing.dummy
import random
import time

from celery import shared_task

# SigTask and CoordinatorTask are custom base classes defined elsewhere.


@shared_task(bind=True, base=SigTask, ignore_result=False)
def do_task(self, *args, **kwargs):
    i = random.randint(0, 10)
    print(f"Sleeping {i} seconds")
    time.sleep(i)
    print("Sleeping Done")
    return i


def worker(i):
    print(f"Doing work {i}!!!")
    if i % 2 == 0:
        result = do_task.apply_async(queue="audio_analysis")
    else:
        result = do_task.apply_async(queue="video_analysis")
    return result.get()


@shared_task(bind=True, base=CoordinatorTask, ignore_result=False)
def sig_task(self, *args, **kwargs):
    # Fan out to both queues via a thread pool and wait for the results.
    with multiprocessing.dummy.Pool() as pool:
        results = pool.map(worker, range(2))
    return results
```
I found that the two tasks executed successfully on the two Celery workers, but I cannot read the results from the coordinator; it just gets stuck.
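(A possibly relevant note: waiting on a subtask's result from inside another task is a known Celery anti-pattern that can deadlock, and recent Celery versions raise a `RuntimeError` for a bare `result.get()` inside a task unless `disable_sync_subtasks=False` is passed. A minimal sketch of a non-blocking alternative using a `chord` instead of the thread pool, where `merge_results` is a hypothetical callback task and `do_task` is the task defined above:)

```python
from celery import chord, shared_task


@shared_task
def merge_results(results):
    # Hypothetical callback task (an assumption, not from the original code):
    # receives the list of return values from the header tasks.
    return results


# The header signatures keep their per-queue routing; the chord invokes the
# callback once both tasks finish, so nothing blocks on .get() inside the
# coordinator.
async_result = chord(
    [
        do_task.s().set(queue="video_analysis"),
        do_task.s().set(queue="audio_analysis"),
    ]
)(merge_results.s())
```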
Any suggestions on this issue?
Thanks.