This repository has been archived by the owner on Mar 20, 2023. It is now read-only.
We currently run multiple of these jobs on pre-configured Virtual Machines. The jobs are read off an Azure Storage Queue by a Python script and executed per the instructions in the queue message. If I were to extend this to Azure Batch, running the same jobs in a pre-configured Docker job pool, what is the best way to pass such instructions? Is there a way to pass the job parameters directly via the queue?
@setuc You may want to investigate using an Azure Function queue trigger, which would automatically react to a queue message arriving. Your script (which can be in Python) running as part of the Azure Function trigger would then translate the queue message instructions into Batch Shipyard configuration files. You can then submit the job via Batch Shipyard from within the Azure Function environment, either to an already pre-allocated pool (if you care about minimizing latency) or to an autoscale pool (if you care more about cost optimization).
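As a rough illustration of the translation step, here is a minimal sketch of a function that maps a JSON queue message into a Batch Shipyard jobs configuration dict. The message schema (`job_id`, `image`, `command`) is entirely hypothetical — adapt it to whatever your existing VM-based script reads off the queue — and the exact Batch Shipyard key names should be checked against the jobs configuration documentation for your version:

```python
import json


def queue_message_to_jobs_config(message_body: str) -> dict:
    """Translate a JSON queue message into a Batch Shipyard-style
    jobs configuration dict.

    The incoming message schema here is hypothetical; the output
    mirrors the shape of a Batch Shipyard jobs config (a
    job_specifications list of jobs, each with a list of tasks).
    """
    params = json.loads(message_body)
    return {
        'job_specifications': [{
            'id': params['job_id'],
            'tasks': [{
                # container image and command come straight from the
                # queue message in this sketch
                'docker_image': params['image'],
                'command': params['command'],
            }],
        }],
    }


# Example usage, e.g. inside the body of an Azure Function queue trigger:
msg = json.dumps({
    'job_id': 'job-001',
    'image': 'myregistry/worker:latest',
    'command': 'python run.py',
})
config = queue_message_to_jobs_config(msg)
```

From there, the function body would write `config` out as the jobs configuration file and invoke Batch Shipyard to submit it against the target pool.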
Batch Shipyard now has a site extension available that automates installing it into an Azure Function environment.