
Conversation

@mihow mihow commented Nov 28, 2023

When a job is queued, the job.run() method is run as a background task with Celery. The task collects all images that need processing and makes batches of POST requests to the ML backend API configured on the selected Pipeline. Results returned from the ML backend API are interpreted and saved as Django model instances. The request and response objects are defined and validated using Pydantic model schemas.
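The request/response contract described above might look roughly like the following. This is a sketch only, using stdlib dataclasses in place of the actual Pydantic schemas so it stays dependency-free; every class and field name here (`SourceImage`, `DetectionResponse`, `detections`, etc.) is a hypothetical illustration, not the project's real schema.

```python
from dataclasses import dataclass

# Hypothetical stand-ins for the Pydantic request/response schemas.
# The real project validates these with Pydantic; plain dataclasses
# are used here only to keep the sketch self-contained.

@dataclass
class SourceImage:
    id: int
    url: str

@dataclass
class PipelineRequest:
    pipeline: str
    source_images: list  # list of SourceImage

@dataclass
class DetectionResponse:
    source_image_id: int
    label: str
    score: float

def parse_results(payload: dict) -> list:
    """Interpret a raw JSON response body from the ML backend into
    typed result objects, which would then be saved as Django models."""
    return [
        DetectionResponse(
            source_image_id=d["source_image_id"],
            label=d["label"],
            score=float(d["score"]),
        )
        for d in payload.get("detections", [])
    ]
```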

At this time, requests are expected to be processed and returned within a reasonable amount of time (about 60 seconds), so batches are kept very small. In the future, larger batches will need a different method for checking the status of, and retrieving results from, each ML backend API.
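Keeping batches small enough that each synchronous request finishes within the ~60 second window could be done with a simple chunking helper like this (a stdlib sketch; the batch size of 4 is an arbitrary illustration, not the project's actual value):

```python
from itertools import islice

def batched(items, batch_size=4):
    """Yield successive small batches of images so that each POST to
    the ML backend stays within the synchronous request window (~60 s).
    batch_size=4 is illustrative only."""
    it = iter(items)
    while batch := list(islice(it, batch_size)):
        yield batch
```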

Also, where and how the Celery background tasks are called should be reviewed. Does each Pipeline type get its own run() method? Should job.run() be synchronous, with the methods inside it being the async functions? Then we could use group and chain based on the specific processing that needs to happen. It's important that results are saved periodically and the user interface is updated.
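One shape the suggestion above could take: a synchronous job.run() that fans batches out to async subtasks (a Celery group/chain in the real app) and saves results after each batch completes, so the job status and UI can be updated incrementally. Below is a stdlib concurrent.futures sketch of that flow only; all names (`run_job`, `process_batch`, `save`) are hypothetical and the real implementation would use Celery primitives instead of a thread pool.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def process_batch(batch):
    """Stand-in for the async subtask that POSTs one batch of images
    to the ML backend and returns its parsed results."""
    return [{"image_id": i, "label": "example"} for i in batch]

def run_job(image_ids, batch_size=4, save=print):
    """Synchronous run(): dispatch batches concurrently (analogous to
    a Celery group), saving results as each batch completes so the
    job status and user interface stay up to date."""
    batches = [
        image_ids[i:i + batch_size]
        for i in range(0, len(image_ids), batch_size)
    ]
    saved = 0
    with ThreadPoolExecutor(max_workers=2) as pool:
        futures = [pool.submit(process_batch, b) for b in batches]
        for fut in as_completed(futures):
            results = fut.result()
            save(results)  # periodic save -> progress visible in the UI
            saved += len(results)
    return saved
```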

netlify bot commented Nov 28, 2023

Deploy Preview for ami-storybook canceled.

🔨 Latest commit: 0c3269e
🔍 Latest deploy log: https://app.netlify.com/sites/ami-storybook/deploys/656a8fbd88164500089df760

netlify bot commented Nov 28, 2023

Deploy Preview for ami-web canceled.

🔨 Latest commit: 0c3269e
🔍 Latest deploy log: https://app.netlify.com/sites/ami-web/deploys/656a8fbd171b6d0008db91b7

@mihow mihow marked this pull request as ready for review December 2, 2023 02:14
@mihow mihow merged commit e692f5f into main Dec 2, 2023
@mihow mihow deleted the feat/ml-backend-results branch February 8, 2024 21:03
