Conversation


@vanessavmac vanessavmac commented Apr 4, 2025

Summary

Create a template/framework for running ML backends locally. Users can define detectors, classifiers, and pipelines wrapped in a FastAPI ML backend app.

For a detailed description of how to use the framework, see the READMEs.

  • README.md
  • processing_services/README.md

processing_services contains 2 apps:

  • example: demos how to add custom pipelines/algorithms.
  • minimal: a simple ML backend for basic testing of the processing service API. This minimal app also runs within the main Antenna docker compose stack.

The ML backends can now be run as a separate docker compose stack (i.e. with one or both of the example and minimal processing services running).

NOTE: The ml_backend service inside of the main Antenna docker compose stack is built from the minimal ML backend app.

Related Issues

#802

Screenshots

Screenshot 2025-04-13 at 2 33 50 PM

Deployment Notes

See processing_services/example/README.md


netlify bot commented Apr 4, 2025

Deploy Preview for antenna-preview canceled.

  • Latest commit: 1dbf3b1
  • Latest deploy log: https://app.netlify.com/sites/antenna-preview/deploys/68102533f6fb650008434e4c

@vanessavmac
Collaborator Author

@mohamedelabbas1996 I'll be opening this PR for review soon; it contains the changes I've made to make the ML backend framework easier to customize.

@vanessavmac vanessavmac marked this pull request as ready for review April 5, 2025 23:37
Contributor

Copilot AI left a comment


Copilot reviewed 20 out of 22 changed files in this pull request and generated 3 comments.

Files not reviewed (2)
  • processing_services/example/requirements.txt: Language not supported
  • processing_services/minimal/Dockerfile: Language not supported
Comments suppressed due to low confidence (1)

processing_services/example/api/test.py:30

  • The test hardcodes expected classification labels which might lead to flaky tests if the classifier output is non-deterministic; consider mocking the classifier output for consistent results.
expected_labels = ["lynx, catamount", "beaver"]
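The mocking approach Copilot suggests could look like the following sketch; `run_pipeline` here is a toy stand-in for the real pipeline code in processing_services/example, not its actual API:

```python
from unittest.mock import Mock

def run_pipeline(images, classifier):
    """Toy stand-in for the example pipeline: classify a batch of images."""
    return classifier(images)

def test_expected_labels():
    # Pin the classifier output so the test never depends on live model weights
    # or non-deterministic inference.
    fixed = ["lynx, catamount", "beaver"]
    mock_classifier = Mock(return_value=fixed)
    assert run_pipeline(["a.jpg", "b.jpg"], mock_classifier) == fixed
    mock_classifier.assert_called_once_with(["a.jpg", "b.jpg"])
```

With the classifier mocked, the test exercises the pipeline plumbing rather than the model, so it stays stable even if checkpoints change.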

@vanessavmac vanessavmac requested a review from Copilot April 5, 2025 23:42
Contributor

Copilot AI left a comment


Copilot reviewed 20 out of 22 changed files in this pull request and generated no comments.

Files not reviewed (2)
  • processing_services/example/requirements.txt: Language not supported
  • processing_services/minimal/Dockerfile: Language not supported
Comments suppressed due to low confidence (3)

processing_services/minimal/api/utils.py:39

  • Consider using a more specific exception type (e.g. ValueError) instead of a generic Exception for invalid input.
raise Exception("Specify a URL or path to fetch file from.")
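The suggested change amounts to raising a specific exception type so callers can handle invalid input precisely. A sketch of the idea; the real `get_or_download_file` in utils.py has a different signature, and the helper name here is illustrative:

```python
def resolve_file_source(url=None, path=None):
    """Illustrative helper: pick the provided source, rejecting empty input."""
    if url is None and path is None:
        # ValueError can be caught specifically by callers,
        # unlike a bare Exception which forces a blanket except clause.
        raise ValueError("Specify a URL or path to fetch file from.")
    return url if url is not None else path
```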

processing_services/example/api/test.py:30

  • [nitpick] Hardcoded expected classification labels might be brittle if model outputs change; consider deriving expected values dynamically or using a fixed mock response.
expected_labels = ["lynx, catamount", "beaver"]

processing_services/docker-compose.yml:23

  • Verify that the port mapping change for the ml_backend_example service (mapping to port 2005) is intentional and does not conflict with other services.
- "2005:2000"


vanessavmac commented Apr 6, 2025

@mohamedelabbas1996 I was playing around with adding the Darsa flat-bug detector. My latest commit does some preliminary work: cloning the repo, setting up the Docker environment, and adding the detector pipeline. The pipeline runs, but predicting on the test images in the database doesn't seem to produce any bounding boxes. It might be a problem with the way I'm loading the image, since on Colab the same image produces at least one bounding box. (In the pipeline I'm using the get_or_download_file util to download the image to the Docker container's /tmp folder.)

These are the logs showing what the detector predicts when running the model on CPU locally:

2025-04-06 10:43:09 2025-04-06 14:43:09 - api.utils - INFO - Downloaded to /tmp/antennao8w9wnqn/session_2025-03-16_capture_20250316225100.jpg
2025-04-06 10:43:09 2025-04-06 14:43:09 - api.algorithms - INFO - Predicting /tmp/antennao8w9wnqn/session_2025-03-16_capture_20250316225100.jpg
2025-04-06 10:57:00 2025-04-06 14:57:00 - api.algorithms - INFO - Predicted: {'boxes': [], 'contours': [], 'confs': [], 'classes': [], 'scales': [], 'areas': [], 'image_path': '/tmp/antennao8w9wnqn/session_2025-03-16_capture_20250316225100.jpg', 'image_width': 1024, 'image_height': 768, 'mask_width': 1024, 'mask_height': 768, 'identifier': None}

(Here's a colab link showing what the expected output should be: https://colab.research.google.com/drive/1GNVH4y8hrG49-2kqVDy0oxrHGoXxHc_P?usp=sharing -- there should be 1 bounding box)
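One way to rule out the download step as the cause of the empty predictions is to sanity-check the file written to /tmp before handing it to the detector. A sketch under that assumption; the helper names below are hypothetical, not the actual get_or_download_file API:

```python
import os
import tempfile

def save_to_tmp(data: bytes, suffix=".jpg") -> str:
    """Mimic the download step: persist bytes to a temp file, return its path."""
    fd, path = tempfile.mkstemp(prefix="antenna", suffix=suffix)
    with os.fdopen(fd, "wb") as f:
        f.write(data)
    return path

def check_image_file(path: str) -> int:
    """Fail fast on empty or non-JPEG downloads; returns the file size."""
    size = os.path.getsize(path)
    if size == 0:
        raise ValueError(f"Downloaded file is empty: {path}")
    with open(path, "rb") as f:
        if f.read(2) != b"\xff\xd8":  # JPEG files start with the SOI marker
            raise ValueError(f"File does not look like a JPEG: {path}")
    return size
```

If the file passes these checks but the detector still returns no boxes, the problem is more likely in preprocessing (e.g. color mode or resizing) than in the download itself.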

Relates to #412


vanessavmac commented Apr 9, 2025

TODOs:

  • Clean up docker compose (Separate for each, 1 combined, and add back the minimal backend to the root compose)
  • Use the new schemas, update the pipeline to pass around Detection, not DetectionResponse
  • Instead of the current vit model, use the zero shot object detector and classifier: https://huggingface.co/docs/transformers/en/tasks/zero_shot_object_detection
  • Update the README: it should introduce the concept of processing services and be a clear explanation for someone completely new to them (explain that example does zero-shot detection, and explain how someone can create a new processing service by copying the example directory)

TL;DR: To close this PR, the goal is to have complete READMEs explaining how to make a new processing service and run it, plus a practical example of a working one (ideally a zero-shot detector AND classifier).
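For reference, the zero-shot object detection from the linked transformers docs can be wrapped roughly like this. The OWL-ViT checkpoint name follows the Hugging Face documentation, but the label set and score threshold here are illustrative assumptions, not necessarily what the example app uses:

```python
def candidate_labels(extra=()):
    """Base label set mirroring what the example detects; callers can extend it."""
    base = ["butterfly", "moth", "insect"]
    return base + [label for label in extra if label not in base]

def detect_insects(image_path, labels=None, score_threshold=0.1):
    # Deferred import: loading the OWL-ViT checkpoint is heavy, so keep it
    # out of module import time.
    from transformers import pipeline
    detector = pipeline(
        "zero-shot-object-detection", model="google/owlvit-base-patch32"
    )
    predictions = detector(image_path, candidate_labels=labels or candidate_labels())
    # Each prediction is a dict with "label", "score", and "box" keys.
    return [p for p in predictions if p["score"] >= score_threshold]
```

Because the detector is zero-shot, swapping in a different taxonomy is just a matter of changing the candidate labels, with no retraining.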

@vanessavmac vanessavmac changed the title 747 get antenna to work locally on laptops for panama trip Implement a complete example of an ML backend Apr 9, 2025
@vanessavmac vanessavmac changed the title Implement a complete example of an ML backend Implement complete example of an ML backend Apr 9, 2025

vanessavmac commented Apr 13, 2025

@mihow The PR has been updated to include the zero-shot object detector example. It works quite nicely! It labels the detected objects as butterflies/insects/moths. I've also updated the READMEs with all the details.

Screenshot 2025-04-13 at 2 52 48 PM

In a follow up PR we can address the following points:

  • Update tests: processing_services/example/api/test.py
  • Flat bug: run using ngrok and colab(?)

Collaborator

@mihow mihow left a comment


Epic work Vanessa! Merging!

@mihow mihow merged commit d700c55 into main Apr 30, 2025
6 checks passed
@mihow mihow deleted the 747-get-antenna-to-work-locally-on-laptops-for-panama-trip branch April 30, 2025 02:17