
Autodistill

Use bigger, slower models to train smaller, faster ones.

Autodistill is an ecosystem for using big, slow foundation models to train small, fast supervised models. Using Autodistill and its associated packages, you can go from unlabeled images to inference on a custom model running at the edge, with no human intervention in between.
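The workflow above can be sketched in plain Python. This is a minimal, self-contained illustration of the auto-distillation loop, not the Autodistill API: every function and name below is a hypothetical stand-in, where a slow "base" model auto-labels raw inputs and those labels train a fast "target" model.

```python
# Illustrative stand-ins only -- not the Autodistill API.

def base_model_label(image):
    """Stand-in for a slow foundation model: labels one image."""
    # Pretend the foundation model detects whether the image is "bright".
    return "bright" if sum(image) / len(image) > 0.5 else "dark"

def train_target_model(dataset):
    """Stand-in for training a fast supervised model on auto-labeled data.
    Learns a single brightness threshold from the labeled examples."""
    bright = [sum(img) / len(img) for img, lbl in dataset if lbl == "bright"]
    dark = [sum(img) / len(img) for img, lbl in dataset if lbl == "dark"]
    threshold = (min(bright) + max(dark)) / 2
    return lambda image: ("bright" if sum(image) / len(image) > threshold
                          else "dark")

# 1. Start with unlabeled images (here: tiny grayscale pixel lists).
unlabeled = [[0.9, 0.8, 0.7], [0.1, 0.2, 0.3],
             [0.6, 0.9, 0.8], [0.2, 0.1, 0.0]]

# 2. The base model labels them automatically -- no human in the loop.
dataset = [(img, base_model_label(img)) for img in unlabeled]

# 3. The auto-generated labels train a small target model
#    that is cheap enough to run at the edge.
target_model = train_target_model(dataset)

print(target_model([0.95, 0.9, 0.85]))  # bright
```

In the real ecosystem, the base and target models are neural networks (for example, a grounded foundation detector labeling data for a compact detector), but the control flow is the same: label, then train.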

Pinned

  1. autodistill

     Images to inference with no labeling (use foundation models to train supervised models).

  2. autodistill-metaclip

     MetaCLIP module for use with Autodistill.

  3. autodistill-grounded-sam

     GroundedSAM Base Model plugin for Autodistill.
