Convert models from source formats (PyTorch/ONNX) into optimized inference formats and publish them to Hugging Face Hub.
Each subdirectory under models/ is a self-contained conversion recipe:
```
models/
└── bge-base-en-v1.5-coreml/
    ├── convert.py          # Conversion script
    ├── config.yaml         # Source model, target format, HF repo
    └── README_TEMPLATE.md  # Model card for HF
```
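A recipe's `config.yaml` carries the three pieces named in the tree above. A minimal sketch — the field names here are illustrative, not a fixed schema:

```yaml
# Hypothetical field names; only source model, target format, and HF repo
# are assumed from the recipe layout above.
source_model: BAAI/bge-base-en-v1.5          # HF repo of the source model
target_format: coreml                        # conversion target
hf_repo: rsvalerio/bge-base-en-v1.5-coreml   # where the artifacts get published
```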
- Create `models/<name>/config.yaml` with the source model and target format
- Write `models/<name>/convert.py` to produce the converted artifacts
```
cd models/bge-base-en-v1.5-coreml
uv run python convert.py
```

Requires a `HF_TOKEN` environment variable with write access to the target repo.
```
HF_TOKEN=hf_... uv run python scripts/publish.py \
  --repo rsvalerio/bge-base-en-v1.5-coreml \
  --artifacts-dir models/bge-base-en-v1.5-coreml \
  --config models/bge-base-en-v1.5-coreml/config.yaml
```

Requirements:

- uv
- `HF_TOKEN` with write access to target HF repos (for publishing)
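The publish step can be sketched with `huggingface_hub` — an assumed structure for `scripts/publish.py` matching the flags shown above; the real script may do more (e.g. render the model card from `README_TEMPLATE.md`):

```python
import argparse
import os

from huggingface_hub import HfApi


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(description="Upload converted artifacts to HF Hub.")
    parser.add_argument("--repo", required=True, help="target repo, e.g. user/model")
    parser.add_argument("--artifacts-dir", required=True, help="directory to upload")
    parser.add_argument("--config", required=True, help="recipe config.yaml")
    return parser


def main(argv=None) -> None:
    args = build_parser().parse_args(argv)
    api = HfApi(token=os.environ["HF_TOKEN"])  # needs write access
    api.create_repo(args.repo, exist_ok=True)
    api.upload_folder(folder_path=args.artifacts_dir, repo_id=args.repo)
```

Invoked as shown above, `main()` creates the repo if it does not exist and uploads every file under the artifacts directory.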