
Trying to convert dinov2 model #1305

@jdp8


Question

I tried to convert this model using the following command:

python -m scripts.convert --model_id nguyenkhoa/dinov2_Liveness_detection_v2.2.3 --quantize --task image-classification

but got the following error:

ValueError: Trying to export a dinov2 model, that is a custom or unsupported architecture, but no custom onnx configuration was passed as `custom_onnx_configs`. Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models. Please open an issue at https://github.com/huggingface/optimum/issues if you would like the model type dinov2 to be supported natively in the ONNX export.

I looked a bit into the custom_onnx_configs flag and found this conversion example. My question is: what should I pass to custom_onnx_configs for the conversion to work? I could pass the GPT-2 config used in the example, but I'm wondering what the correct custom_onnx_configs input is for dinov2 models.
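
For reference, based on the linked optimum guide, I'm guessing the approach would look something like the sketch below if I drop down to optimum's Python API instead of the transformers.js convert script. It reuses ViTOnnxConfig on the assumption that dinov2 takes the same single pixel_values input as ViT; I haven't verified that this works:

from transformers import AutoConfig
from optimum.exporters.onnx import main_export
from optimum.exporters.onnx.model_configs import ViTOnnxConfig

model_id = "nguyenkhoa/dinov2_Liveness_detection_v2.2.3"

# Assumption: reuse the ViT ONNX config for the dinov2 architecture,
# since both expect a single `pixel_values` input.
config = AutoConfig.from_pretrained(model_id)
onnx_config = ViTOnnxConfig(config, task="image-classification")

main_export(
    model_id,
    output="dinov2_onnx",
    task="image-classification",
    custom_onnx_configs={"model": onnx_config},
)

Is reusing the ViT config the right approach here, or does dinov2 need its own ONNX config class?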

Thank you!
