This repository was archived by the owner on Sep 13, 2023. It is now read-only.

Add hardware requirements to serialized MLEM model #379

@aguschin

Description

It would be cool if we could add hardware requirements for inference to the .mlem file.

Something like "to run this NN you need a GPU with 8 GB of RAM (for batch size 16)" or "to run this XGBoost model you need 16 GB of RAM" would be useful to have. It could help users, and in the future it could help us run models from Studio.
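As a rough illustration only (this is not an existing MLEM API; the class and field names below are hypothetical), the serialized metadata could carry a small structured block along these lines:

```python
from dataclasses import dataclass, asdict
from typing import Optional

# Hypothetical sketch of an inference "requirements" block that could be
# embedded in the serialized model metadata; none of these names come from
# the actual MLEM codebase.
@dataclass
class HardwareRequirements:
    ram_gb: Optional[float] = None         # host RAM needed for inference
    gpu_required: bool = False
    gpu_memory_gb: Optional[float] = None  # GPU memory needed, if a GPU is required
    batch_size: Optional[int] = None       # batch size the numbers above assume

# "to run this NN you need a GPU with 8 GB of RAM (for batch size 16)"
nn_reqs = HardwareRequirements(gpu_required=True, gpu_memory_gb=8, batch_size=16)

# "to run this XGBoost model you need 16 GB of RAM"
xgb_reqs = HardwareRequirements(ram_gb=16)

print(asdict(nn_reqs))  # plain dict that could be written into the .mlem file
```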

@mike0sv, do you think it's feasible?

@daavoo, I see potential overlap with DVCLive here, e.g. it could log the RAM/CPU/GPU required for training. But I guess you don't have plans for logging what's required for inference.
