This repository was archived by the owner on Sep 13, 2023. It is now read-only.

Description
Would be cool if we could add hardware requirements for inference to the .mlem file.
Something like "to run this NN you need a GPU with 8GB of RAM (for batch size 16)" or "to run this XGBoost model you need 16GB of RAM" would be useful to have. It could help users, and in the future it could help us run models from Studio.
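For illustration, such metadata might look something like the sketch below. This is only a hypothetical shape — the `requirements` section and its field names (`ram`, `vram`, `batch_size`) are not part of the current .mlem spec:

```yaml
# Hypothetical sketch — the `requirements` block is a proposal, not existing .mlem schema
object_type: model
model_type:
  type: xgboost
requirements:
  inference:
    ram: 16GB          # "to run this XGBoost model you need 16GB RAM"
    gpu:
      required: true
      vram: 8GB        # GPU memory needed at the batch size below
      batch_size: 16
```

Tools (or Studio) could then read this block before loading the model and warn early if the host doesn't meet the stated requirements.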
@mike0sv, do you think it's feasible?
@daavoo, I see a potential overlap with DVCLive here, e.g. logging the RAM/CPU/GPU required for training. But I guess you don't have plans for logging what's required for inference.