Dear authors,

Thanks for this nice work! I saw the checkpoints are already pushed to the 🤗 hub, which is great: https://huggingface.co/ICTNLP/StreamSpeech_Models/tree/main. However, there are a few things that could be improved to help more people discover your models:
- The model card is currently empty: https://huggingface.co/ICTNLP/StreamSpeech_Models => we could add appropriate tags to the README, like "speech-translation" and "speech-to-speech", which lets people discover your models more easily.
- Download metrics aren't working so far, since there's no integration with a library yet.
To make download stats work for your models, there are a few options.
In case your models are regular nn.Module classes, one can leverage the PyTorchModelHubMixin, which automatically adds push_to_hub and from_pretrained to your custom PyTorch models, ensuring download stats will work. It also uses safetensors by default rather than pickle to store weights, which is considered safer. Usage is as follows:
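A minimal sketch of that pattern (the model class, its layers, and the repo id below are placeholders, not StreamSpeech's actual architecture):

```python
import torch.nn as nn
from huggingface_hub import PyTorchModelHubMixin


# Placeholder model: mix PyTorchModelHubMixin into any regular nn.Module.
class MyModel(nn.Module, PyTorchModelHubMixin):
    def __init__(self, hidden_size: int = 256, vocab_size: int = 1000):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, hidden_size)
        self.head = nn.Linear(hidden_size, vocab_size)

    def forward(self, input_ids):
        return self.head(self.embedding(input_ids))


model = MyModel(hidden_size=256, vocab_size=1000)

# Push the weights (stored as safetensors) to the Hub.
model.push_to_hub("your-username/your-model-name")

# Anyone can then reload the model directly from the Hub,
# and each such load is counted in the repo's download stats.
model = MyModel.from_pretrained("your-username/your-model-name")
```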
We also offer upload_file and upload_folder for pushing files or entire folders to the hub.
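For instance (the local paths and repo id here are again placeholders):

```python
from huggingface_hub import upload_file, upload_folder

# Upload a single checkpoint file to an existing model repo.
upload_file(
    path_or_fileobj="checkpoints/streamspeech.pt",
    path_in_repo="streamspeech.pt",
    repo_id="your-username/your-model-name",
)

# Or push a whole local folder of checkpoints in one call.
upload_folder(
    folder_path="checkpoints/",
    repo_id="your-username/your-model-name",
)
```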
Let me know if you need any help!
Kind regards,
Niels
ML Engineer @ HF 🤗