I ran memory profiling for the code in #103: the spago version uses 3.9 GB, compared to 1.2 GB for the Python version. The model sizes are similar: valhalla/distilbart-mnli-12-3 is 2.5 GB after transforming it to spago with hf-importer, whereas the upstream Python version is 2.1 GB.
Is this expected?
Spago could be very useful in low-memory environments such as ARM SBCs for CPU-bound inference, but the memory usage needs to be optimized.
The Python version also seems faster in overall operation timing, because loading the configuration and model weights takes variable time in spago.
[Screenshot: Memory profiling in spago]
[Screenshot: Memory profiling in Python]