If I use the default model, compression is somewhat effective (my data is mostly English text).
However, if I first train a model on the data I am about to compress, and then use that model for compression, the result is 3x larger than the original data.
I used the default options and simply fed it an IEnumerable containing all the strings I want to encode, about 72 megabytes of data.
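The library in question is not identified here, so as a point of comparison, here is a minimal sketch of the same "train on the data, then compress with the trained model" workflow using Python's stdlib `zlib` preset-dictionary mechanism. The sample strings are hypothetical stand-ins for the English text described above; with a dictionary drawn from the data itself, output should not expand, which is what makes the reported 3x blow-up surprising.

```python
import zlib

# Hypothetical stand-ins for the English strings being encoded.
samples = [b"the quick brown fox jumps over the lazy dog"] * 100
data = b"\n".join(samples)

# Baseline: default compression, no trained model.
default_size = len(zlib.compress(data))

# "Trained" compression: a naive model built from the data itself,
# supplied as a zlib preset dictionary (max 32 KiB).
zdict = data[:32768]
c = zlib.compressobj(zdict=zdict)
trained = c.compress(data) + c.flush()
trained_size = len(trained)

# Round-trip with the same dictionary to confirm correctness.
d = zlib.decompressobj(zdict=zdict)
assert d.decompress(trained) + d.flush() == data
```

In this stdlib analogy the preset dictionary can only help (or be neutral), so a trained model producing output larger than the input points at the model being serialized alongside every compressed string, or at the training and compression steps using mismatched options.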