It's a bug here anyway. The default gamma was not set successfully (so it was 0), which then caused the error.
"out of host memory" comes from the default batch size (10000) of prediction after training. Since news20 has high dimension (1,355,191), 10000 * 1355191 is too large.
Hi,
I just ran
./thundersvm-train /tmp2/b03902086/data/news20.binary
on a workstation machine, and it resulted in the following:

The machine has 70 GB of memory, and the news20 dataset is only 134 MB.
Is there anything I'm missing?
Thanks in advance!