Zeroed weights for entire class #50
The per-class weight is multiplied with the C value to obtain the penalty factor for a class. Setting the weight for label 2 to zero makes that class unweighted in the optimization problem. This in turn means that during the training procedure, classifying wrongly as class 2 is not penalized. Minimizing the objective during training can therefore be accomplished by assigning all samples to that class.
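As a hedged illustration of the first sentence above: the effective penalty for class i is C_i = C * weight_i. The helper below is my own sketch, not part of libsvm's API; the labels and values are made up for the example.

```python
# Hypothetical helper mirroring how libsvm combines the -wi option with C:
# the effective penalty for class i is C_i = C * weight_i.
def per_class_penalty(C, class_weights, labels):
    """Labels missing from class_weights keep the base C (weight 1.0),
    which matches the default when no -wi option is given."""
    return {y: C * class_weights.get(y, 1.0) for y in labels}

penalties = per_class_penalty(C=1.0, class_weights={2: 0.0}, labels=[0, 1, 2])
print(penalties)  # {0: 1.0, 1: 1.0, 2: 0.0} -> errors on class 2 cost nothing
```

With the weight for label 2 set to zero, the effective penalty for class 2 is 0, which is exactly why slack on those samples is free during training.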
libsvm uses 1-vs-1 for multiclass. For each binary problem, if one class has …
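For context on the 1-vs-1 scheme mentioned in the (truncated) comment above: a k-class problem is decomposed into k(k-1)/2 binary subproblems, one per unordered pair of classes. A minimal sketch (the function name is mine, not libsvm's):

```python
from itertools import combinations

def one_vs_one_pairs(labels):
    # Enumerate the binary subproblems a 1-vs-1 scheme trains
    # for a k-class task: one per unordered pair of classes.
    return list(combinations(sorted(set(labels)), 2))

pairs = one_vs_one_pairs([0, 1, 2])
print(pairs)       # [(0, 1), (0, 2), (1, 2)]
print(len(pairs))  # 3 == k*(k-1)/2 for k = 3
```

A zeroed per-class weight therefore affects every binary subproblem that involves that class, which for k classes is k-1 of the subproblems.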
Thanks for the explanation. Do I need to close it? It seems that this is not an issue, at least in mathematical terms.
Yes, please.
These are the contents of the model that I used in the first post:
I have a question again: Thanks in advance.
Please check section 8 of the libsvm paper.
I know that it's a weird usage of class weights, but still, could it be explained somehow? Or fixed?
dataset.txt:
code:
It produces the following in predictions.out:
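The attached dataset and code were not reproduced in this extraction, so the following is only a guessed reproduction using scikit-learn's `SVC` (which wraps libsvm). The toy data, the labels, and the zeroed weight on label 2 are all assumptions based on the discussion; `class_weight={2: 0.0}` plays the role of libsvm's `-w2 0`.

```python
import numpy as np
from sklearn.svm import SVC

# Toy 3-class data (assumed; the original dataset.txt is not available here).
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(20, 2) + 4 * c for c in (0, 1, 2)])
y = np.repeat([0, 1, 2], 20)

# class_weight={2: 0.0} zeroes the effective penalty for class 2,
# analogous to passing -w2 0 to libsvm's training tool.
clf = SVC(kernel="linear", C=1.0, class_weight={2: 0.0}).fit(X, y)
pred = clf.predict(X)

# With a zero penalty, slack on class-2 samples is free, so the binary
# subproblems involving class 2 admit degenerate solutions and the
# predictions no longer reflect the class-2 training samples.
print(sorted(set(pred)))
```

This is a sketch of the reported setup, not the original script; the exact degenerate predictions depend on the data and on how libsvm resolves the boundary cases.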