OneClassSVM returns float in predict #8676

Closed
amueller opened this Issue Mar 31, 2017 · 7 comments

@amueller
Member

amueller commented Mar 31, 2017

It should return int to be consistent with the rest.
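A minimal reproduction sketch (the data here is illustrative; this assumes scikit-learn as of this report, before any fix):

import numpy as np
from sklearn.svm import OneClassSVM

X = np.array([[0.0], [0.1], [0.2], [10.0]])
pred = OneClassSVM(nu=0.1).fit(X).predict(X)
print(pred)        # inlier/outlier labels +1 / -1
print(pred.dtype)  # float64 here, while other estimators return an integer dtype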

@VathsalaAchar
Contributor

VathsalaAchar commented Mar 31, 2017

I'm a new contributor to this repo, could I work on this?

@amueller
Member

amueller commented Mar 31, 2017

Hey @VathsalaAchar. Sure, go for it.
Make sure you include a regression test that ensures the correct behavior.
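For illustration, a hypothetical regression test along these lines (test name and data are made up, not the actual test from the PR):

import numpy as np
from sklearn.svm import OneClassSVM

def test_oneclass_predict_returns_int():
    # predict should return integer labels (+1 / -1), like other classifiers
    X = np.array([[0.0], [0.1], [0.2], [10.0]])
    pred = OneClassSVM(nu=0.1).fit(X).predict(X)
    assert pred.dtype.kind == 'i'  # integer dtype, not float64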

@mppaskov
Contributor

mppaskov commented Apr 6, 2017

I am a new contributor and I am trying to learn my way around the code.

I think I might have found the problem, in svm.cpp. Although the labels are int:

model->label = Malloc(int, nr_class);

in the predict function the value is cast to a double:

double PREFIX(predict_values)(..){ ... return model->label[vote_max_idx]; }

I believe that in order to fix this, the function definition and all top-level calls would have to be changed to int.

Does it seem that I am on the right track?

@rem2016

rem2016 commented Apr 6, 2017

@mppaskov I don't think so. I think we can just override the predict function and change the type to int,
because that is how BaseSVC does it. You can find these lines in its predict function:

y = super(BaseSVC, self).predict(X)
return self.classes_.take(np.asarray(y, dtype=np.intp))

BaseSVC's predict function uses the predict from BaseLibSVM, and BaseLibSVM is the only superclass of OneClassSVM.
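For illustration, a rough sketch of that override on OneClassSVM (hypothetical code, not necessarily the exact change merged in the PR):

# in sklearn/svm/classes.py (sketch)
import numpy as np
from .base import BaseLibSVM

class OneClassSVM(BaseLibSVM):
    ...

    def predict(self, X):
        # BaseLibSVM.predict returns the +1 / -1 labels as floats (from libsvm);
        # cast them to an integer dtype, mirroring what BaseSVC does
        y = super(OneClassSVM, self).predict(X)
        return np.asarray(y, dtype=np.intp)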

@mppaskov
Contributor

mppaskov commented Apr 6, 2017

@rem2016 Thank you for clarifying, I think I went too deep into the rabbit hole. I was a bit skeptical about changing the underlying API calls in libsvm.
So the fix is essentially just applying the same approach from BaseSVC to OneClassSVM. I also believe that this is exactly what @VathsalaAchar implemented in her PR.

@VathsalaAchar
Contributor

VathsalaAchar commented Apr 7, 2017

@amueller I made the changes and the PR has been merged, just letting you know in case you want to close this issue.

@jmschrei
Member

jmschrei commented Apr 7, 2017

Thanks @VathsalaAchar !
