
FastTree doesn't support changing the default calibrator #4643

Open
vnarula opened this issue Jan 10, 2020 · 3 comments

Comments

vnarula commented Jan 10, 2020

System information

  • OS version/distro: Windows
  • .NET Version (e.g., dotnet --info): .NET Core

Issue

  • What did you do? Used a FastTree model for fraud protection.
  • What happened? FastTree uses the default calibrator (FixedPlattCalibrator). Our data scientist wants to use a different calibrator, and appending an additional calibrator on top of the default is not the same as using just one calibrator.
  • What did you expect? There should be a way to configure the trainer to use a single, user-chosen calibrator.


antoniovs1029 (Member) commented Jan 10, 2020

Hi @vnarula. Thanks for opening this feature request.

As I suggested offline, there's currently no way to change the calibrator returned by the FastTree trainer. You can only append a calibrator on top of the pipeline, like this:

var pipeline = ML.BinaryClassification.Trainers.FastTree().Append(ML.BinaryClassification.Calibrators.Platt());

As you mention, appending an additional calibrator is not the same as using just one calibrator. However, I still don't understand why appending a new calibrator isn't desired or sufficient in your case. Could you clarify by explaining your use case? Thanks!

justinormont (Member) commented Jan 17, 2020

Adding another calibrator on top of an existing calibrator should work correctly.

I would have to review the code, but my expectation is that the calibrator doesn't modify the Score column, which is the input to a calibrator. The calibrator then produces a Probability column, so adding a second calibrator should only overwrite (hide) the existing Probability column. I'd recommend verifying that the Score column is untouched, as reality may not match my expectation.
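The column behavior described above can be sketched with a small, hypothetical Python illustration (this is not ML.NET code, and the Platt parameters `a` and `b` are arbitrary made-up values): each calibrator reads the untouched Score column and writes its own Probability column, so a second calibrator simply shadows the first one's output while Score stays the same.

```python
import math

def platt_calibrator(a, b):
    """Return a calibrator mapping a raw score to a probability
    via a Platt-style sigmoid: p = 1 / (1 + exp(a * score + b))."""
    def calibrate(score):
        return 1.0 / (1.0 + math.exp(a * score + b))
    return calibrate

# A row with a raw model score; calibrators never modify it.
row = {"Score": 2.5}

# First calibrator (e.g. the trainer's default) writes Probability.
default_cal = platt_calibrator(a=-1.0, b=0.0)
row["Probability"] = default_cal(row["Score"])

# Appending a second calibrator overwrites (hides) Probability,
# but reads the same, untouched Score column.
custom_cal = platt_calibrator(a=-0.5, b=0.1)
row["Probability"] = custom_cal(row["Score"])

assert row["Score"] == 2.5  # Score is unchanged
```

Under this model, appending a second calibrator gives the same result as having used only that calibrator in the first place, which is the point being made in this thread.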

antoniovs1029 (Member) commented Jan 17, 2020

Hi @justinormont, I've tested this (adding another calibrator at the end of the pipeline) and what happens is exactly what you said: it adds another Probability column (hiding the previous one) without modifying the Score column. That's why I still think this should be enough.
