
Add generic interface tests from MLJTestIntegration #22

Closed

Conversation

@ablaom (Member) commented Sep 19, 2022

No description provided.

@codecov-commenter commented Sep 19, 2022

Codecov Report

Merging #22 (0125d75) into master (d44debb) will increase coverage by 3.84%.
The diff coverage is n/a.

@@            Coverage Diff             @@
##           master      #22      +/-   ##
==========================================
+ Coverage   89.74%   93.58%   +3.84%     
==========================================
  Files           1        1              
  Lines         156      156              
==========================================
+ Hits          140      146       +6     
+ Misses         16       10       -6     
Impacted Files             | Coverage Δ
src/MLJXGBoostInterface.jl | 93.58% <0.00%> (+3.84%) ⬆️


@ablaom (Member, Author) commented Sep 19, 2022

@ExpandingMan

@ExpandingMan (Collaborator) commented

Thanks, I'll add these in my PR because that'll just be easier.

@ablaom ablaom marked this pull request as draft September 21, 2022 23:47
@ablaom (Member, Author) commented Sep 21, 2022

Marking as draft, as #21 will incorporate these changes.

@ExpandingMan (Collaborator) commented

Yeah, sorry, I was distracted by my XGBoost.jl PR. I will return my attention to this when that is done.

@ExpandingMan (Collaborator) commented

I'm very stumped as to what's going on here... it seems to be complaining that it is getting the wrong num_class argument, which apparently defaults to 1. Part of the problem is that the current XGBoost.jl returns a completely wrong array for classification, in which it cuts off the multi-dimensional output. My guess is that this never worked properly, but in the binary case it didn't matter, because the second class probability is determined by the constraint that it equal $1 - p_{1}$. This will all have to change anyway if my PR gets merged, because I'm outputting proper array sizes, whereas the old version seems to be improperly flattening the arrays.

So if you don't mind I think I'd prefer to see if my PR to XGBoost.jl has a chance of being merged sometime soon before picking up the pieces of whatever happened here.

@ablaom (Member, Author) commented Sep 26, 2022

Yes, as I recall, the existing XGBoost.jl flattens the output probability arrays and special-cases binary probabilities (returning only one probability, instead of two, per observation). This required some logic to sort out, and it is quite possible your refactor has broken it.
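For readers following along: a minimal sketch of the kind of "unflattening" logic being described. This is not the package's actual code; the function name `unflatten_probs` and its layout assumptions (binary output is a vector of positive-class probabilities; multi-class output is a flat vector with `num_class` consecutive entries per observation) are hypothetical illustrations of the behavior discussed above.

```julia
# Hypothetical sketch (not MLJXGBoostInterface's actual implementation) of
# reshaping flattened XGBoost probability output into an n × num_class matrix.
function unflatten_probs(raw::Vector{Float64}, num_class::Int)
    if num_class <= 2
        # Binary special case: only p₁ per observation is returned;
        # the other class probability is the complement 1 - p₁.
        return hcat(1 .- raw, raw)
    else
        n = length(raw) ÷ num_class
        # Each consecutive block of num_class entries is one observation.
        return permutedims(reshape(raw, num_class, n))
    end
end
```

For example, `unflatten_probs([0.2, 0.9], 2)` yields a 2×2 matrix whose rows each sum to 1, and a flat 6-element vector with `num_class = 3` yields a 2×3 matrix.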

@ablaom (Member, Author) commented Nov 16, 2022

Closing, as these tests were added in another, merged PR.

@ablaom ablaom closed this Nov 16, 2022