
Removed unnecessary PredictBatch calls #6700

Merged
3 commits merged on Feb 10, 2021

Conversation

@ShvetsKS (Contributor)

In #6693 the following lines were deleted:

```cpp
if (tree_end - tree_begin == 0) {
  return;
}
```

Deleting them introduces overhead during the training stage:

| dataset | #6693 | current PR |
| --- | --- | --- |
| mortgage1q | 20.85 s | 15.78 s |
| airline-ohe | 79.32 s | 37.028 s |
| higgs1m | 18.87 s | 15.73 s |
| msrank | 73.82 s | 66.7 s |

@trivialfis (Member) left a comment


Sorry, the rebasing went wrong.

@trivialfis (Member)

It was supposed to be moved into GBTree to avoid duplicated code, but it somehow got lost in the process. Could you please help move it into PredictBatch in GBTree and Dart? I can open a PR too, my bad. ;-(

@ShvetsKS (Contributor, Author)

> It was supposed to be moved into GBTree to avoid duplicated code, but it somehow got lost in the process. Could you please help move it into PredictBatch in GBTree and Dart? I can open a PR too, my bad. ;-(

No problem, it will be moved :)

@ShvetsKS (Contributor, Author) commented on Feb 10, 2021

@trivialfis it seems that for Dart this is not a problem, since the situation tree_begin == tree_end never occurs there:
https://github.com/dmlc/xgboost/blob/master/src/gbm/gbtree.cc#L662

For GBTree, yes, the check has been moved.

@trivialfis (Member)

Great! Let's just put it in GBTree.

@trivialfis trivialfis merged commit 9a0399e into dmlc:master Feb 10, 2021