Adding KGB support #2408

Closed
wants to merge 2 commits into from

Conversation

TakeOver
Contributor

I hereby agree to the terms of the CLA available at: https://yandex.ru/legal/cla/.

Based on https://arxiv.org/abs/2206.05608

I'm a bit rusty with the C++ part of the codebase, and some changes are still needed to fully match what is implemented in the paper. They are not essential parts of the approach, though, and the feature can fairly be added without them without violating the theory in the paper.
Nevertheless, here is the list of changes that still need to be made:

  1. Add an option that modifies the random_strength behavior by removing the dependence on model_length and by using Gumbel noise (-log(-log(uniform))) instead of normal noise (see the first sketch after this list).
  2. Current prior sampling is done by a somewhat tricky (though correct) procedure that first ensures, via the eps parameter, that the prior trees are uniformly distributed, and then manually overrides the leaf values to make them Gaussian-distributed (see the second sketch after this list). I think this step could be incorporated directly into train-one-iteration, but the amount of changes needed to do that is probably not worth it.
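
For reference, here is a minimal sketch of the noise change from item 1. The helper names and the exact scaling are illustrative assumptions, not actual CatBoost internals:

```cpp
#include <cmath>
#include <random>

// Current behavior (roughly): normal noise whose scale depends on
// random_strength and shrinks with model_length. The scaling below is
// illustrative only, not the real CatBoost formula.
double NormalScoreNoise(std::mt19937_64& rng, double randomStrength, double modelLength) {
    std::normal_distribution<double> normal(0.0, 1.0);
    return randomStrength * normal(rng) / modelLength;
}

// Proposed behavior: standard Gumbel noise, -log(-log(U)) with U ~ Uniform(0, 1),
// and no model_length dependence.
double GumbelScoreNoise(std::mt19937_64& rng, double randomStrength) {
    // Bound U away from 0 so that log(U) stays finite.
    std::uniform_real_distribution<double> uniform(1e-16, 1.0);
    return randomStrength * -std::log(-std::log(uniform(rng)));
}
```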
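And a minimal sketch of the leaf-value override from item 2, again with hypothetical names (TPriorTree and OverrideLeafValuesWithGaussian are stand-ins, not the real API):

```cpp
#include <random>
#include <vector>

// Hypothetical stand-in for a sampled prior tree.
struct TPriorTree {
    std::vector<double> LeafValues; // one value per leaf
};

// After the tree structure has been sampled (approximately uniformly, as
// ensured via the eps parameter), overwrite each leaf value with an
// independent Gaussian draw so that the prior over functions is Gaussian.
void OverrideLeafValuesWithGaussian(TPriorTree& tree, double priorSigma, std::mt19937_64& rng) {
    std::normal_distribution<double> normal(0.0, priorSigma);
    for (double& leafValue : tree.LeafValues) {
        leafValue = normal(rng);
    }
}
```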
