Replies: 1 comment
-
I created a GitHub issue to track adding support for this: #3890
-
We're trying to train our customized models and use GBM to prune features. However, after feature pruning, GBM becomes one of the base models in the ensembling stage. Is it possible to exclude GBM from the ensemble?
Settings:
```python
# use gbm for feature_prune
custom_model_hyperparameters.update({'GBM': {}})

# Define feature_prune_kwargs
feature_prune_kwargs = {
    'n_train_subsample': 50000,
    'n_fi_subsample': 10000,
    'prune_ratio': 0.05,
    'stopping_round': 10,
    'min_improvement': 1e-6,
    'force_prune': True,
}
```
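For context, here is a minimal sketch of how these settings might be assembled and passed to AutoGluon's `TabularPredictor.fit`. The inline descriptions of the pruning parameters are my reading of the parameter names, not official documentation, and `train_data`/`target` are placeholders; the predictor call itself is left commented out:

```python
# Base-model hyperparameters; adding an (empty) 'GBM' entry makes LightGBM
# available, which the question uses as the model for feature pruning.
custom_model_hyperparameters = {
    'GBM': {},  # LightGBM with default settings
}

# Feature-pruning configuration, mirroring the settings above.
feature_prune_kwargs = {
    'n_train_subsample': 50000,   # rows sampled when fitting the pruning model
    'n_fi_subsample': 10000,      # rows sampled per feature-importance pass
    'prune_ratio': 0.05,          # fraction of features considered for removal per round
    'stopping_round': 10,         # stop after this many rounds without improvement
    'min_improvement': 1e-6,      # minimum score gain required to keep pruning
    'force_prune': True,          # prune even when the score does not improve
}

# A fit call would then look roughly like this (not executed here):
# from autogluon.tabular import TabularPredictor
# predictor = TabularPredictor(label='target').fit(
#     train_data,
#     hyperparameters=custom_model_hyperparameters,
#     feature_prune_kwargs=feature_prune_kwargs,
# )
```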
Results:
(screenshot of the results omitted; the original GitHub image link has expired)
Expected Results:
Only CustomPipelineRFModel is used as a base model in the WeightedEnsemble.