The current implementation of CART (Classification and Regression Tree) relies only on pre-pruning to control overfitting. Combining pre-pruning with post-pruning has been shown to regularize more effectively while also speeding up inference, since pruned trees are shallower. I propose a lightweight post-pruning heuristic that builds on the impurity system already in place: after the tree is grown, collapse nodes whose impurity decrease falls below a threshold.
I am open to other suggestions.
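To make the proposal concrete, here is a minimal sketch of the heuristic, assuming a node stores its impurity and sample count (all names here, `Node`, `prune_low_gain`, `min_decrease`, are illustrative; a real implementation would reuse the repo's existing impurity machinery):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    impurity: float          # node impurity (e.g. Gini or MSE)
    n_samples: int           # samples reaching this node
    prediction: float        # value returned if the node is a leaf
    left: Optional["Node"] = None
    right: Optional["Node"] = None

    @property
    def is_leaf(self) -> bool:
        return self.left is None and self.right is None

def impurity_decrease(node: Node) -> float:
    """Impurity decrease achieved by this node's split,
    weighting each child by its sample fraction."""
    child_impurity = (
        node.left.n_samples * node.left.impurity
        + node.right.n_samples * node.right.impurity
    ) / node.n_samples
    return node.impurity - child_impurity

def prune_low_gain(node: Node, min_decrease: float) -> Node:
    """Bottom-up post-pruning: turn a split into a leaf when its
    impurity decrease is below `min_decrease`."""
    if node.is_leaf:
        return node
    node.left = prune_low_gain(node.left, min_decrease)
    node.right = prune_low_gain(node.right, min_decrease)
    # Only collapse once both children are leaves, so pruning
    # proceeds from the bottom of the tree upward.
    if (node.left.is_leaf and node.right.is_leaf
            and impurity_decrease(node) < min_decrease):
        node.left = node.right = None
    return node
```

Running it on a toy tree shows the intent: a split that barely reduces impurity is collapsed into a leaf, while a split that cleanly separates its samples is kept.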