Let's say my classifier is LearningWithNoisyLabels(GridSearchCV(estimator = RandomForestClassifier(), param_grid = ..., cv = ...), cv_n_fold = ...)
What will happen here? Will I get the best parameters from GridSearchCV's cross-validation, and will the model then be re-trained on the cleaned subset of data selected by LearningWithNoisyLabels's cross-validation? Could this produce bad results?
It might work fine if your dataset is relatively large compared to the model's complexity, but for harder classification problems you'll want as much data as possible, and nesting the grid search inside LearningWithNoisyLabels means the search is re-run within each of the wrapper's cross-validation folds on less data.
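One way to avoid re-running the search inside every fold is to tune the hyperparameters once up front and then pass an estimator with the tuned settings to the wrapper. Below is a minimal sketch of that order of operations using scikit-learn only; the cleanlab call is shown as a comment, and its exact keywords (`clf`, `cv_n_folds`) and the toy data are assumptions you should check against your installed cleanlab version.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Toy data standing in for a noisily-labeled dataset (assumption for illustration).
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Step 1: tune hyperparameters once, on the full dataset.
search = GridSearchCV(
    estimator=RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [10, 50], "max_depth": [3, None]},
    cv=3,
)
search.fit(X, y)

# Step 2: hand the tuned configuration to the noisy-labels wrapper, so its
# own cross-validation fits a fixed model instead of repeating the search:
#   from cleanlab.classification import LearningWithNoisyLabels  # hypothetical usage
#   lnl = LearningWithNoisyLabels(
#       clf=RandomForestClassifier(**search.best_params_, random_state=0),
#       cv_n_folds=5,
#   )
#   lnl.fit(X, y)
best_clf = RandomForestClassifier(**search.best_params_, random_state=0)
```

The trade-off is that Step 1 tunes on the noisy labels rather than on cleanlab's pruned subset, but it keeps the total number of fits at one grid search plus one wrapper fit, instead of one grid search per wrapper fold.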