Tune DNN hyperparameters #145
Done for this round of active learning, see #181 and …
Reopening to obtain better model performance on inference fields.
Closing again; currently using the full training set for each classifier rather than limiting sources with the …
@bfhealy In principle we can turn this on or off at the class level?
@mcoughlin Yes, the config file allows us to customize these parameters for each class going forward.
@bfhealy cool!
The config file contains hyperparameters that may be adjusted for each class to attain optimal DNN performance. We should experiment with these values to determine an appropriate combination for each class. The hyperparameters include:
- `threshold`: the minimum probability used to determine positive examples of a class
- `balance`: the ratio of over- to underrepresented examples in a class
- `weight_per_class`: boolean; if True, the loss function responds differently to positive vs. negative class examples based on their counts

The architecture of the DNN is also controlled by hyperparameters (number of neurons, dropout fractions, convolution windows, etc.). For now I think we should maintain the DNN structure as-is (consistent with the published architecture) and experiment with the above values in the config file. An alternative architecture is shown in panel (b) of Fig. 8, but with the warning that this architecture tends to have greater variance than the current model.
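To make the knobs above concrete, here is a minimal Python sketch of how `threshold`, `balance`, and `weight_per_class` could map onto training-set construction. This is not the repo's actual implementation: the class name (`vnv`), the config layout, and the `probs` array are hypothetical stand-ins.

```python
# Minimal sketch (assumed config layout, not the repo's real code) of how the
# three per-class hyperparameters could be applied when building a training set.
import numpy as np

config = {
    "vnv": {                    # hypothetical class name
        "threshold": 0.7,       # min probability to count as a positive example
        "balance": 1.1,         # ratio of over- to underrepresented examples
        "weight_per_class": True,  # weight the loss by class frequency
    }
}

rng = np.random.default_rng(42)
probs = rng.random(1000)  # stand-in for per-source classification probabilities

params = config["vnv"]

# threshold: binarize probabilities into positive/negative labels.
labels = (probs >= params["threshold"]).astype(int)

# balance: downsample the overrepresented class to the requested ratio.
pos_idx, neg_idx = np.flatnonzero(labels == 1), np.flatnonzero(labels == 0)
minority, majority = sorted((pos_idx, neg_idx), key=len)
keep = min(len(majority), int(params["balance"] * len(minority)))
majority = rng.choice(majority, size=keep, replace=False)
train_idx = np.concatenate([minority, majority])

# weight_per_class: inverse-frequency weights for the loss function,
# in the dict form that Keras accepts for its `class_weight` argument.
class_weight = None
if params["weight_per_class"]:
    n = len(train_idx)
    n_pos = int(labels[train_idx].sum())
    class_weight = {0: n / (2 * (n - n_pos)), 1: n / (2 * n_pos)}

print(f"{len(train_idx)} training examples, class_weight={class_weight}")
```

Since these values live per class in the config file, each classifier's positive-example cutoff, balance ratio, and loss weighting can be tuned independently as discussed above.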