Tuning Neural Networks - Recap

Key Takeaways

The key takeaways from this section include:

  • Validation and test sets are used when iteratively building deep neural networks: the validation set guides modeling decisions, while the test set provides a final, unbiased estimate of performance
  • As with traditional machine learning models, we need to watch out for the bias-variance trade-off when building deep learning models
  • Several regularization techniques can help limit overfitting, including L1 regularization, L2 regularization, and dropout (see the first sketch after this list)
  • Training of deep neural networks can be sped up by using normalized inputs (second sketch below)
  • Normalized inputs can also help mitigate the common problem of vanishing or exploding gradients
  • Alternatives to plain gradient descent include RMSprop, Adam, and gradient descent with momentum (third sketch below)
  • Hyperparameter tuning is crucial when working with deep learning models, as well-chosen settings can lead to substantial improvements in model performance (final sketch below)
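
The sketches below illustrate a few of these ideas in Keras. First, a minimal sketch of L2 and dropout regularization, with a validation split to monitor overfitting during training; the architecture, regularization strengths, and synthetic data are illustrative assumptions, not values from this section's lessons.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers, regularizers

# Synthetic stand-in data (assumed shapes, for illustration only)
X_train = np.random.rand(1000, 20)
y_train = np.random.randint(0, 2, size=(1000,))

model = keras.Sequential([
    # L2 penalty discourages large weights; 0.01 is an assumed strength
    layers.Dense(64, activation='relu', input_shape=(20,),
                 kernel_regularizer=regularizers.l2(0.01)),
    # Dropout randomly zeroes 30% of activations during training
    layers.Dropout(0.3),
    layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy',
              metrics=['accuracy'])

# Holding out 20% of the training data as a validation set lets us
# watch for overfitting while iterating on the architecture
history = model.fit(X_train, y_train, validation_split=0.2,
                    epochs=10, verbose=0)
```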
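
Next, a minimal sketch of input normalization using scikit-learn's StandardScaler, assuming plain NumPy feature arrays. Centering and scaling the inputs speeds up training and helps keep gradients in a reasonable range.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# Unscaled stand-in features (assumed shapes, for illustration only)
X_train = np.random.rand(1000, 20) * 100
X_test = np.random.rand(200, 20) * 100

scaler = StandardScaler()
# Fit the scaler on the training data only, then apply the same
# transform to the test set to avoid leaking test statistics
X_train_scaled = scaler.fit_transform(X_train)
X_test_scaled = scaler.transform(X_test)
```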
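
Third, a sketch of the optimizer alternatives named above as they appear in Keras; the learning rates shown are common defaults, assumed here for illustration.

```python
from tensorflow.keras import optimizers

adam = optimizers.Adam(learning_rate=0.001)
rmsprop = optimizers.RMSprop(learning_rate=0.001)
# Gradient descent with momentum corresponds to SGD with a momentum term
sgd_momentum = optimizers.SGD(learning_rate=0.01, momentum=0.9)

# Any of these can be passed to model.compile(), for example:
# model.compile(optimizer=adam, loss='binary_crossentropy',
#               metrics=['accuracy'])
```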
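
Finally, a minimal sketch of hyperparameter tuning as a simple grid search over the learning rate, scored on a held-out validation split; the grid values, data, and architecture are assumptions for illustration.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Synthetic stand-in data (assumed shapes, for illustration only)
X_train = np.random.rand(1000, 20)
y_train = np.random.randint(0, 2, size=(1000,))

best_lr, best_val_acc = None, 0.0
for lr in [0.1, 0.01, 0.001]:
    # Rebuild the model from scratch for each candidate learning rate
    model = keras.Sequential([
        layers.Dense(64, activation='relu', input_shape=(20,)),
        layers.Dense(1, activation='sigmoid'),
    ])
    model.compile(optimizer=keras.optimizers.Adam(learning_rate=lr),
                  loss='binary_crossentropy', metrics=['accuracy'])
    history = model.fit(X_train, y_train, validation_split=0.2,
                        epochs=5, verbose=0)
    # Score each candidate on the validation set, never the test set
    val_acc = history.history['val_accuracy'][-1]
    if val_acc > best_val_acc:
        best_lr, best_val_acc = lr, val_acc

print(f"Best learning rate: {best_lr} (val accuracy {best_val_acc:.3f})")
```

The same pattern scales up to grids over several hyperparameters at once or to random search: train each candidate on one split, score it on another, and keep the test set untouched until the very end.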
