
fixed minor typo

anestisdotpy committed Jun 3, 2019
1 parent 38f1757 commit 34fe611c537aceac88795b4c530cbcbf62000ef4
Showing with 1 addition and 1 deletion.
  1. +1 −1 site/en/tutorials/keras/overfit_and_underfit.ipynb
@@ -109,7 +109,7 @@
"\n",
"In both of the previous examples—classifying movie reviews, and predicting fuel efficiency—we saw that the accuracy of our model on the validation data would peak after training for a number of epochs, and would then start decreasing.\n",
"\n",
"In other words, our model would *overfit* to the training data. Learning how to deal with overfitting is important. Although it's often possible to achieve high accuracy on the *training set*, what we really want is to develop models that generalize well to a *testing data* (or data they haven't seen before).\n",
"In other words, our model would *overfit* to the training data. Learning how to deal with overfitting is important. Although it's often possible to achieve high accuracy on the *training set*, what we really want is to develop models that generalize well to a *testing set* (or data they haven't seen before).\n",
"\n",
"The opposite of overfitting is *underfitting*. Underfitting occurs when there is still room for improvement on the test data. This can happen for a number of reasons: If the model is not powerful enough, is over-regularized, or has simply not been trained long enough. This means the network has not learned the relevant patterns in the training data.\n",
"\n",
