
Commit

update doc
aksnzhy committed Dec 9, 2017
1 parent ba569a5 commit d791603
Showing 6 changed files with 4 additions and 4 deletions.
Binary file modified _build/doctrees/command_line.doctree
Binary file not shown.
Binary file modified _build/doctrees/environment.pickle
Binary file not shown.
2 changes: 1 addition & 1 deletion _build/html/_sources/command_line.rst.txt
@@ -123,7 +123,7 @@ A portion of xLearn's output: ::
9 0.404554 0.546218 0.00

Here we can see that, the training loss continuously goes down. While, the validation loss (test loss) goes
-down first, and then goes up. This is because our model is overfit current training data set.
+down first, and then goes up. This is because our model has already overfitted current training data set.



2 changes: 1 addition & 1 deletion _build/html/command_line.html
@@ -287,7 +287,7 @@ <h2>Set Validation Dataset<a class="headerlink" href="#set-validation-dataset" t
</pre></div>
</div>
<p>Here we can see that, the training loss continuously goes down. While, the validation loss (test loss) goes
-down first, and then goes up. This is because our model is overfit current training data set.</p>
+down first, and then goes up. This is because our model has already overfitted current training data set.</p>
<blockquote>
<div><div class="toctree-wrapper compound">
</div>
2 changes: 1 addition & 1 deletion _build/html/searchindex.js

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion command_line.rst
@@ -123,7 +123,7 @@ A portion of xLearn's output: ::
9 0.404554 0.546218 0.00

Here we can see that, the training loss continuously goes down. While, the validation loss (test loss) goes
-down first, and then goes up. This is because our model is overfit current training data set.
+down first, and then goes up. This is because our model has already overfitted current training data set.



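The sentence modified by this commit describes the classic overfitting signal: the training loss keeps falling while the validation (test) loss reaches a minimum and then climbs. As an illustration only (not part of this commit, and not xLearn's API), the following Python sketch shows how that turning point can be read off a logged loss table; the per-epoch loss values below are hypothetical, merely shaped like the output excerpted above: ::

    # Illustrative sketch, not xLearn code: locate the epoch with the lowest
    # validation loss, i.e. the point after which the model starts to overfit.

    # Hypothetical per-epoch losses (training loss keeps falling, validation
    # loss falls and then rises again, as described in the documentation text).
    train_loss = [0.58, 0.52, 0.48, 0.45, 0.43, 0.42, 0.41, 0.406, 0.404]
    valid_loss = [0.60, 0.57, 0.55, 0.54, 0.538, 0.540, 0.543, 0.545, 0.546]

    # The model worth keeping is the one from the epoch with minimal
    # validation loss; later epochs mostly memorize the training set.
    best = min(range(len(valid_loss)), key=lambda i: valid_loss[i])
    print("best epoch: %d (validation loss %.6f)" % (best + 1, valid_loss[best]))

    for epoch, (tr, va) in enumerate(zip(train_loss, valid_loss), start=1):
        note = "  <-- validation loss starts rising after this epoch" if epoch == best + 1 else ""
        print("%2d   train=%.6f   valid=%.6f%s" % (epoch, tr, va, note))

xLearn prints a loss table of this shape when a validation set is supplied (the "Set Validation Dataset" section touched by this commit); the sketch only makes the prose in the diff concrete.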

0 comments on commit d791603
