
Commit 610aa63: updated docs
davisking committed Jun 6, 2020
1 parent 3d738e1
Showing 4 changed files with 59 additions and 1 deletion.
8 changes: 8 additions & 0 deletions docs/docs/main_menu.xml
@@ -224,10 +224,18 @@
<name>Deep Learning Introduction Part 2</name>
<link>dnn_introduction2_ex.cpp.html</link>
</item>
<item>
<name>Deep Learning Introduction Part 3</name>
<link>dnn_introduction3_ex.cpp.html</link>
</item>
<item>
<name>Deep Learning Imagenet Classifier</name>
<link>dnn_imagenet_ex.cpp.html</link>
</item>
<item>
<name>Deep Learning DCGAN</name>
<link>dnn_dcgan_train_ex.cpp.html</link>
</item>
<item>
<name>Deep Learning Imagenet Trainer</name>
<link>dnn_imagenet_train_ex.cpp.html</link>
18 changes: 18 additions & 0 deletions docs/docs/ml.xml
@@ -204,6 +204,14 @@ Davis E. King. <a href="http://jmlr.csail.mit.edu/papers/volume10/king09a/king09
<name>avg_pool</name>
<link>dlib/dnn/layers_abstract.h.html#avg_pool_</link>
</item>
<item>
<name>leaky_relu</name>
<link>dlib/dnn/layers_abstract.h.html#leaky_relu_</link>
</item>
<item>
<name>mish</name>
<link>dlib/dnn/layers_abstract.h.html#mish_</link>
</item>
<item>
<name>relu</name>
<link>dlib/dnn/layers_abstract.h.html#relu_</link>
@@ -269,6 +277,14 @@ Davis E. King. <a href="http://jmlr.csail.mit.edu/papers/volume10/king09a/king09
<name>loss_binary_log</name>
<link>dlib/dnn/loss_abstract.h.html#loss_binary_log_</link>
</item>
<item>
<name>loss_multiclass_log_weighted</name>
<link>dlib/dnn/loss_abstract.h.html#loss_multiclass_log_weighted_</link>
</item>
<item>
<name>loss_binary_log_per_pixel</name>
<link>dlib/dnn/loss_abstract.h.html#loss_binary_log_per_pixel_</link>
</item>
<item>
<name>loss_multimulticlass_log</name>
<link>dlib/dnn/loss_abstract.h.html#loss_multimulticlass_log_</link>
@@ -527,6 +543,8 @@ Davis E. King. <a href="http://jmlr.csail.mit.edu/papers/volume10/king09a/king09
<examples>
<example>dnn_introduction_ex.cpp.html</example>
<example>dnn_introduction2_ex.cpp.html</example>
<example>dnn_introduction3_ex.cpp.html</example>
<example>dnn_dcgan_train_ex.cpp.html</example>
<example>dnn_inception_ex.cpp.html</example>
<example>dnn_imagenet_ex.cpp.html</example>
<example>dnn_imagenet_train_ex.cpp.html</example>
30 changes: 29 additions & 1 deletion docs/docs/release_notes.xml
@@ -11,6 +11,34 @@
<!-- ************************************************************************************** -->

<current>
New Features and Improvements:
- Added a CUDA implementation for loss_mean_squared_per_channel_and_pixel.
- Added a DCGAN example in examples/dnn_dcgan_train_ex.cpp.
- Added a transfer learning example in examples/dnn_introduction3_ex.cpp.
- Added the leaky_relu activation layer.
- Added the mish activation layer.
- Added loss_multiclass_log_weighted.
- Added loss_binary_log_per_pixel.
- Minor API simplifications in the deep learning tooling.
- imglab can now automatically select the number of image clusters when --cluster 0 is given.
- Added a relative epsilon termination option to svm_c_linear_trainer.
- Added support for newer versions of OpenCV that no longer provide IplImage.

Non-Backwards Compatible Changes:

Bug fixes:
- Corrected interpolate_bilinear for lab_pixel.
- Fixed build errors with CUDA 10.2.
- Made equal_error_rate() handle the degenerate case where all scores are equal.
- Fixed DLIB_ISO_CPP_ONLY not working.
- Fixed build errors in C++20 mode.
- Fixed function_evaluation_request::set() invalidating function_evaluation_request::x().

</current>

<!-- ************************************************************************************** -->

<old name="19.19" date="Dec 14, 2019">
New Features and Improvements:
- Made find_min_global() and find_max_global() much faster when using them
to optimize inexpensive functions. These tools now measure the runtime of
@@ -29,7 +57,7 @@ Bug fixes:
- Fix find_max() going into an infinite loop in some cases when a
non-differentiable function is given.

</current>
</old>

<!-- ************************************************************************************** -->

4 changes: 4 additions & 0 deletions docs/docs/term_index.xml
@@ -132,6 +132,8 @@
<term file="dlib/dnn/loss_abstract.h.html" name="loss_binary_hinge_" include="dlib/dnn.h"/>
<term file="dlib/dnn/loss_abstract.h.html" name="loss_binary_log_" include="dlib/dnn.h"/>
<term file="dlib/dnn/loss_abstract.h.html" name="loss_multimulticlass_log_" include="dlib/dnn.h"/>
<term file="dlib/dnn/loss_abstract.h.html" name="loss_multiclass_log_weighted_" include="dlib/dnn.h"/>
<term file="dlib/dnn/loss_abstract.h.html" name="loss_binary_log_per_pixel_" include="dlib/dnn.h"/>
<term file="dlib/dnn/loss_abstract.h.html" name="loss_multiclass_log_" include="dlib/dnn.h"/>
<term file="dlib/dnn/loss_abstract.h.html" name="loss_multiclass_log_per_pixel_" include="dlib/dnn.h"/>
<term file="dlib/dnn/loss_abstract.h.html" name="loss_multiclass_log_per_pixel_weighted_" include="dlib/dnn.h"/>
@@ -170,6 +172,8 @@
<term file="dlib/dnn/layers_abstract.h.html" name="max_pool_" include="dlib/dnn.h"/>
<term file="dlib/dnn/layers_abstract.h.html" name="avg_pool_" include="dlib/dnn.h"/>
<term file="dlib/dnn/layers_abstract.h.html" name="relu_" include="dlib/dnn.h"/>
<term file="dlib/dnn/layers_abstract.h.html" name="leaky_relu_" include="dlib/dnn.h"/>
<term file="dlib/dnn/layers_abstract.h.html" name="mish_" include="dlib/dnn.h"/>
<term file="dlib/dnn/layers_abstract.h.html" name="prelu_" include="dlib/dnn.h"/>
<term file="dlib/dnn/layers_abstract.h.html" name="sig_" include="dlib/dnn.h"/>
<term file="dlib/dnn/layers_abstract.h.html" name="htan_" include="dlib/dnn.h"/>
