Commit
figured out good way to handle footnotes/references
bfortuner committed Apr 23, 2017
1 parent dfd287c commit 77b1da9
Showing 20 changed files with 396 additions and 380 deletions.
11 changes: 7 additions & 4 deletions docs/activation_functions.rst
@@ -1,12 +1,10 @@
.. _activation_functions:

====================
Activation functions
Activation Functions
====================

.. toctree::
:maxdepth: 1
:titlesonly:
.. contents:: :local:

ELU
===
@@ -95,3 +93,8 @@ Tanh
====

Be the first to contribute!


.. rubric:: References

.. [1] Example
5 changes: 3 additions & 2 deletions docs/backpropagation.rst
@@ -4,6 +4,8 @@
Backpropagation
===============

.. contents:: :local:

The goals of backpropagation are straightforward: adjust each weight in the network in proportion to how much it contributes to overall error. If we iteratively reduce each weight's error, eventually we’ll have a series of weights that produce good predictions.
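The proportional-update idea above can be sketched as a single gradient-descent step (a minimal illustration with hypothetical names, assuming a scalar weight ``w``, a learning rate ``lr``, and a precomputed error gradient ``grad``):

```python
def update_weight(w, grad, lr=0.01):
    """Adjust a weight in proportion to its contribution to the error.

    `grad` is dE/dw: how much the error changes per unit change in w.
    Stepping against the gradient reduces the error.
    """
    return w - lr * grad

# A weight with a large gradient receives a larger correction
# than one with a small gradient.
print(update_weight(0.5, grad=2.0, lr=0.1))
print(update_weight(0.5, grad=0.2, lr=0.1))
```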


@@ -179,5 +181,4 @@ Code example

.. rubric:: References

- Reference1

.. [1] Example
43 changes: 12 additions & 31 deletions docs/calculus.rst
@@ -4,7 +4,7 @@
Calculus
========

Brief overview of derivatives, gradients and the chain rule.
.. contents:: :local:

.. _derivative:

@@ -279,7 +279,7 @@ There are two additional properties of gradients that are especially useful in d

.. _chain_rule:

Chain Rule
Chain rule
==========

The chain rule is a formula for calculating the derivatives of composite functions. Composite functions are functions composed of functions inside other function(s).
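As a quick sanity check of the formula, the chain rule result can be verified against a numerical derivative (a minimal sketch; the composite function h(x) = (3x + 1)² is chosen here purely for illustration):

```python
def h(x):
    # Composite function: outer f(u) = u**2, inner g(x) = 3*x + 1
    return (3 * x + 1) ** 2

def h_prime(x):
    # Chain rule: f'(g(x)) * g'(x) = 2*(3x + 1) * 3
    return 2 * (3 * x + 1) * 3

# Compare against a central finite-difference approximation
eps = 1e-6
x = 2.0
numeric = (h(x + eps) - h(x - eps)) / (2 * eps)
print(h_prime(x), round(numeric, 3))  # both approximately 42.0
```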
@@ -418,35 +418,16 @@ We then input the derivatives and simplify the expression:
\end{align}
.. rubric:: References

.. [#] `Khan Academy Derivatives Introduction <https://www.khanacademy.org/math/multivariable-calculus/multivariable-derivatives/partial-derivative-and-gradient-articles/a/directional-derivative-introduction>`_
* https://en.wikipedia.org/wiki/Derivative
* https://en.wikipedia.org/wiki/Partial_derivative
* https://en.wikipedia.org/wiki/Gradient
* https://betterexplained.com/articles/vector-calculus-understanding-the-gradient/
* https://www.khanacademy.org/math/multivariable-calculus/multivariable-derivatives/partial-derivative-and-gradient-articles/a/the-gradient
* https://www.khanacademy.org/math/multivariable-calculus/multivariable-derivatives/gradient-and-directional-derivatives/v/gradient-and-contour-maps
* https://www.mathsisfun.com/calculus/derivatives-introduction.html
* http://tutorial.math.lamar.edu/Classes/CalcI/DefnOfDerivative.aspx
* http://www.algebrahelp.com/lessons/simplifying/foilmethod/pg2.htm
* http://www.sosmath.com/calculus/diff/der00/der00.html
* http://csrgxtu.github.io/2015/03/20/Writing-Mathematic-Fomulars-in-Markdown/
* https://en.wikipedia.org/wiki/Chain_rule#Higher_dimensions
* https://www.khanacademy.org/math/calculus-home/taking-derivatives-calc/chain-rule-calc/v/chain-rule-introduction
.. rubric:: References

* http://tutorial.math.lamar.edu/Classes/CalcI/ChainRule.aspx
.. [1] https://en.wikipedia.org/wiki/Derivative
.. [2] https://www.khanacademy.org/math/multivariable-calculus/multivariable-derivatives/partial-derivative-and-gradient-articles/a/directional-derivative-introduction
.. [3] https://en.wikipedia.org/wiki/Partial_derivative
.. [4] https://en.wikipedia.org/wiki/Gradient
.. [5] https://betterexplained.com/articles/vector-calculus-understanding-the-gradient
.. [6] https://www.mathsisfun.com/calculus/derivatives-introduction.html
.. [7] http://tutorial.math.lamar.edu/Classes/CalcI/DefnOfDerivative.aspx
.. [8] https://www.khanacademy.org/math/calculus-home/taking-derivatives-calc/chain-rule-calc/v/chain-rule-introduction
.. [9] http://tutorial.math.lamar.edu/Classes/CalcI/ChainRule.aspx
6 changes: 4 additions & 2 deletions docs/cnn.rst
@@ -4,8 +4,10 @@
CNNs
====

.. contents:: :local:

Intro to CNNs

.. toctree::
:maxdepth: 2
.. rubric:: References

.. [1] Example
7 changes: 7 additions & 0 deletions docs/contribute.rst
@@ -0,0 +1,7 @@
.. _contribute:

==========
Contribute
==========

Become a contributor! Check out our `github <http://github.com/bfortuner/ml-cheatsheet/>`_ for more information.
8 changes: 7 additions & 1 deletion docs/forwardpropagation.rst
@@ -4,6 +4,11 @@
Forwardpropagation
==================

.. contents:: :local:

Overview
========

.. image:: images/neural_network_simple.png
:align: center

@@ -33,7 +38,8 @@ Let’s write a method feed_forward() to propagate input data through our simple

.. :pyobject: MyClass #Target a specific class.function in a file
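A minimal ``feed_forward()`` for a network of this shape might look like the following (a sketch under assumed names — one hidden unit, sigmoid activations at both layers; the actual cheatsheet code may differ):

```python
import math

def sigmoid(z):
    # Squashes any real number into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def feed_forward(x, w_hidden, w_output):
    """Propagate a single input through one hidden unit and one output unit."""
    # Hidden layer: weighted input, then activation
    hidden = sigmoid(x * w_hidden)
    # Output layer: weighted hidden activation, then activation
    return sigmoid(hidden * w_output)

print(feed_forward(1.0, w_hidden=0.5, w_output=0.5))
```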
.. rubric:: References

- Reference1
.. [1] Example
8 changes: 5 additions & 3 deletions docs/gan.rst
@@ -4,9 +4,11 @@
GANs
====

Intro to Generative Adversarial Networks
.. contents:: :local:


.. toctree::
:maxdepth: 2
Intro to Generative Adversarial Networks

.. rubric:: References

.. [1] Example
