Sahar Niknam committed Dec 12, 2018

Entropy and mutual information
------------------------------
The entropy of a piece of data quantifies the uncertainty about its content; equivalently, it is the minimum average number of binary bits required to encode that piece of data,

|
|
|
.. image:: https://latex.codecogs.com/gif.latex?\dpi{150}&space;\text{Entropy}&space;=&space;H(X)&space;=&space;-\sum_{x\in&space;X}p(x)log(p(x))&space;\\
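
As a quick sanity check, the definition above can be computed directly. A minimal Python sketch (assuming base-2 logarithms, so entropy is measured in bits; the formula above leaves the base unspecified):

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_x p(x) * log2(p(x)) of a distribution,
    given as a list of probabilities. Zero-probability outcomes contribute 0."""
    return -sum(px * math.log2(px) for px in p if px > 0)

print(entropy([0.5, 0.5]))  # a fair coin: exactly 1 bit of uncertainty
print(entropy([0.9, 0.1]))  # a biased coin is more predictable: ~0.47 bits
```

The uniform distribution maximizes entropy, matching the intuition that it is the hardest to predict and the most expensive to encode.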

|
|
.. image:: https://latex.codecogs.com/gif.latex?\dpi{150}&space;\text{Conditional&space;Entropy}&space;=&space;H(Y|X)&space;=&space;\sum_{x\in&space;X}p(x)H(Y|X=x)&space;\\&space;\text{&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;}&space;=&space;-\sum_{x\in&space;X}p(x)\sum_{y\in&space;Y}p(y|x)log(p(y|x))&space;\\&space;\text{&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;}&space;=&space;-\sum_{x\in&space;X}\sum_{y\in&space;Y}p(x,y)log(p(y|x))&space;\\

.. image:: https://images.mysafetysign.com/img/lg/K/Slow-Construction-Area-Sign-K-5798.gif

|
|
|
.. image:: https://latex.codecogs.com/gif.latex?\dpi{150}&space;\text{Joint&space;Entropy}&space;=&space;H(X,Y)&space;=&space;-\sum_{x\in&space;X}\sum_{y\in&space;Y}p(x,y)log(p(x,y))&space;\\&space;\text{\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;}&space;=&space;H(X)&space;+&space;H(Y|X)
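
The chain rule in the second line can be verified numerically. A small Python sketch, where the joint distribution ``pxy`` is purely illustrative (a made-up example, not from the text):

```python
import math

def H(dist):
    """Entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# A made-up joint distribution p(x, y) over a binary X and a binary Y:
pxy = {('a', 0): 0.4, ('a', 1): 0.1, ('b', 0): 0.2, ('b', 1): 0.3}
px = {'a': 0.5, 'b': 0.5}  # marginal distribution of X

# H(Y|X) computed from its definition: sum_x p(x) * H(Y | X = x)
HY_given_X = sum(
    px[x] * H({y: pxy[(x, y)] / px[x] for y in (0, 1)}) for x in ('a', 'b')
)

# Chain rule: H(X,Y) = H(X) + H(Y|X)
print(abs(H(pxy) - (H(px) + HY_given_X)) < 1e-12)  # True
```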

|
|
.. image:: https://latex.codecogs.com/gif.latex?\dpi{150}&space;\text{Mutual&space;Information}&space;=&space;I(X;Y)&space;=&space;\sum_{x\in&space;X}\sum_{y\in&space;Y}p(x,y)log(\frac{p(x,y)}{p(x)p(y)})&space;\\&space;\text{\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;}&space;=&space;H(X)&space;-&space;H(X|Y)\\&space;\text{\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;\&space;}&space;=&space;H(Y)&space;-&space;H(Y|X)
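
Both identities for the mutual information can be checked the same way. A sketch reusing a made-up joint distribution (``pxy`` and its marginals are illustrative assumptions, not from the text):

```python
import math

def H(dist):
    """Entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# A made-up joint distribution p(x, y) and its marginals:
pxy = {('a', 0): 0.4, ('a', 1): 0.1, ('b', 0): 0.2, ('b', 1): 0.3}
px = {'a': 0.5, 'b': 0.5}
py = {0: 0.6, 1: 0.4}

# Mutual information straight from the definition:
I = sum(p * math.log2(p / (px[x] * py[y])) for (x, y), p in pxy.items())

# The two identities, using the chain rule H(X|Y) = H(X,Y) - H(Y) and vice versa:
print(abs(I - (H(px) - (H(pxy) - H(py)))) < 1e-12)  # True: I = H(X) - H(X|Y)
print(abs(I - (H(py) - (H(pxy) - H(px)))) < 1e-12)  # True: I = H(Y) - H(Y|X)
```

Note that the sum is non-negative: mutual information measures how much knowing one variable reduces the uncertainty about the other, and it vanishes exactly when X and Y are independent.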

|
|
Entropy for understanding artificial neural networks
----------------------------------------------------
.. [#] And provided that the nodes’ activation functions are nonlinear.
.. [#] Both in an abstract and also a physical sense.
.. [#] Xu, K., Ba, J., Kiros, R., Cho, K., Courville, A., Salakhudinov, R., ... & Bengio, Y. (2015, June). Show, attend and tell: Neural image caption generation with visual attention. In *International conference on machine learning* (pp. 2048-2057).
.. [#] Compare with the fact that you can use, say, a sigmoid neuron almost anywhere in a network, without being sure of what you are doing!
