Update 1.Background.rst
Sahar Niknam committed Feb 10, 2019
1 parent 6340c2e commit 02b97f0
Showing 1 changed file with 23 additions and 8 deletions.
31 changes: 23 additions & 8 deletions docs/1.Background.rst
.. image:: https://i.ebayimg.com/images/g/n9EAAOSwvc1ZaCei/s-l300.jpg


Statistical Dependence
----------------------
If a random variable, say X, can give any information about another random variable, say Y, we consider them dependent. Dependence between two random variables means that knowing the state of one changes the probabilities of the possible states of the other. In the same way, dependence can also be defined for probability distributions.

Detecting a possible dependence between two random variables is a difficult task. A more specific, and more difficult, task is to determine the strength of that dependence. There are two main categories of techniques for measuring statistical dependence between two random variables. The first category mainly deals with linear dependence and includes basic techniques such as the Pearson correlation and Spearman's measure. These techniques, however, do not perform well on nonlinear dependencies, which are more frequent in data. The second category includes more general techniques that also cover nonlinear dependencies.
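
As a quick illustration of that limitation, the sketch below (ours, not part of the original text, and assuming NumPy and SciPy are available) applies Pearson and Spearman to a perfectly deterministic but non-monotonic relation, Y = X²; both report values close to zero even though Y is completely determined by X:

.. code-block:: python

    import numpy as np
    from scipy.stats import pearsonr, spearmanr

    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, 1000)   # sample symmetric around zero
    y = x ** 2                     # Y is fully determined by X, but not linearly

    r, _ = pearsonr(x, y)          # close to 0: the linear measure misses the dependence
    rho, _ = spearmanr(x, y)       # also close to 0: the relation is not monotonic
    print(r, rho)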

Nonlinear Dependence Measure
^^^^^^^^^^^^^^^^^^^^^^^^^^^^

**Distance Correlation**

Distance correlation (dCor) is a nonlinear dependence measure that can handle random variables of arbitrary dimensions. Not surprisingly, dCor works with distances, specifically the Euclidean distance. Assume we have two random variables X and Y. The first step is to form their corresponding transformed matrices, TMx and TMy. Then we calculate the distance covariance:

.. image:: https://user-images.githubusercontent.com/27868570/51983505-1ad0d280-2499-11e9-9890-bfaf186753c3.png

The dCor value is a real number between 0 and 1 (inclusive), and 0 means that the two random variables are independent.
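
The following is a minimal NumPy sketch of this recipe, restricted to one-dimensional samples for brevity; the function name and the toy data are our own illustration rather than part of the original text:

.. code-block:: python

    import numpy as np

    def distance_correlation(x, y):
        """Sample distance correlation of two 1-D samples (1-D only, for brevity)."""
        x = np.asarray(x, dtype=float).reshape(-1, 1)
        y = np.asarray(y, dtype=float).reshape(-1, 1)

        # Pairwise Euclidean distance matrices (for 1-D data this is just |xi - xj|)
        a = np.abs(x - x.T)
        b = np.abs(y - y.T)

        # Double-centering: subtract row and column means, add back the grand mean
        A = a - a.mean(axis=0) - a.mean(axis=1, keepdims=True) + a.mean()
        B = b - b.mean(axis=0) - b.mean(axis=1, keepdims=True) + b.mean()

        # Squared distance covariance and distance variances
        dcov2 = max((A * B).mean(), 0.0)
        dvar2_x = (A * A).mean()
        dvar2_y = (B * B).mean()

        denom = np.sqrt(dvar2_x * dvar2_y)
        return np.sqrt(dcov2 / denom) if denom > 0 else 0.0

    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, 500)
    print(distance_correlation(x, x ** 2))                   # clearly > 0: detects the nonlinear dependence
    print(distance_correlation(x, rng.uniform(-1, 1, 500)))  # close to 0: (near) independence
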
**Mutual Information**

Mutual information (MI) is a measure of dependence based on the core concept of information theory: entropy. Entropy is a measure of uncertainty, and it is formulated as the average information content of a set of possible events, where information content is itself a measure of information.

Information content of event x with probability P(x):

.. image:: https://user-images.githubusercontent.com/27868570/52379671-6c65f800-2a6b-11e9-97b2-0dd7e05b510c.png
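
As a small worked example (a sketch of ours, assuming the usual logarithmic form of information content with base 2, so the result is in bits):

.. code-block:: python

    import math

    def information_content(p, base=2):
        """Surprisal of an event with probability p; base 2 gives bits."""
        return -math.log(p, base)

    print(information_content(0.5))    # 1.0 bit  (a fair coin toss)
    print(information_content(1 / 8))  # 3.0 bits (rarer events carry more information)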

Entropy of a sample set of N events with probabilities P1...PN:

.. image:: https://user-images.githubusercontent.com/27868570/52380294-87396c00-2a6d-11e9-8d82-acba394783db.png
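
A minimal sketch of this formula, again assuming base-2 logarithms (bits); the helper name is ours:

.. code-block:: python

    import math

    def entropy(probs, base=2):
        """Shannon entropy of probabilities P1...PN (zero probabilities contribute nothing)."""
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))   # 1.0 bit: maximal uncertainty over two outcomes
    print(entropy([0.9, 0.1]))   # ~0.47 bits: less uncertain
    print(entropy([1.0]))        # 0.0 bits: no uncertainty at all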

Mutual information between two random variables is defined as follows:

.. image:: https://user-images.githubusercontent.com/27868570/52519670-42752700-2c5f-11e9-97f6-7630757d8bff.png

Mutual information is a symmetric relation between two variables, and it indicates the amount of information that one random variable reveals about the other. In other words:

.. image:: https://user-images.githubusercontent.com/27868570/52527839-a0d9ee00-2ccf-11e9-9d48-e29b53a1f688.png
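
The sketch below (our own illustration, with NumPy as an assumed dependency) computes MI from a toy joint probability table using the equivalent identity MI(X;Y) = H(X) + H(Y) - H(X,Y), and checks that swapping the variables gives the same value:

.. code-block:: python

    import numpy as np

    def entropy_bits(p):
        """Shannon entropy in bits of a probability array; zero entries are skipped."""
        p = np.asarray(p, dtype=float).ravel()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def mutual_information(joint):
        """MI(X;Y) in bits from a joint probability table P(X, Y)."""
        px = joint.sum(axis=1)   # marginal distribution of X
        py = joint.sum(axis=0)   # marginal distribution of Y
        return entropy_bits(px) + entropy_bits(py) - entropy_bits(joint)

    # A toy joint distribution over two binary variables
    joint = np.array([[0.4, 0.1],
                      [0.1, 0.4]])

    print(mutual_information(joint))     # ~0.28 bits: X reveals some information about Y
    print(mutual_information(joint.T))   # the same value: MI is symmetric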

|
|
**Maximal Information Coefficient**

|
|
How does 'Statistical Dependence' help us understand artificial neural networks?
-------------------------------------------------------------------------------------
|
|