Implementation of an accelerated version of the Blahut-Arimoto algorithm. A full description of the algorithm and its steps is provided in the attached notebook.
On each iteration we perform the following two calculations:
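The exact accelerated update rules are spelled out in the notebook; as a point of reference only, the standard (non-accelerated) Blahut-Arimoto iteration for a channel $W(y \mid x)$ and input PMF $p_t(x)$ alternates between

$$q_t(x \mid y) = \frac{p_t(x)\,W(y \mid x)}{\sum_{x'} p_t(x')\,W(y \mid x')},$$

$$p_{t+1}(x) = \frac{\exp\!\left(\sum_y W(y \mid x)\log q_t(x \mid y)\right)}{\sum_{x'} \exp\!\left(\sum_y W(y \mid x')\log q_t(x' \mid y)\right)}.$$

Accelerated variants typically keep the first step and modify the second, e.g. by raising the multiplicative update to an adaptive exponent; see the notebook for the version used here.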
The algorithm is implemented in Python as a trainer class with initialization, channel matrix construction, training, and visualization methods.
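Below is a minimal sketch of such a trainer class, assuming NumPy, a randomly generated channel matrix, and the standard (non-accelerated) updates; the class and method names are hypothetical and the actual class in this repository may differ.

```python
import numpy as np


class BlahutArimotoTrainer:
    """Iterates toward a capacity-achieving input PMF for a discrete memoryless channel."""

    def __init__(self, num_inputs, num_outputs, seed=0):
        self.rng = np.random.default_rng(seed)
        self.num_inputs = num_inputs
        self.num_outputs = num_outputs
        self.channel = self.build_channel_matrix()
        # Start from the uniform input PMF.
        self.p = np.full(num_inputs, 1.0 / num_inputs)

    def build_channel_matrix(self):
        """Builds a random row-stochastic channel matrix W[y|x] of shape (X, Y)."""
        w = self.rng.random((self.num_inputs, self.num_outputs))
        return w / w.sum(axis=1, keepdims=True)

    def train(self, num_iters=100):
        """Runs the standard Blahut-Arimoto updates, recording p_t at every iteration."""
        history = [self.p.copy()]
        for _ in range(num_iters):
            # q_t(x|y) ∝ p_t(x) W(y|x), normalized over x for each y
            joint = self.p[:, None] * self.channel
            q = joint / joint.sum(axis=0, keepdims=True)
            # p_{t+1}(x) ∝ exp(sum_y W(y|x) log q_t(x|y))
            log_p = np.sum(self.channel * np.log(q + 1e-300), axis=1)
            p_new = np.exp(log_p - log_p.max())
            self.p = p_new / p_new.sum()
            history.append(self.p.copy())
        return history
```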
I've decided to visualize the results by evaluating several metrics; a minimal sketch of how they can be computed follows the list.
- The mutual information induced by the current input PMF, as a function of iteration:
- The total variation distance between the input PMFs of two consecutive iterations:
- The KL divergence between the input PMFs of two consecutive iterations:
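A minimal sketch of these metric computations, assuming NumPy and the list of per-iteration input PMFs returned by the trainer sketch above (function names are hypothetical, not the repo's exact code):

```python
import numpy as np


def mutual_information(p, channel):
    """I(X;Y) in nats for input PMF p (shape X) and channel matrix W[y|x] (shape X, Y)."""
    joint = p[:, None] * channel              # joint distribution P(x, y)
    p_y = joint.sum(axis=0)                   # output marginal P(y)
    mask = joint > 0                          # avoid log(0) terms
    return np.sum(joint[mask] * np.log(joint[mask] / np.outer(p, p_y)[mask]))


def total_variation(p, q):
    """Total variation distance between two PMFs on the same alphabet."""
    return 0.5 * np.abs(p - q).sum()


def kl_divergence(p, q):
    """D(p || q) in nats; assumes q > 0 wherever p > 0."""
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))
```

Each metric can then be evaluated along the PMF history, e.g. `kl_divergence(history[t + 1], history[t])` for consecutive iterations, and plotted against the iteration index with matplotlib.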