---
layout: definition
mathjax: true

author: "Joram Soch"
affiliation: "BCCN Berlin"
e_mail: "joram.soch@bccn-berlin.de"
date: 2020-07-27 19:51:00 -0700

title: "Cross-entropy"
chapter: "General Theorems"
section: "Information theory"
topic: "Shannon entropy"
definition: "Cross-entropy"

sources:
- authors: "Wikipedia"
  year: 2020
  title: "Cross entropy"
  in: "Wikipedia, the free encyclopedia"
  pages: "retrieved on 2020-07-28"
  url: "https://en.wikipedia.org/wiki/Cross_entropy"

def_id: "D85"
shortcut: "ent-cross"
username: "JoramSoch"
---

**Definition:** Let $X$ be a discrete random variable with possible outcomes $\mathcal{X}$ and let $P$ and $Q$ be two probability distributions on $X$ with the probability mass functions $p(x)$ and $q(x)$. Then, the cross-entropy of $Q$ relative to $P$ is defined as

$$ \label{eq:ent-cross} \mathrm{H}(P,Q) = - \sum_{x \in \mathcal{X}} p(x) \cdot \log_b q(x) $$

where $b$ is the base of the logarithm, which specifies the unit in which the cross-entropy is measured.
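
As a minimal numerical sketch (not part of the original definition), the following Python snippet evaluates eq. \eqref{eq:ent-cross} for two example PMFs over three outcomes, using base $b = 2$ so that the result is in bits; the distributions `p` and `q` are illustrative assumptions:

```python
import numpy as np

# Example PMFs over three possible outcomes (illustrative values)
p = np.array([0.50, 0.25, 0.25])  # PMF of P, p(x)
q = np.array([0.40, 0.40, 0.20])  # PMF of Q, q(x)

# Cross-entropy H(P,Q) = - sum_x p(x) * log_2 q(x), in bits
H_PQ = -np.sum(p * np.log2(q))
print(H_PQ)  # ≈ 1.57 bits
```

Note that with $q(x) = p(x)$, the same expression reduces to the Shannon entropy $\mathrm{H}(P)$.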