Markdown source:

$$
相对熵 (relative entropy), also known as KL divergence, measures the difference between two distributions.\
Suppose the true model is p(x) and the model we have solved for is q(x);\
then the relative entropy between p(x) and q(x) can serve as the loss function:\
D_{KL}(p||q) = \sum\limits_{x}^{}p(x)log_b\enspace p(x)-\sum\limits_{x}^{}p(x)log_b\enspace q(x)\
Since the term \sum\limits_{x}^{}p(x)log_b\enspace p(x) is a constant, we only need to minimize -\sum\limits_{x}^{}p(x)log_b\enspace q(x) to obtain the optimal model.
$$
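The quoted derivation can be checked numerically. A minimal sketch, assuming two small illustrative discrete distributions (the values of `p` and `q` are made up for the example, not taken from the issue):

```python
import math

# Two example discrete distributions over the same three outcomes.
p = [0.5, 0.3, 0.2]   # "true" model p(x)
q = [0.4, 0.4, 0.2]   # fitted model q(x)

# KL divergence: D_KL(p || q) = sum_x p(x) * log(p(x) / q(x))
kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Cross-entropy term -sum_x p(x) log q(x) and the constant entropy of p.
cross_entropy = -sum(pi * math.log(qi) for pi, qi in zip(p, q))
entropy = -sum(pi * math.log(pi) for pi in p)

# D_KL = cross-entropy - entropy, so with p fixed, minimizing the
# cross-entropy alone is enough to minimize the KL divergence.
print(kl, cross_entropy - entropy)
```

The printed values coincide, which is why the derivation only needs to minimize the `-Σ p(x) log q(x)` term.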
Please paste the page URL.
https://kureisersen.github.io/post/%E6%9C%BA%E5%99%A8%E5%AD%A6%E4%B9%A0/%E8%A5%BF%E7%93%9C%E4%B9%A6/%E7%BA%BF%E6%80%A7%E6%A8%A1%E5%9E%8B/%E5%AF%B9%E6%95%B0%E5%87%A0%E7%8E%87%E5%9B%9E%E5%BD%92/
The main problem is that all of the content is treated as a math formula. In principle, `$$` should wrap only the formula itself, not the surrounding text.
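As an illustration of the fix, the prose can be moved outside the delimiters so that `$$` wraps only the equation. This is a sketch of one possible rewrite of the quoted source, not the site's actual content:

```markdown
Relative entropy, also known as KL divergence, measures the difference
between two distributions. Suppose the true model is $p(x)$ and the model
we have solved for is $q(x)$; then the relative entropy between them can
serve as the loss function:

$$
D_{KL}(p\|q) = \sum_{x} p(x)\log_b p(x) - \sum_{x} p(x)\log_b q(x)
$$

Since the first sum is a constant, minimizing $-\sum_{x} p(x)\log_b q(x)$
yields the optimal model.
```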