cross_entropy.m
function kl = cross_entropy(p, q, symmetric)
% CROSS_ENTROPY Compute the Kullback-Leibler divergence between two discrete probability distributions.
% kl = cross_entropy(p, q, symmetric)
%
% If symmetric = 1, we compute the symmetrized version, (KL(p||q) + KL(q||p))/2.
% Default: symmetric = 0.

if nargin < 3, symmetric = 0; end

tiny = exp(-700); % small constant to avoid log(0) and division by zero
p = p(:);
q = q(:);
if symmetric
  kl = (sum(p .* log((p+tiny)./(q+tiny))) + sum(q .* log((q+tiny)./(p+tiny))))/2;
else
  kl = sum(p .* log((p+tiny)./(q+tiny)));
end
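
A minimal usage sketch, assuming the function file is on the MATLAB path and that p and q are normalized distributions over the same support; the specific values below are illustrative only:

% Example: KL divergence between two illustrative discrete distributions.
p = [0.5 0.3 0.2];
q = [0.4 0.4 0.2];

kl_pq  = cross_entropy(p, q)      % asymmetric KL(p || q)
kl_sym = cross_entropy(p, q, 1)   % symmetrized: (KL(p||q) + KL(q||p)) / 2

Note that the asymmetric call is sensitive to argument order: cross_entropy(p, q) and cross_entropy(q, p) generally differ, while the symmetric form does not.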