
Kullback Leibler Distance (or divergence)

2010-01-07 15:35
K-L Distance (Relative Entropy)

The K-L distance measures the dissimilarity between two fully specified probability distributions.

1. Definition:

Let p1(x) and p2(x) be two continuous probability densities. By definition, the K-L distance D(p1, p2) between p1(x) and p2(x) is:

D(p1, p2) = ∫ p1(x) log( p1(x) / p2(x) ) dx

The discrete case is analogous.

If E1 denotes expectation under the distribution p1, the expression above can also be written as:

D(p1, p2) = E1[ log( p1(x) / p2(x) ) ]
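The discrete case mentioned above can be sketched directly. A minimal Python sketch (the function name `kl_divergence` is my own, not from the original article):

```python
import math

def kl_divergence(p, q):
    """Discrete K-L distance: sum_i p_i * log(p_i / q_i), in nats.

    By the usual convention 0 * log 0 = 0, terms with p_i == 0
    contribute nothing; a q_i == 0 where p_i > 0 makes it infinite.
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0:
            continue
        if qi == 0:
            return math.inf
        total += pi * math.log(pi / qi)
    return total

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, p))  # 0.0 -- identical distributions
print(kl_divergence(p, q))  # positive -- p and q differ
```

Note that the result is in nats because the natural logarithm is used; dividing by log 2 (or using log2) gives the value in bits.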
2. Basic properties:

- D(p1, p2) is the average of log( p1(x) / p2(x) ), where p1(x) is the reference distribution.
- The K-L distance is always non-negative. It is zero exactly when the two distributions are identical.
- The K-L distance is not symmetric in p1(x) and p2(x), and it is not a distance in the mathematical sense.

What we often see instead is the symmetrized form of the K-L distance between p1 and p2:

D_sym(p1, p2) = D(p1, p2) + D(p2, p1)
The K-L distance is usually hard to compute exactly, but when both distributions are normal the calculation can be carried out in closed form. The derivation is somewhat long but quite instructive.
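For the univariate normal case, the closed form is well known: D(N(μ1, σ1²), N(μ2, σ2²)) = log(σ2/σ1) + (σ1² + (μ1 − μ2)²) / (2σ2²) − 1/2. A minimal sketch (the function name is my own):

```python
import math

def kl_normal(mu1, sigma1, mu2, sigma2):
    """Closed-form K-L distance D(N1, N2) between two univariate
    normal distributions, in nats."""
    return (math.log(sigma2 / sigma1)
            + (sigma1 ** 2 + (mu1 - mu2) ** 2) / (2 * sigma2 ** 2)
            - 0.5)

print(kl_normal(0.0, 1.0, 0.0, 1.0))  # 0.0 -- identical normals
print(kl_normal(0.0, 1.0, 1.0, 2.0))  # positive -- distinct normals
```

Swapping the two argument pairs generally changes the result, which illustrates the asymmetry noted above.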

Kullback Leibler Distance (KL)

The Kullback Leibler distance (KL-distance) is a natural distance function from a "true" probability distribution, p, to a "target" probability distribution, q. It can be interpreted as the expected extra message-length per datum due to using a code based on the wrong (target) distribution compared to using a code based on the true distribution.

For discrete (not necessarily finite) probability distributions, p = {p1, ..., pn} and q = {q1, ..., qn}, the KL-distance is defined to be:

KL(p, q) = Σi pi log2( pi / qi )

For continuous probability densities, the sum is replaced by an integral.

KL(p, p) = 0
KL(p, q) ≥ 0

Note that the KL-distance is not, in general, symmetric.
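The asymmetry is easy to see numerically. A small sketch in bits, matching the log2 definition above (the function name is my own):

```python
import math

def kl_bits(p, q):
    """KL(p, q) = sum_i p_i * log2(p_i / q_i): the expected extra bits
    per datum when coding samples from p with a code optimized for q."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.75, 0.25]
print(kl_bits(p, q))  # about 0.2075 bits
print(kl_bits(q, p))  # about 0.1887 bits -- a different value
```

Since KL(p, q) ≠ KL(q, p) here, this pair of distributions confirms that the KL-distance is not symmetric.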
From:
http://www.aiaccess.net/English/Glossaries/GlosMod/e_gm_kullbak.htm#tut_1_basic

http://www.csse.monash.edu.au/~lloyd/tildeMML/KL/