Approximation of Kullback-Leibler Divergence between Two Gaussian Mixture Distributions
Abstract: There is no closed-form solution for the Kullback-Leibler (K-L) divergence between Gaussian mixture distributions, so it is usually approximated by an upper bound. For mixture distributions with the same number of Gaussian components, an upper bound on the K-L divergence is derived from the chain rule for relative entropy, and a method for computing a tighter upper bound is proposed. To compute the upper bound of the K-L divergence between mixture distributions with different numbers of Gaussian components, a method based on optimal duplication of Gaussian components is proposed. Experimental results on acoustic models of Chinese initials and finals show that the proposed method approximates the K-L divergence between mixtures with equal numbers of Gaussians more closely, and that it handles mixtures with different numbers of Gaussians effectively.
Keywords:
- Kullback-Leibler divergence (KLD)
- Gaussian mixture distribution (GMD)
- relative entropy
- upper bound of KLD
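For reference, the bounds described above rest on two standard facts; the notation below is assumed for illustration and is not taken from the paper. First, the K-L divergence between two $d$-dimensional Gaussians has the closed form

$$D\bigl(\mathcal{N}(\mu_0,\Sigma_0)\,\|\,\mathcal{N}(\mu_1,\Sigma_1)\bigr)=\tfrac12\Bigl[\operatorname{tr}\bigl(\Sigma_1^{-1}\Sigma_0\bigr)+(\mu_1-\mu_0)^{\top}\Sigma_1^{-1}(\mu_1-\mu_0)-d+\ln\tfrac{\det\Sigma_1}{\det\Sigma_0}\Bigr].$$

Second, for mixtures $f=\sum_{i=1}^{n}\pi_i f_i$ and $g=\sum_{i=1}^{n}\omega_i g_i$ with the same number of components, viewing each mixture as the marginal of a joint distribution over (component index, observation) and applying the chain rule for relative entropy gives, for any permutation $\sigma$ of the components,

$$D(f\,\|\,g)\le\sum_{i=1}^{n}\pi_i\ln\frac{\pi_i}{\omega_{\sigma(i)}}+\sum_{i=1}^{n}\pi_i\,D\bigl(f_i\,\|\,g_{\sigma(i)}\bigr).$$

Minimizing the right-hand side over $\sigma$ tightens the bound; whether this coincides with the paper's specific tightening cannot be determined from the abstract alone.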
Abstract: Since no closed-form expression is available for the Kullback-Leibler divergence (KLD) between two Gaussian mixture distributions (GMDs), an upper bound is used to approximate it. In this paper, an upper bound on the KLD between two GMDs with the same number of components is derived from the chain rule for relative entropy, and a tighter upper bound is then proposed. For the case in which the two GMDs have different numbers of components, a method named optimal Gaussian duplication (OGD) is proposed to approximate their KLD. Evaluation experiments are performed on acoustic models of Chinese initials and finals, modeled by GMD-based HMMs for speech recognition. The experimental results show that the tighter upper bound approximates the KLD more closely than the other methods, and that the proposed OGD method can effectively compute the upper bound of the KLD between two GMDs with different numbers of components.
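Below is a minimal Python sketch of the permutation-optimized chain-rule bound recapped above. It is a standard construction offered for illustration, not the paper's exact algorithm: neither the proposed tighter bound nor OGD is specified in the abstract. The helper names (gaussian_kld, chain_rule_upper_bound) and the use of SciPy's Hungarian-algorithm solver are assumptions.

```python
# Sketch: chain-rule upper bound on D(f||g) for two Gaussian mixtures with
# the same number of components, tightened by optimizing the one-to-one
# component matching with the Hungarian algorithm. Illustrative only; not
# the paper's exact algorithm.
import numpy as np
from scipy.optimize import linear_sum_assignment


def gaussian_kld(mu0, cov0, mu1, cov1):
    """Closed-form KLD between two multivariate Gaussians N0 and N1."""
    d = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    _, logdet0 = np.linalg.slogdet(cov0)
    _, logdet1 = np.linalg.slogdet(cov1)
    return 0.5 * (np.trace(cov1_inv @ cov0) + diff @ cov1_inv @ diff
                  - d + logdet1 - logdet0)


def chain_rule_upper_bound(pi, comps_f, omega, comps_g):
    """min over permutations s of  sum_i pi[i] * (log(pi[i] / omega[s(i)])
    + KLD(f_i || g_{s(i)}));  each comps_* entry is a (mean, cov) pair."""
    n = len(pi)
    cost = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            cost[i, j] = pi[i] * (np.log(pi[i] / omega[j])
                                  + gaussian_kld(*comps_f[i], *comps_g[j]))
    rows, cols = linear_sum_assignment(cost)  # optimal one-to-one matching
    return cost[rows, cols].sum()


# Example: two 2-component mixtures over R^1 (covariances are 1x1 matrices).
pi = [0.6, 0.4]
f = [(np.array([0.0]), np.eye(1)), (np.array([3.0]), np.eye(1))]
omega = [0.5, 0.5]
g = [(np.array([0.2]), np.eye(1)), (np.array([2.8]), 2.0 * np.eye(1))]
print(chain_rule_upper_bound(pi, f, omega, g))
```

Restricting the matching to permutations, rather than arbitrary assignments, is what keeps the result a valid upper bound: only a permutation rearranges the second mixture's components without changing the mixture itself, so the chain-rule argument still applies.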