Abstract (translated):
Intractable distributions are a common difficulty in inference within the probabilistic knowledge representation framework, and variational methods have recently been widely used to provide approximate solutions. This article describes a perturbational approach in the form of a cumulant expansion which, at lowest order, recovers the standard Kullback-Leibler variational bound. Higher-order terms describe corrections to the variational approach without incurring much additional computational cost. The relationship to other perturbational approaches such as TAP is also clarified. We demonstrate the method on a particular class of undirected graphical models, Boltzmann machines, and our simulation results confirm improved accuracy and enhanced stability during learning.
---
English title:
Variational Cumulant Expansions for Intractable Distributions
---
Authors:
D. Barber, P. van de Laar
---
Latest submission year:
2011
---
Classification:
Primary category: Computer Science
Secondary category: Artificial Intelligence
Category description: Covers all areas of AI except Vision, Robotics, Machine Learning, Multiagent Systems, and Computation and Language (Natural Language Processing), which have separate subject areas. In particular, includes Expert Systems, Theorem Proving (although this may overlap with Logic in Computer Science), Knowledge Representation, Planning, and Uncertainty in AI. Roughly includes material in ACM Subject Classes I.2.0, I.2.1, I.2.3, I.2.4, I.2.8, and I.2.11.
---
English abstract:
Intractable distributions present a common difficulty in inference within the probabilistic knowledge representation framework and variational methods have recently been popular in providing an approximate solution. In this article, we describe a perturbational approach in the form of a cumulant expansion which, to lowest order, recovers the standard Kullback-Leibler variational bound. Higher-order terms describe corrections on the variational approach without incurring much further computational cost. The relationship to other perturbational approaches such as TAP is also elucidated. We demonstrate the method on a particular class of undirected graphical models, Boltzmann machines, for which our simulation results confirm improved accuracy and enhanced stability during learning.
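As a point of reference for the lowest-order claim in the abstract, here is a minimal sketch of the standard identities involved, assuming the usual setting of a target distribution p(x) = e^{-E(x)}/Z and a tractable approximating distribution q(x). The notation (E, q, V, c_n) is ours; the paper's precise expansion parameter and truncation scheme may differ.

```latex
% Sketch (our notation): KL variational bound and its cumulant-expansion view.
% Target: p(x) = e^{-E(x)}/Z with intractable Z; tractable approximation q(x).
\begin{align}
\ln Z &= \ln \sum_x q(x)\, \frac{e^{-E(x)}}{q(x)}
       = \ln \big\langle e^{V(x)} \big\rangle_q ,
  \qquad V(x) \equiv -E(x) - \ln q(x) , \\
\ln Z &\ge \langle V \rangle_q
       = -\langle E \rangle_q + H(q)
  \qquad \text{(Jensen: the standard Kullback--Leibler bound)} , \\
\ln \big\langle e^{V} \big\rangle_q
      &= \sum_{n \ge 1} \frac{c_n}{n!} ,
  \qquad c_1 = \langle V \rangle_q , \quad
         c_2 = \langle V^2 \rangle_q - \langle V \rangle_q^2 , \quad \dots
\end{align}
% The first cumulant c_1 reproduces the KL bound; the higher cumulants supply the
% corrections the abstract refers to (a formal series, convergence not guaranteed).
```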
---
PDF link:
https://arxiv.org/pdf/1105.5455
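
The abstract reports a demonstration on Boltzmann machines. The following is a minimal, hypothetical numerical sketch (not the authors' code): it compares the exact log partition function of a tiny Boltzmann machine against the first-order (KL) bound and a second-order cumulant correction under a factorised mean-field q. The energy convention E(s) = -1/2 s^T W s - b^T s with binary units s_i in {0,1}, the mean-field fixed-point update, and all variable names are our own assumptions; the paper's precise expansion and parameterisation may differ.

```python
import itertools
import numpy as np


def exact_log_z(W, b):
    """Exact ln Z by enumerating all binary states (only feasible for small n)."""
    n = len(b)
    logs = [0.5 * np.array(s) @ W @ np.array(s) + b @ np.array(s)   # -E(s)
            for s in itertools.product([0, 1], repeat=n)]
    return np.logaddexp.reduce(np.array(logs))


def mean_field_marginals(W, b, iters=50):
    """Fixed-point updates m_i = sigma(sum_j W_ij m_j + b_i) for a factorised q."""
    m = np.full(len(b), 0.5)
    for _ in range(iters):
        m = 1.0 / (1.0 + np.exp(-(W @ m + b)))
    return m


def cumulant_estimates(W, b, m):
    """First- and second-order cumulant estimates of ln Z, with exact expectations under q."""
    n = len(b)
    V, Q = [], []
    for state in itertools.product([0, 1], repeat=n):
        s = np.array(state)
        log_q = np.sum(s * np.log(m) + (1 - s) * np.log(1 - m))
        V.append(0.5 * s @ W @ s + b @ s - log_q)   # V(s) = -E(s) - ln q(s)
        Q.append(np.exp(log_q))
    V, Q = np.array(V), np.array(Q)
    c1 = Q @ V                    # first cumulant: the standard KL variational bound
    c2 = Q @ V**2 - c1**2         # second cumulant: variance of V under q
    return c1, c1 + 0.5 * c2


rng = np.random.default_rng(0)
n = 8
W = rng.normal(scale=0.3, size=(n, n))
W = np.triu(W, 1)
W = W + W.T                       # symmetric couplings, zero diagonal
b = rng.normal(scale=0.3, size=n)

m = mean_field_marginals(W, b)
kl_bound, second_order = cumulant_estimates(W, b, m)
print("exact ln Z          :", exact_log_z(W, b))
print("KL bound (c1)       :", kl_bound)
print("second order (c1+c2/2):", second_order)
```

Since c2 is a variance and hence non-negative, the second-order estimate always lies above the first-order bound; how close either comes to the true ln Z depends on the couplings, which is the kind of comparison the abstract's simulations address.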