2022-03-17
---
English title:
Bayesian Inference with Posterior Regularization and applications to Infinite Latent SVMs
---
Authors:
Jun Zhu, Ning Chen, and Eric P. Xing
---
Latest submission year:
2014
---
Classification:

Primary category: Computer Science
Secondary category: Machine Learning
Description: Papers on all aspects of machine learning research (supervised, unsupervised, reinforcement learning, bandit problems, and so on) including also robustness, explanation, fairness, and methodology. cs.LG is also an appropriate primary category for applications of machine learning methods.
--
Primary category: Computer Science
Secondary category: Artificial Intelligence
Description: Covers all areas of AI except Vision, Robotics, Machine Learning, Multiagent Systems, and Computation and Language (Natural Language Processing), which have separate subject areas. In particular, includes Expert Systems, Theorem Proving (although this may overlap with Logic in Computer Science), Knowledge Representation, Planning, and Uncertainty in AI. Roughly includes material in ACM Subject Classes I.2.0, I.2.1, I.2.3, I.2.4, I.2.8, and I.2.11.
--
Primary category: Statistics
Secondary category: Methodology
Description: Design, Surveys, Model Selection, Multiple Testing, Multivariate Methods, Signal and Image Processing, Time Series, Smoothing, Spatial Statistics, Survival Analysis, Nonparametric and Semiparametric Methods
--
Primary category: Statistics
Secondary category: Machine Learning
Description: Covers machine learning papers (supervised, unsupervised, semi-supervised learning, graphical models, reinforcement learning, bandits, high dimensional inference, etc.) with a statistical or theoretical grounding
--

---
English abstract:
  Existing Bayesian models, especially nonparametric Bayesian methods, rely on specially conceived priors to incorporate domain knowledge for discovering improved latent representations. While priors can affect posterior distributions through Bayes' rule, imposing posterior regularization is arguably more direct and in some cases more natural and general. In this paper, we present regularized Bayesian inference (RegBayes), a novel computational framework that performs posterior inference with a regularization term on the desired post-data posterior distribution under an information theoretical formulation. RegBayes is more flexible than the procedure that elicits expert knowledge via priors, and it covers both directed Bayesian networks and undirected Markov networks whose Bayesian formulation results in hybrid chain graph models. When the regularization is induced from a linear operator on the posterior distributions, such as the expectation operator, we present a general convex-analysis theorem to characterize the solution of RegBayes. Furthermore, we present two concrete examples of RegBayes, infinite latent support vector machines (iLSVM) and multi-task infinite latent support vector machines (MT-iLSVM), which explore the large-margin idea in combination with a nonparametric Bayesian model for discovering predictive latent features for classification and multi-task learning, respectively. We present efficient inference methods and report empirical studies on several benchmark datasets, which appear to demonstrate the merits inherited from both large-margin learning and Bayesian nonparametrics. Such results were not available until now, and contribute to push forward the interface between these two important subfields, which have been largely treated as isolated in the community.
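
A rough sketch of the formulation behind the abstract (my own paraphrase, not the paper's exact statement; \mathcal{M} denotes the latent model variables and \mathcal{D} the observed data): RegBayes solves a variational problem of the form

\[
\inf_{q(\mathcal{M})}\ \mathrm{KL}\big(q(\mathcal{M}) \,\|\, p(\mathcal{M}\mid\mathcal{D})\big) \;+\; \Omega\big(q(\mathcal{M})\big),
\]

i.e., it looks for a post-data distribution q that stays close in KL divergence to the ordinary Bayes posterior while paying a convex regularization cost \Omega(q), typically induced by expectation constraints such as the large-margin constraints used in iLSVM and MT-iLSVM. When \Omega encodes a single constraint \mathbb{E}_q[f(\mathcal{M})] \le \epsilon, convex duality yields an exponentially tilted solution q^*(\mathcal{M}) \propto p(\mathcal{M}\mid\mathcal{D})\,\exp(-\mu^* f(\mathcal{M})) for some dual variable \mu^* \ge 0.

A toy numerical illustration of that last point in Python (not the authors' code; the finite model space, the feature f, and the threshold eps are invented for illustration), finding the dual variable by bisection:

import numpy as np

# Toy finite model space M in {0, ..., K-1}; all quantities are invented for illustration.
K = 5
rng = np.random.default_rng(0)
prior = np.full(K, 1.0 / K)                     # uniform prior pi(M)
loglik = rng.normal(size=K)                     # pretend log-likelihoods log p(D | M)
post = prior * np.exp(loglik - loglik.max())    # Bayes posterior p(M | D), unnormalized
post /= post.sum()

f = rng.normal(size=K)                          # feature whose posterior expectation we constrain
eps = 0.5 * (f.min() + float(f @ post))         # target strictly between min f and E_post[f]

def q_of(mu):
    # Exponentially tilted posterior q_mu(M), proportional to p(M|D) exp(-mu f(M)).
    logw = np.log(post) - mu * f
    logw -= logw.max()                          # shift for numerical stability
    w = np.exp(logw)
    return w / w.sum()

if f @ post <= eps:
    q = post                                    # constraint already satisfied: Bayes posterior is optimal
else:
    lo, hi = 0.0, 1.0
    while f @ q_of(hi) > eps:                   # grow the bracket until the constraint holds
        hi *= 2.0
    for _ in range(60):                         # bisection: E_{q_mu}[f] is monotone in mu
        mid = 0.5 * (lo + hi)
        if f @ q_of(mid) > eps:
            lo = mid
        else:
            hi = mid
    q = q_of(hi)

print("E_post[f] =", float(f @ post))
print("E_q[f]    =", float(f @ q), " target eps =", round(eps, 4))

Bisection suffices here because E_{q_mu}[f] decreases monotonically in mu (its derivative is -Var_{q_mu}(f)); the paper's iLSVM and MT-iLSVM involve many such constraints together with nonparametric priors, and develop dedicated inference methods for that setting.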
---
PDF link:
https://arxiv.org/pdf/1210.1766