Title:
《Designing neural networks that process mean values of random variables》
---
Authors:
Michael J. Barber and John W. Clark
---
Year of latest submission:
2010
---
Classification:
Primary category: Physics
Secondary category: Disordered Systems and Neural Networks
Description: Glasses and spin glasses; properties of random, aperiodic and quasiperiodic systems; transport in disordered media; localization; phenomena mediated by defects and disorder; neural networks
--
Primary category: Computer Science
Secondary category: Artificial Intelligence
Description: Covers all areas of AI except Vision, Robotics, Machine Learning, Multiagent Systems, and Computation and Language (Natural Language Processing), which have separate subject areas. In particular, includes Expert Systems, Theorem Proving (although this may overlap with Logic in Computer Science), Knowledge Representation, Planning, and Uncertainty in AI. Roughly includes material in ACM Subject Classes I.2.0, I.2.1, I.2.3, I.2.4, I.2.8, and I.2.11.
--
Primary category: Computer Science
Secondary category: Machine Learning
Description: Papers on all aspects of machine learning research (supervised, unsupervised, reinforcement learning, bandit problems, and so on) including also robustness, explanation, fairness, and methodology. cs.LG is also an appropriate primary category for applications of machine learning methods.
--
---
Abstract:
We introduce a class of neural networks derived from probabilistic models in the form of Bayesian networks. By imposing additional assumptions about the nature of the probabilistic models represented in the networks, we derive neural networks with standard dynamics that require no training to determine the synaptic weights, that perform accurate calculation of the mean values of the random variables, that can pool multiple sources of evidence, and that deal cleanly and consistently with inconsistent or contradictory evidence. The presented neural networks capture many properties of Bayesian networks, providing distributed versions of probabilistic models.
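As an illustrative sketch (not code from the paper), the central claim — that means of random variables can be propagated through synaptic weights read directly off a Bayesian network's conditional probability tables, with no training — can be seen in a minimal two-node network A → B with binary variables. The CPT values below are assumptions chosen for the example; E[B] is then an affine function of E[A], so a single weight and bias computed from the CPT suffice:

```python
# Toy Bayesian network A -> B, both variables binary (0/1).
# Hypothetical conditional probability table for B given A:
p_b_given_a1 = 0.9   # P(B=1 | A=1)
p_b_given_a0 = 0.2   # P(B=1 | A=0)

# Marginalizing over A gives
#   E[B] = P(B=1|A=1)*E[A] + P(B=1|A=0)*(1 - E[A])
#        = P(B=1|A=0) + (P(B=1|A=1) - P(B=1|A=0)) * E[A],
# i.e. a "synaptic weight" and bias determined by the CPT alone,
# with no training required.
bias = p_b_given_a0
weight = p_b_given_a1 - p_b_given_a0

def mean_b(mean_a):
    """Mean of B propagated linearly from the mean of A."""
    return bias + weight * mean_a

# Cross-check against direct marginalization for E[A] = 0.7:
mean_a = 0.7
direct = p_b_given_a1 * mean_a + p_b_given_a0 * (1 - mean_a)
print(mean_b(mean_a))                        # 0.69
print(abs(mean_b(mean_a) - direct) < 1e-12)  # True
```

In the class of networks the paper describes, this idea generalizes to larger Bayesian networks, where the derived neural dynamics compute such means exactly rather than approximately.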
---
PDF link:
https://arxiv.org/pdf/1004.5326