Translated abstract:
In machine learning, a specific type of neural network, the Restricted Boltzmann Machine (RBM), is implemented for classification and feature detection. The RBM is characterized by separate layers of visible and hidden units, which can efficiently learn a generative model of the observed data. We study a "hybrid" version of the RBM, in which the hidden units are analog and the visible units are binary, and we show that the thermodynamics of the visible units are equivalent to those of a Hopfield network, in which the N visible units are the neurons and the P hidden units are the learned patterns. We apply the method of stochastic stability to derive the thermodynamics of the model, and consider a formal extension of this technique to the case of multiple sets of stored patterns, which may serve as a benchmark for the study of correlated sets. Our results imply that simulating the dynamics of a Hopfield network, which requires updating N neurons and storing N(N-1)/2 synapses, can be accomplished by a hybrid Boltzmann Machine requiring the update of N+P neurons and the storage of only NP synapses. Moreover, the well-known glass transition of the Hopfield network has a counterpart in the Boltzmann Machine: it corresponds to an optimal criterion for choosing the relative sizes of the hidden and visible layers, resolving the trade-off between the flexibility and the generality of the model. The low-storage phase of the Hopfield model corresponds to few hidden units and hence an overly constrained RBM, while the spin-glass phase (too many hidden units) corresponds to an unconstrained RBM prone to overfitting the observed data.
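The stated equivalence rests on integrating out the analog hidden units. A minimal sketch of that marginalization, assuming binary visible units \(\sigma_i \in \{-1,+1\}\), unit-variance Gaussian hidden units \(z_\mu\), weights \(\xi_i^{\mu}\) and inverse temperature \(\beta\) (the exact normalization used in the paper may differ):

\[
P(\sigma) \propto \int \prod_{\mu=1}^{P} \mathrm{d}z_\mu \,
\exp\!\Big(-\tfrac{1}{2}\sum_{\mu} z_\mu^{2}
+ \tfrac{\beta}{\sqrt{N}}\sum_{i,\mu} \xi_i^{\mu}\sigma_i z_\mu\Big)
\propto \exp\!\Big(\tfrac{\beta^{2}}{2N}\sum_{\mu}\Big(\sum_{i}\xi_i^{\mu}\sigma_i\Big)^{2}\Big),
\]

which is the Boltzmann weight of a Hopfield network with Hebbian couplings \(J_{ij} = \frac{1}{N}\sum_{\mu}\xi_i^{\mu}\xi_j^{\mu}\) at effective inverse temperature \(\beta^{2}\), up to a \(\sigma\)-independent constant from the \(i=j\) terms.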
---
English title:
《On the equivalence of Hopfield Networks and Boltzmann Machines》
---
Authors:
Adriano Barra, Alberto Bernacchia, Enrica Santucci, Pierluigi Contucci
---
Latest submission year:
2012
---
Classification:

Primary category: Physics
Subcategory: Disordered Systems and Neural Networks
Category description: Glasses and spin glasses; properties of random, aperiodic and quasiperiodic systems; transport in disordered media; localization; phenomena mediated by defects and disorder; neural networks
--
Primary category: Computer Science
Subcategory: Artificial Intelligence
Category description: Covers all areas of AI except Vision, Robotics, Machine Learning, Multiagent Systems, and Computation and Language (Natural Language Processing), which have separate subject areas. In particular, includes Expert Systems, Theorem Proving (although this may overlap with Logic in Computer Science), Knowledge Representation, Planning, and Uncertainty in AI. Roughly includes material in ACM Subject Classes I.2.0, I.2.1, I.2.3, I.2.4, I.2.8, and I.2.11.
--

---
English abstract:
A specific type of neural network, the Restricted Boltzmann Machine (RBM), is implemented for classification and feature detection in machine learning. RBM is characterized by separate layers of visible and hidden units, which are able to learn efficiently a generative model of the observed data. We study a "hybrid" version of RBMs, in which hidden units are analog and visible units are binary, and we show that thermodynamics of visible units are equivalent to those of a Hopfield network, in which the N visible units are the neurons and the P hidden units are the learned patterns. We apply the method of stochastic stability to derive the thermodynamics of the model, by considering a formal extension of this technique to the case of multiple sets of stored patterns, which may act as a benchmark for the study of correlated sets. Our results imply that simulating the dynamics of a Hopfield network, requiring the update of N neurons and the storage of N(N-1)/2 synapses, can be accomplished by a hybrid Boltzmann Machine, requiring the update of N+P neurons but the storage of only NP synapses. In addition, the well known glass transition of the Hopfield network has a counterpart in the Boltzmann Machine: It corresponds to an optimum criterion for selecting the relative sizes of the hidden and visible layers, resolving the trade-off between flexibility and generality of the model. The low storage phase of the Hopfield model corresponds to few hidden units and hence an overly constrained RBM, while the spin-glass phase (too many hidden units) corresponds to an unconstrained RBM prone to overfitting of the observed data.
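As a numerical illustration of the equivalence and of the quoted resource counts, here is a short, self-contained Python/NumPy sketch. The layer sizes, the inverse temperature beta, and the unit-variance Gaussian convention for the hidden units are illustrative assumptions, not values taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 20          # visible units (neurons) and hidden units (patterns); illustrative sizes
beta = 0.8              # inverse temperature of the hybrid machine; illustrative value

# Random binary patterns; in the RBM picture these are the visible-hidden weights xi_i^mu.
xi = rng.choice([-1.0, 1.0], size=(P, N))

# Hebbian Hopfield couplings J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, with zero diagonal.
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

def hopfield_log_weight(sigma):
    """Log Boltzmann weight of the Hopfield network at effective inverse temperature beta**2."""
    return 0.5 * beta**2 * sigma @ J @ sigma

def rbm_marginal_log_weight(sigma):
    """Log of the visible-unit marginal of the hybrid Boltzmann Machine, obtained by the
    Gaussian integral over the analog hidden units:
    (beta**2 / 2N) * sum_mu (sum_i xi_i^mu sigma_i)**2, up to a sigma-independent constant."""
    return 0.5 * beta**2 * np.sum((xi @ sigma) ** 2) / N

# The two log-weights should differ by the same constant (beta**2 * P / 2, coming from the
# self-interaction terms dropped in J) for every visible configuration; that constant cancels
# after normalization, which is the claimed thermodynamic equivalence.
for _ in range(5):
    sigma = rng.choice([-1.0, 1.0], size=N)
    print(rbm_marginal_log_weight(sigma) - hopfield_log_weight(sigma))

# Resource comparison quoted in the abstract:
print("Hopfield synapses  N(N-1)/2 =", N * (N - 1) // 2)
print("Hybrid BM synapses       NP =", N * P)
```

Running it prints the same gap for every random configuration, i.e. the two distributions coincide once normalized, while the last two lines contrast the NP weights of the hybrid machine with the N(N-1)/2 couplings of the Hopfield network.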
---
PDF link:
https://arxiv.org/pdf/1105.2790