Translated abstract:
We present and analyse three online learning algorithms for discrete Hidden Markov Models (HMMs) and compare them with the Baldi-Chauvin algorithm. Using the Kullback-Leibler divergence as a measure of generalisation error, we draw learning curves in simplified situations. The performance of one of the algorithms when learning drifting concepts is analysed and compared with that of the Baldi-Chauvin algorithm in the same situations. Based on our results, we also briefly discuss learning and symmetry breaking.
---
English title:
Online Learning in Discrete Hidden Markov Models
---
Authors:
Roberto C. Alamino, Nestor Caticha
---
Latest submission year:
2007
---
Classification:
Primary category: Statistics
Secondary category: Machine Learning
Category description: Covers machine learning papers (supervised, unsupervised, semi-supervised learning, graphical models, reinforcement learning, bandits, high dimensional inference, etc.) with a statistical or theoretical grounding.
---
English abstract:
We present and analyse three online algorithms for learning in discrete Hidden Markov Models (HMMs) and compare them with the Baldi-Chauvin algorithm. Using the Kullback-Leibler divergence as a measure of generalisation error, we draw learning curves in simplified situations. The performance of one of the presented algorithms when learning drifting concepts is analysed and compared with that of the Baldi-Chauvin algorithm in the same situations. A brief discussion of learning and symmetry breaking, based on our results, is also presented.
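The abstract does not spell out how the Kullback-Leibler divergence is applied; a common reading in teacher-student analyses of online learning, assumed here, is that the generalisation error is the divergence between the distribution $P$ over observation sequences generated by the true (teacher) HMM and the distribution $Q$ generated by the learned (student) HMM:

$$ D_{\mathrm{KL}}\!\left(P \,\middle\|\, Q\right) \;=\; \sum_{\mathbf{y}} P(\mathbf{y}) \,\ln\frac{P(\mathbf{y})}{Q(\mathbf{y})}, $$

where the sum runs over observation sequences $\mathbf{y}$ of a fixed length. The divergence is non-negative and vanishes exactly when the student reproduces the teacher's sequence distribution.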
---
PDF link:
https://arxiv.org/pdf/0708.2377