Abstract:
This paper presents several novel generalization bounds for the problem of learning kernels, based on an analysis of the Rademacher complexity of the corresponding hypothesis sets. The bound for learning kernels with a convex combination of p base kernels has only a log(p) dependency on the number of kernels p, which is considerably more favorable than the previous best bound for the same problem. A further novel bound is given for learning with a linear combination of p base kernels under L_2 regularization, whose dependency on p is only p^{1/4}.
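To make the two dependencies concrete, here is a brief LaTeX sketch in our own notation, not taken verbatim from the paper: m is the sample size, Λ bounds the norm of the hypothesis in the RKHS of the combined kernel K, and R^2 = max_k sup_x K_k(x,x) uniformly bounds the kernel diagonals. The hypothesis set for a convex combination of p base kernels, and the order of the empirical Rademacher complexity bounds behind the abstract's two claims, read:

\[
H_1 \;=\; \Big\{\, x \mapsto \langle w, \Phi_K(x) \rangle \;:\;
K = \sum_{k=1}^{p} \mu_k K_k,\;\; \mu_k \ge 0,\;\; \sum_{k=1}^{p} \mu_k = 1,\;\;
\|w\|_{\mathcal{H}_K} \le \Lambda \,\Big\}
\]
\[
\widehat{\mathfrak{R}}_S(H_1) \;=\; O\!\left( \sqrt{\frac{\Lambda^2 R^2 \log p}{m}} \right),
\qquad
\widehat{\mathfrak{R}}_S(H_2) \;=\; O\!\left( \sqrt{\frac{\Lambda^2 R^2 \sqrt{p}}{m}} \right),
\]

where H_2 denotes the analogous set with the simplex constraint on the mixture weights μ replaced by an L_2 constraint. Since \sqrt{\sqrt{p}} = p^{1/4}, the second expression is where the p^{1/4} dependency arises; substituting either complexity into a standard Rademacher-based generalization bound yields an estimation-error term of order \sqrt{\log(p)/m} or p^{1/4}/\sqrt{m}, respectively.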
---
Title:
New Generalization Bounds for Learning Kernels
---
Authors:
Corinna Cortes, Mehryar Mohri and Afshin Rostamizadeh
---
Latest submission year:
2009
---
Classification:
Primary: Computer Science
Secondary: Artificial Intelligence
Description: Covers all areas of AI except Vision, Robotics, Machine Learning, Multiagent Systems, and Computation and Language (Natural Language Processing), which have separate subject areas. In particular, includes Expert Systems, Theorem Proving (although this may overlap with Logic in Computer Science), Knowledge Representation, Planning, and Uncertainty in AI. Roughly includes material in ACM Subject Classes I.2.0, I.2.1, I.2.3, I.2.4, I.2.8, and I.2.11.
---
PDF link:
https://arxiv.org/pdf/0912.3309