2022-03-07
Abstract (translated):
The distance metric plays an important role in nearest neighbor (NN) classification. To improve NN performance, the Euclidean distance metric is usually assumed, or a Mahalanobis distance metric is optimized. This paper studies the problem of embedding an arbitrary metric space into a Euclidean space, with the goal of improving the accuracy of the NN classifier. A solution is given using the regularization framework in a reproducing kernel Hilbert space, and a representer-like theorem is proved for NN classification. The embedding function is then determined by solving a semidefinite program, which has an interesting connection to the soft-margin linear binary support vector machine classifier. Although the focus of this paper is to present a general theoretical framework for metric embedding in the NN setting, the method's performance is demonstrated on several benchmark datasets, where it outperforms the Mahalanobis metric learning algorithm in terms of leave-one-out and generalization error.
---
English title:
《Metric Embedding for Nearest Neighbor Classification》
---
Authors:
Bharath K. Sriperumbudur and Gert R. G. Lanckriet
---
Latest submission year:
2007
---
Classification:

Primary category: Statistics
Secondary category: Machine Learning
Category description: Covers machine learning papers (supervised, unsupervised, semi-supervised learning, graphical models, reinforcement learning, bandits, high dimensional inference, etc.) with a statistical or theoretical grounding.
---
English abstract:
  The distance metric plays an important role in nearest neighbor (NN) classification. Usually the Euclidean distance metric is assumed or a Mahalanobis distance metric is optimized to improve the NN performance. In this paper, we study the problem of embedding arbitrary metric spaces into a Euclidean space with the goal to improve the accuracy of the NN classifier. We propose a solution by appealing to the framework of regularization in a reproducing kernel Hilbert space and prove a representer-like theorem for NN classification. The embedding function is then determined by solving a semidefinite program which has an interesting connection to the soft-margin linear binary support vector machine classifier. Although the main focus of this paper is to present a general, theoretical framework for metric embedding in a NN setting, we demonstrate the performance of the proposed method on some benchmark datasets and show that it performs better than the Mahalanobis metric learning algorithm in terms of leave-one-out and generalization errors.
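The role of the metric described in the abstract can be illustrated with a small sketch. This is not the paper's semidefinite-program embedding; it is only a toy comparison of 1-NN under the plain Euclidean metric versus a hand-picked Mahalanobis metric, scored by leave-one-out error as in the paper's evaluation. All names (`nn_predict`, `loo_error`) and the synthetic data are assumptions for illustration.

```python
import numpy as np

def nn_predict(X_train, y_train, x, M=None):
    """1-NN prediction under the squared Mahalanobis distance
    d(x, x')^2 = (x - x')^T M (x - x'); M=None means Euclidean."""
    diffs = X_train - x
    if M is None:
        d2 = np.einsum('ij,ij->i', diffs, diffs)
    else:
        d2 = np.einsum('ij,jk,ik->i', diffs, M, diffs)
    return y_train[np.argmin(d2)]

def loo_error(X, y, M=None):
    """Leave-one-out error of the 1-NN classifier."""
    n = len(y)
    mistakes = 0
    for i in range(n):
        mask = np.arange(n) != i   # hold out point i
        if nn_predict(X[mask], y[mask], X[i], M) != y[i]:
            mistakes += 1
    return mistakes / n

# Toy data: two classes separated along the first coordinate, plus a
# noisy second coordinate that a good metric should down-weight.
rng = np.random.default_rng(0)
X0 = np.c_[rng.normal(-1, 0.2, 50), rng.normal(0, 3.0, 50)]
X1 = np.c_[rng.normal(+1, 0.2, 50), rng.normal(0, 3.0, 50)]
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

M = np.diag([1.0, 0.01])  # hand-picked metric suppressing the noisy axis
print("Euclidean LOO error:  ", loo_error(X, y))
print("Mahalanobis LOO error:", loo_error(X, y, M))
```

On such data the Euclidean 1-NN is dominated by the high-variance noise axis, while the reweighted metric isolates the informative coordinate; learning that reweighting (or, more generally, an embedding) from data is the problem the paper addresses.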
---
PDF link:
https://arxiv.org/pdf/0706.3499