
Fast Relaxed Algorithms of Maximum Variance Unfolding
Abstract: Nonlinear dimensionality reduction is a challenging problem arising in many kinds of high-dimensional data analysis, including machine learning, pattern recognition, scientific visualization, and neural computation. Based on the notion of local isometry, maximum variance unfolding (MVU) was recently proposed for nonlinear dimensionality reduction. By pulling the input patterns as far apart as possible subject to strict local distance-preserving constraints, MVU can learn a faithful low-dimensional structure of the manifold embedded in the high-dimensional space. However, the final optimization of MVU requires solving a semidefinite programming (SDP) problem, and the huge computational and memory complexity makes MVU infeasible for large-scale applications. To address this, a fast algorithm of MVU, called relaxed MVU (RMVU), is proposed. In RMVU, inspired by the approximate local-structure-preservation idea of Laplacian eigenmaps, the strict local distance-preserving constraints of MVU are relaxed. The optimization then becomes a generalized eigen-decomposition problem, and the computational intensity and storage requirements are significantly reduced. To handle even larger data sets, an improved version of RMVU, called landmark-based relaxed MVU (LRMVU), is also proposed. Theoretical analysis and experiments on a synthetic data set and the COIL-20 image database show that RMVU and LRMVU greatly reduce the computational and memory complexity of MVU.
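The relaxation described in the abstract replaces MVU's SDP with a generalized eigen-decomposition of the kind used by Laplacian eigenmaps. The following is a minimal sketch of that eigenproblem machinery only, not the paper's actual RMVU objective (whose exact form is not given here); it assumes NumPy and SciPy, and the function name `laplacian_embedding` is illustrative:

```python
import numpy as np
from scipy.linalg import eigh

def laplacian_embedding(X, k=5, dim=2):
    """Sketch of the Laplacian-eigenmaps step: solve the generalized
    eigenproblem L v = lambda D v on a k-nearest-neighbour graph."""
    n = X.shape[0]
    # pairwise squared Euclidean distances
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # symmetric k-NN adjacency with binary weights
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]   # index 0 is the point itself
        W[i, nbrs] = 1.0
    W = np.maximum(W, W.T)
    D = np.diag(W.sum(axis=1))
    L = D - W                               # unnormalised graph Laplacian
    # eigh solves L v = lambda D v; eigenvalues come back ascending
    vals, vecs = eigh(L, D)
    # drop the trivial constant eigenvector (eigenvalue ~ 0)
    return vecs[:, 1:dim + 1]

# toy usage: 100 points on a noisy circle embedded in 3-D
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
X = np.c_[np.cos(t), np.sin(t), 0.1 * rng.standard_normal(100)]
Y = laplacian_embedding(X, k=6, dim=2)
print(Y.shape)  # (100, 2)
```

Solving one generalized eigenproblem in place of an SDP over all local-distance constraints is what yields the reduction in computation and storage that the abstract reports.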
Source: Journal of Computer Research and Development (《计算机研究与发展》, EI, CSCD, Peking University Core), 2009, No. 6: 988-994 (7 pages).
Funding: Chongqing Natural Science Foundation projects (CSTC2006BB2152, CSTC2008BB2160).
Keywords: dimensionality reduction; maximum variance unfolding; Laplacian eigenmap; manifold learning; variance analysis

References (11)

  • 1 Jolliffe I T. Principal Component Analysis [M]. Berlin: Springer, 1989.
  • 2 Cox T, Cox M. Multidimensional Scaling [M]. London: Chapman & Hall, 1994.
  • 3 Zhang Junping. Research on several issues of manifold learning [C] //Machine Learning and Its Applications. Beijing: Tsinghua University Press, 2006.
  • 4 Tenenbaum J B, de Silva V, Langford J C. A global geometric framework for nonlinear dimensionality reduction [J]. Science, 2000, 290: 2319-2323.
  • 5 Roweis S T, Saul L K. Nonlinear dimensionality reduction by locally linear embedding [J]. Science, 2000, 290: 2323-2326.
  • 6 Belkin M, Niyogi P. Laplacian eigenmaps for dimensionality reduction and data representation [J]. Neural Computation, 2003, 15(6): 1373-1396.
  • 7 Zhang Zhenyue, Zha Hongyuan. Principal manifolds and nonlinear dimensionality reduction via tangent space alignment [J]. SIAM Journal of Scientific Computing, 2004, 26(1): 313-338.
  • 8 Weinberger K, Saul L. Unsupervised learning of image manifolds by semidefinite programming [C] //Proc of CVPR 2004. Los Alamitos: IEEE Computer Society, 2004: 988-995.
  • 9 Weinberger K, Saul L. Unsupervised learning of image manifolds by semidefinite programming [J]. International Journal of Computer Vision, 2006, 70(1): 77-90.
  • 10 Luo Siwei, Zhao Lianwei. Manifold learning algorithms based on spectral graph theory [J]. Journal of Computer Research and Development, 2006, 43(7): 1173-1179.

