
Feature Selection for Ensemble Learning    Cited by: 7

Abstract: Ensemble learning and feature selection are both active research topics in machine learning. In ensemble learning, the improvement in generalization performance comes primarily from the diversity among individual learners produced by repeatedly resampling the training set; applying feature selection to ensemble learning can further improve its effectiveness. This research falls into three areas: feature selection on the data subsets used to build the individual learners, selection of the individual learners (selective ensemble learning), and multi-task learning. This paper reviews recent work on feature selection techniques for ensemble learning, summarizes the studies in each of these three areas, and distills some common techniques to guide future research.
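To make the resampling-plus-feature-selection idea in the abstract concrete, the following is a minimal illustrative sketch (not taken from the paper) of a random-subspace-style ensemble in Python: each base learner is trained on a bootstrap resample of the rows restricted to a randomly chosen feature subset, and predictions are combined by majority vote. The dataset (scikit-learn's breast-cancer data), the ensemble size, and the subset size are arbitrary assumptions made only for this example.

```python
# Illustrative sketch only: resampling rows plus random feature subsets to
# create diversity among base learners, then combining them by majority vote.
# Dataset, ensemble size, and subset size are arbitrary choices, not from the paper.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

n_learners, subset_size = 15, X.shape[1] // 2
members = []  # pairs of (fitted tree, feature indices it was trained on)
for _ in range(n_learners):
    rows = rng.integers(0, len(X_train), size=len(X_train))          # bootstrap sample of rows
    cols = rng.choice(X.shape[1], size=subset_size, replace=False)   # random feature subset
    tree = DecisionTreeClassifier(random_state=0)
    tree.fit(X_train[rows][:, cols], y_train[rows])
    members.append((tree, cols))

# Majority vote: average the 0/1 votes of the members and threshold at 0.5.
votes = np.stack([tree.predict(X_test[:, cols]) for tree, cols in members])
y_pred = (votes.mean(axis=0) >= 0.5).astype(int)
print("ensemble accuracy:", (y_pred == y_test).mean())
```

A selective-ensemble variant of this sketch would additionally score each member on a held-out validation set and keep only the better-performing subset before voting, which corresponds to the second of the three aspects surveyed in the paper.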
Authors: 李国正 (Li Guozheng), 李丹 (Li Dan)
Source: Journal of Shanghai University (Natural Science Edition) 《上海大学学报(自然科学版)》, 2007, No. 5: 598-604 (7 pages). Indexed in CAS, CSCD, and the Peking University Chinese Core Journals list.
Funding: National Natural Science Foundation of China (20503015); Shanghai Municipal Education Commission Natural Science Foundation (05AZ67)
Keywords: ensemble learning; feature selection; multi-task learning

References (29)

  • 1 DIETTERICH T. Machine-learning research: four current directions [J]. The AI Magazine, 1998, 18(4): 97-136.
  • 2 BREIMAN L. Bagging predictors [J]. Machine Learning, 1996, 24(2): 123-140.
  • 3 BAUER E, KOHAVI R. An empirical comparison of voting classification algorithms: bagging, boosting, and variants [J]. Machine Learning, 1999, 36(2): 105-139.
  • 4 BROWN G, WYATT J L, TINO P. Managing diversity in regression ensembles [J]. Journal of Machine Learning Research, 2005, 6: 1621-1650.
  • 5 GUYON I, ELISSEEFF A. An introduction to variable and feature selection [J]. Journal of Machine Learning Research, 2003, 3: 1157-1182.
  • 6 YU L, LIU H. Efficient feature selection via analysis of relevance and redundancy [J]. Journal of Machine Learning Research, 2004, 5: 1205-1224.
  • 7 LIU H, YU L. Toward integrating feature selection algorithms for classification and clustering [J]. IEEE Transactions on Knowledge and Data Engineering, 2005, 17(3): 1-12.
  • 8 HO T. The random subspace method for constructing decision forests [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1998, 20(8): 832-844.
  • 9 GUNTER S, BUNKE H. Feature selection algorithms for the generation of multiple classifier systems and their application to handwritten word recognition [J]. Pattern Recognition Letters, 2004, 25(11): 1323-1336.
  • 10 OPITZ D. Feature selection for ensembles [C]// International Conference on Artificial Intelligence, 1999: 379-384.

Secondary References (7)

  • 1 Schapire R E. The strength of weak learnability [J]. Machine Learning, 1990, 5(2): 197-227.
  • 2 Breiman L. Bagging predictors [J]. Machine Learning, 1996, 24(2): 123-140.
  • 3 Zhou Z H, Wu J X, Jiang Y, et al. Genetic algorithm based selective neural network ensemble [A]. In: Cohn A G, ed. Proceedings of the 17th International Joint Conference on Artificial Intelligence [C]. Seattle, WA: Morgan Kaufmann Publishers, 2001: 797-802.
  • 4 Zhou Z H, Wu J X, Tang W. Ensembling neural networks: many could be better than all [J]. Artificial Intelligence, 2002, 137(1-2): 239-263.
  • 5 Mitra P, Murthy C A, Pal S K. Unsupervised feature selection using feature similarity [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2002, 24(3): 301-312.
  • 6 Blake C, Keogh E, Merz C J. UCI repository of machine learning databases [EB/OL]. http://www.ics.uci.edu/~mlearn/MLRepository.htm, 2003-12-12.
  • 7 Demuth H, Beale M. Neural Network Toolbox user's guide for use with MATLAB [M]. 4th ed. Natick, USA: The MathWorks Inc., 2001.

Co-citing Literature (14)

Co-cited Literature (64)

Citing Literature (7)

Secondary Citing Literature (17)
