
Max-relevance min-redundancy restrictive BAN classifier learning algorithm (cited by 4)
Abstract: The naive Bayes (NB) classifier is a simple and effective classification method based on Bayes' theorem, but its assumption of conditional independence among attributes rarely holds in practice, which limits its classification performance. The BAN (Bayesian network augmented naive Bayes) classifier extends naive Bayes so that dependences among attributes can be represented, but its learning algorithms require a large amount of high-dimensional computation, which hurts the classifier's accuracy on small-sample datasets. Based on an improved max-relevance min-redundancy feature selection technique, a restrictive BAN classifier learning algorithm (k-BAN) is proposed; it establishes the dependences among attributes by selecting a set of connecting edges for each attribute node. The method is compared experimentally with the NB, TAN and BAN classifiers, and the results show that on small-sample datasets the restrictive BAN classifier learned by the proposed algorithm achieves higher classification accuracy.
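The record carries no code, but the max-relevance min-redundancy (mRMR) criterion the abstract builds on can be illustrated briefly. The Python sketch below greedily scores each candidate attribute by its mutual information with the class minus its average mutual information with the attributes already selected, which is the standard mRMR trade-off. The function name mrmr_rank and the use of scikit-learn's mutual_info_score are illustrative assumptions; this is not the authors' k-BAN algorithm, which selects connection sets for attribute nodes in a Bayesian network rather than a flat feature subset.

```python
import numpy as np
from sklearn.metrics import mutual_info_score


def mrmr_rank(X, y, n_select):
    """Greedy max-relevance min-redundancy ranking of discrete attributes.

    Score of a candidate j given the already-selected set S:
        I(x_j; y) - (1/|S|) * sum_{i in S} I(x_j; x_i)
    """
    n_features = X.shape[1]
    # Relevance: mutual information between each attribute and the class.
    relevance = np.array([mutual_info_score(X[:, j], y) for j in range(n_features)])
    selected = [int(np.argmax(relevance))]          # start from the most relevant attribute
    candidates = set(range(n_features)) - set(selected)

    while candidates and len(selected) < n_select:
        best_j, best_score = None, -np.inf
        for j in candidates:
            # Redundancy: average mutual information with the selected attributes.
            redundancy = np.mean([mutual_info_score(X[:, j], X[:, i]) for i in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
        candidates.remove(best_j)
    return selected


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, 300)
    informative = np.where(rng.random(300) < 0.1, 1 - y, y)  # noisy copy of the class
    redundant = informative.copy()                           # duplicate of the informative attribute
    noise = rng.integers(0, 2, 300)                          # attribute independent of the class
    X = np.column_stack([informative, redundant, noise])
    # Column 0 is picked first; the duplicate column 1 is penalised as redundant,
    # so the second pick is the independent column 2.
    print(mrmr_rank(X, y, 2))
```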
Source: Journal of Chongqing University (Natural Science Edition) (EI, CAS, CSCD, PKU core journal), 2014, Issue 6: 71-77 (7 pages).
Funding: National Natural Science Foundation of China (Grant No. 61172168).
Keywords: naive Bayes; Bayesian network augmented naive Bayes; max-relevance; min-redundancy; dependence
  • Related literature

References (13)

  • 1 Zhou Guoqiang, Cui Rongyi. Research on Korean text classification based on the naive Bayes classifier [J]. Journal of Chinese Information Processing, 2011, 25(4): 16-19. (cited by 13)
  • 2 Li Guanguang, Wang Zhanjie. Application of Bayesian classifiers in intrusion detection [J]. Information Security and Technology, 2010, 1(9): 63-66. (cited by 2)
  • 3 Pearl J. Probabilistic reasoning in intelligent systems: networks of plausible inference [M]. San Francisco: Morgan Kaufmann, 1988.
  • 4 Muralidharan V, Sugumaran V. A comparative study of naive Bayes classifier and Bayes net classifier for fault diagnosis of monoblock centrifugal pump using wavelet analysis [J]. Applied Soft Computing, 2012, 12(8): 2023-2029.
  • 5 Rossell D, Telesca D, Johnson V E. High-dimensional Bayesian classifiers using non-local priors [J]. Statistical Models for Data Analysis, 2013: 305-313.
  • 6 Chen L F, Wang S R. Automated feature weighting in naive Bayes for high-dimensional data classification [C]// Proceedings of the 21st ACM International Conference on Information and Knowledge Management, October 29 - November 2, 2012, Maui, USA. New York: ACM, 2012: 1243-1252.
  • 7 Malovini A, Barbarini N, Bellazzi R, et al. Hierarchical naive Bayes for genetic association studies [J]. BMC Bioinformatics, 2012, 13(Suppl 14): 1-11.
  • 8 Singh M, Provan G M. Efficient learning of selective Bayesian network classifiers [C]// Proceedings of the 13th International Conference on Machine Learning, Bari, Italy. San Francisco: Morgan Kaufmann, 1996.
  • 9 Lin H C, Su C T. A selective Bayes classifier with meta-heuristics for incomplete data [J]. Neurocomputing, 2013, 106: 95-102.
  • 10 Peng H C, Long F H, Ding C. Feature selection based on mutual information: criteria of max-dependency, max-relevance, and min-redundancy [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005, 27(8): 1226-1233.

Secondary references (33)

Co-citation literature (59)

Co-cited literature (31)

  • 1 Wang Shuangcheng, Yuan Senmiao. Research on learning Bayesian network structure with missing data [J]. Journal of Software, 2004, 15(7): 1042-1048. (cited by 62)
  • 2 Wang S C, Xu G L, Du R J. Restricted Bayesian classification networks. Science China Information Sciences, 2013, 56(7): 1-15.
  • 3Friedman N, Geiger D, Goldszmidt M. Bayesian network classifiers. Machine Learning, 1997, 29(2-3): 131-163.
  • 4 Jing Y S, Pavlovic V, Rehg J M. Boosted Bayesian network classifiers. Machine Learning, 2008, 73(2): 155-184.
  • 5 Webb G I, Boughton J R, Zheng F, Ting K M, Salem H. Learning by extrapolation from marginal to full-multivariate probability distributions: decreasingly naive Bayesian classification. Machine Learning, 2012, 86(2): 233-272.
  • 6 López-Cruz P L, Larrañaga P, DeFelipe J, Bielza C. Bayesian network modeling of the consensus between experts: an application to neuron classification. International Journal of Approximate Reasoning, 2014, 55(1): 3-22.
  • 7John G H, Langley P. Estimating continuous distributions in Bayesian classifiers. In: Proceedings of the 11th Conference on Uncertainty in Artificial Intelligence (UAI-1995). San Francisco, CA, USA: Morgan Kaufmann Publishers Inc., 1995. 338-345.
  • 8 Pérez A, Larrañaga P, Inza I. Supervised classification with conditional Gaussian networks: increasing the structure complexity from naive Bayes. International Journal of Approximate Reasoning, 2006, 43(1): 1-25.
  • 9 Pérez A, Larrañaga P, Inza I. Bayesian classifiers based on kernel density estimation: flexible classifiers. International Journal of Approximate Reasoning, 2009, 50(2): 341-362.
  • 10 Bounhas M, Mellouli K, Prade H, Serrurier M. Possibilistic classifiers for numerical data. Soft Computing, 2013, 17(5): 733-751.

Citing literature (4)

Secondary citing literature (29)
