ZHANG Dong-bo, WANG Yao-nan. On Dynamic Adaptive Selection Method for Neural Network Ensemble[J]. INFORMATION AND CONTROL, 2007, 36(4): 434-440.

On Dynamic Adaptive Selection Method for Neural Network Ensemble

More Information
  • Received Date: July 16, 2006
  • Published Date: August 19, 2007
  • A dynamic adaptive selection ensemble based on local classification accuracy estimation is introduced to improve the performance of neural network ensembles. Based on Bayesian theory, it is proved that, under certain hypotheses, the performance of the dynamic adaptive selection ensemble approximates that of the optimal Bayesian classifier. Following this conclusion, member network selection methods based on hard decision and soft decision are introduced. Experiments are conducted on five data sets selected from the UCI machine learning repository. The experimental results show that the dynamic adaptive selection ensemble outperforms the conventional voting and averaging methods, and that its performance is not sensitive to the size of the neighborhood. Furthermore, the soft decision method performs better than the hard decision method.
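The selection scheme described in the abstract can be sketched as follows: for each test sample, each member network's accuracy is estimated on the test sample's nearest neighbors in a validation set; a hard decision lets the locally most accurate member predict alone, while a soft decision weights all members' votes by their local accuracies. This is an illustrative sketch of that general idea, not the paper's exact formulation; the function name, the use of Euclidean distance, and the tie-handling are assumptions.

```python
import numpy as np

def dynamic_selection_predict(classifiers, X_val, y_val, x, k=10, soft=True):
    """Predict the label of one sample x by dynamic adaptive selection.

    Each member's local accuracy is estimated on the k validation samples
    nearest to x. With soft=False (hard decision) the locally most accurate
    member decides alone; with soft=True the members' votes are weighted by
    their local accuracies.
    """
    # k nearest validation samples to x (Euclidean distance assumed)
    dists = np.linalg.norm(X_val - x, axis=1)
    nn = np.argsort(dists)[:k]

    # local classification accuracy of each member on the neighborhood
    local_acc = np.array([
        np.mean(clf(X_val[nn]) == y_val[nn]) for clf in classifiers
    ])

    # each member's prediction for x
    preds = np.array([clf(x[None, :])[0] for clf in classifiers])

    if not soft:
        # hard decision: the locally most accurate member wins
        return preds[np.argmax(local_acc)]

    # soft decision: accuracy-weighted vote over the candidate labels
    labels = np.unique(preds)
    scores = [local_acc[preds == lab].sum() for lab in labels]
    return labels[int(np.argmax(scores))]
```

Here each classifier is any callable mapping an array of samples to an array of labels, so trained neural networks can be wrapped as `lambda X: net.predict(X)`.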
  • [1] Hansen L K, Salamon P. Neural network ensembles[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1990, 12(10): 993-1001.
  • [2] Zhou Z H, Chen S F. Neural network ensemble[J]. Chinese Journal of Computers, 2002, 25(1): 1-8. (in Chinese)
  • [3] Freund Y, Schapire R E. A decision-theoretic generalization of on-line learning and an application to boosting[J]. Journal of Computer and System Sciences, 1997, 55(1): 119-139.
  • [4] Breiman L. Bagging predictors[J]. Machine Learning, 1996, 24(2): 123-140.
  • [5] Jang M, Cho S. Observational learning algorithm for an ensemble of neural networks[J]. Pattern Analysis & Applications, 2002, 5(2): 154-167.
  • [6] Ridgeway G, Madigan D, Richardson T. Boosting methodology for regression problems[A]. Proceedings of the 7th International Workshop on Artificial Intelligence and Statistics[C]. San Francisco, CA, USA: Morgan Kaufmann Publishers, 1999. 152-161.
  • [7] Zhou Z H, Wu J X, Tang W. Ensembling neural networks: Many could be better than all[J]. Artificial Intelligence, 2002, 137(1-2): 239-263.
  • [8] Zhou Z H, Wu J X, Tang W, et al. Combining regression estimators: GA based selective neural network ensemble[J]. International Journal of Computational Intelligence and Applications, 2001, 1(4): 341-356.
  • [9] Lam L, Suen C Y. A theoretical analysis of the application of majority voting to pattern recognition[A]. Proceedings of the 12th International Conference on Pattern Recognition[C]. Piscataway, NJ, USA: IEEE, 1994. 418-420.
  • [10] Krogh A, Vedelsby J. Neural network ensembles, cross validation, and active learning[A]. Advances in Neural Information Processing Systems[C]. Cambridge, MA, USA: MIT Press, 1995. 231-238.
  • [11] UCI machine learning repository[DB/OL]. http://www.ics.uci.edu/~mlearn/MLRepository.html, 2006-06-01/2006-07-17.
  • [12] Opitz D, Maclin R. Popular ensemble methods: An empirical study[J]. Journal of Artificial Intelligence Research, 1999, 11(1): 169-198.
