ABSTRACT
Recently, multiple classifier systems (MCS) have been used in practical applications to improve classification accuracy. Self-generating neural networks (SGNN) are suitable base classifiers for an MCS because of their simple parameter settings and fast learning. However, the computational cost of an MCS increases in proportion to the number of SGNN it contains. In this paper, we propose a novel method for pruning the structure of the SGNN in an MCS. Experiments were conducted to compare the pruned MCS with an unpruned MCS, an MCS based on C4.5, and the k-nearest neighbor method. The results show that the pruned MCS improves classification accuracy while also reducing computational cost.
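The abstract only summarizes the approach, so the following is a minimal Python sketch of the two ideas it combines: a bagged ensemble of instance-growing classifiers combined by majority vote, and a structural pruning pass that removes nodes whenever doing so does not hurt training accuracy. It is not the paper's algorithm; a real SGNT grows a tree of neurons, whereas this sketch flattens each base classifier to a set of prototype nodes. All names (PrototypeClassifier, MajorityVoteEnsemble, prune) are hypothetical.

```python
import random
from collections import Counter

def sq_dist(a, b):
    # Squared Euclidean distance; sufficient for nearest-node lookup.
    return sum((x - y) ** 2 for x, y in zip(a, b))

class PrototypeClassifier:
    """Stand-in for an SGNT: a flat set of (vector, label) prototype nodes."""
    def __init__(self):
        self.nodes = []

    def fit(self, X, y):
        # SGNN generate nodes incrementally; here every training sample
        # simply becomes a node.
        self.nodes = list(zip(X, y))

    def predict(self, x):
        _, label = min(self.nodes, key=lambda n: sq_dist(n[0], x))
        return label

    def prune(self, X, y):
        # Greedy structural pruning: drop a node if accuracy on the
        # training data does not decrease without it.
        base = sum(self.predict(xi) == yi for xi, yi in zip(X, y))
        for node in list(self.nodes):
            if len(self.nodes) == 1:
                break
            self.nodes.remove(node)
            if sum(self.predict(xi) == yi for xi, yi in zip(X, y)) < base:
                self.nodes.append(node)  # removal hurt accuracy; restore

class MajorityVoteEnsemble:
    """MCS: k base classifiers trained on bootstrap samples (bagging)."""
    def __init__(self, k=10, seed=0):
        self.k, self.rng, self.members = k, random.Random(seed), []

    def fit(self, X, y, prune=True):
        n = len(X)
        for _ in range(self.k):
            idx = [self.rng.randrange(n) for _ in range(n)]  # bootstrap
            clf = PrototypeClassifier()
            clf.fit([X[i] for i in idx], [y[i] for i in idx])
            if prune:
                clf.prune(X, y)
            self.members.append(clf)

    def predict(self, x):
        votes = Counter(m.predict(x) for m in self.members)
        return votes.most_common(1)[0][0]

if __name__ == "__main__":
    X = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
    y = [0, 0, 0, 1, 1, 1]
    mcs = MajorityVoteEnsemble(k=5)
    mcs.fit(X, y)
    print(mcs.predict((0.5, 0.5)), mcs.predict((5.5, 5.5)))  # expect 0 1
```

Pruning each member after training mirrors the cost argument in the abstract: every removed node makes each nearest-node lookup cheaper across all k members, while the accuracy check guards against degrading the ensemble.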
REFERENCES
- J. Han and M. Kamber. Data Mining: Concepts and Techniques. Morgan Kaufmann Publishers, San Francisco, CA, 2000.
- J. R. Quinlan. Bagging, boosting, and C4.5. In Proceedings of the Thirteenth National Conference on Artificial Intelligence, pages 725-730, Portland, OR, 1996.
- G. Rätsch, T. Onoda, and K.-R. Müller. Soft margins for AdaBoost. Machine Learning, 42(3):287-320, 2001.
- C. M. Bishop. Neural Networks for Pattern Recognition. Oxford University Press, New York, 1995.
- R. O. Duda, P. E. Hart, and D. G. Stork. Pattern Classification. John Wiley & Sons, New York, 2nd edition, 2000.
- W. X. Wen, A. Jennings, and H. Liu. Learning a neural tree. In Proceedings of the International Joint Conference on Neural Networks, Beijing, China, 1992. Available at ftp://ftp.cis.ohio-state.edu/pub/neuroprose/wen.sgnt-learn.ps.Z.
- T. Kohonen. Self-Organizing Maps. Springer-Verlag, Berlin, 1995.
- H. Inoue and H. Narihisa. Improving generalization ability of self-generating neural networks through ensemble averaging. In T. Terano, H. Liu, and A. L. P. Chen, editors, The Fourth Pacific-Asia Conference on Knowledge Discovery and Data Mining, volume 1805 of LNAI, pages 177-180. Springer-Verlag, 2000.
- M. Stone. Cross-validation: A review. Math. Operationsforsch. Statist., Ser. Statistics, 9(1):127-139, 1978.
- L. Breiman. Bagging predictors. Machine Learning, 24:123-140, 1996.
- J. R. Quinlan. C4.5: Programs for Machine Learning. Morgan Kaufmann, San Mateo, CA, 1993.
- C. L. Blake and C. J. Merz. UCI Repository of Machine Learning Databases. University of California, Irvine, Dept. of Information and Computer Science, 1998. Available at http://www.ics.uci.edu/~mlearn/MLRepository.html.
- E. A. Patrick and F. P. Fischer. A generalized k-nearest neighbor rule. Information and Control, 16(2):128-152, 1970.