ABSTRACT
One of the most relevant tasks in Machine Learning is the induction of classifiers, which can be used for classification or prediction. Such classifiers can be used in isolation or combined into a multiple classifier system. Building many-layered systems and understanding the relations between different base classifiers are of special interest. In this paper we use the HECIC system, which consists of two layers: the first layer is a multiple classifier system that processes every example and tries to classify it; the second layer is an individual classifier that learns from the examples that the first layer does not classify unanimously (incorporating new information). While using this system in a previous work, we detected that some combinations that hybridize artificial neural networks (ANN) in one of the two layers seemed to achieve high-accuracy results. Therefore, in this paper we focus on studying the improvement achieved by using different kinds of ANN in this two-layered system.
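The two-layer idea described above can be sketched in code. The following is a minimal illustration, not the paper's exact method: the base learners, the ANN choice, and the use of the original features (rather than any augmented representation) in the second layer are all assumptions made for the sketch. A first layer of heterogeneous classifiers votes on each example; a second-layer neural network is trained only on the training examples the first layer does not classify unanimously, and is consulted at prediction time only when the vote is split.

```python
# Hedged sketch of a two-layer system in the spirit of HECIC.
# Classifier choices and the fallback logic are illustrative assumptions.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Layer 1: a multiple classifier system (three heterogeneous base learners).
layer1 = [DecisionTreeClassifier(random_state=0),
          GaussianNB(),
          KNeighborsClassifier()]
for clf in layer1:
    clf.fit(X_tr, y_tr)

# Find training examples that are NOT classified unanimously by layer 1.
preds = np.array([clf.predict(X_tr) for clf in layer1])  # shape (3, n_train)
unanimous = (preds == preds[0]).all(axis=0)

# Layer 2: an individual classifier (an ANN in the hybrid variants studied),
# trained only on the non-unanimously classified examples.
layer2 = MLPClassifier(max_iter=1000, random_state=0)
layer2_trained = False
mask = ~unanimous
if mask.any() and len(np.unique(y_tr[mask])) >= 2:
    layer2.fit(X_tr[mask], y_tr[mask])
    layer2_trained = True

def predict(x):
    """Layer-1 vote; defer split votes to layer 2 when it was trained."""
    votes = np.array([clf.predict(x) for clf in layer1])
    out = votes[0].copy()
    disagree = ~(votes == votes[0]).all(axis=0)
    if disagree.any() and layer2_trained:
        out[disagree] = layer2.predict(x[disagree])
    return out

acc = (predict(X_te) == y_te).mean()
```

The design choice worth noting is that the second layer never sees the easy (unanimous) examples, so it specializes in exactly the region of the input space where the ensemble is uncertain.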
REFERENCES
- Kononenko, I., Kukar, M.: Machine Learning and Data Mining: Introduction to Principles and Algorithms. Horwood Publishing (2007)
- Dietterich, T.G.: Ensemble methods in machine learning. In: Kittler, J., Roli, F. (eds.) MCS 2000. LNCS, vol. 1857, pp. 1-15. Springer, Heidelberg (2000)
- Hansen, L.K., Salamon, P.: Neural network ensembles. IEEE Transactions on Pattern Analysis and Machine Intelligence 12, 993-1001 (1990)
- Gama, J., Brazdil, P.: Cascade generalization. Machine Learning 41, 315-343 (2000)
- Utgoff, P.E., Stracuzzi, D.J.: Many-layered learning. Neural Computation 14(10), 2497-2529 (2002)
- Chindaro, S., Sirlantzis, K., Fairhurst, M.C.: Modelling multiple-classifier relationships using Bayesian belief networks. In: Haindl, M., Kittler, J., Roli, F. (eds.) MCS 2007. LNCS, vol. 4472, pp. 312-321. Springer, Heidelberg (2007)
- Ramos-Jiménez, G., del Campo-Ávila, J., Morales-Bueno, R.: Hybridizing ensemble classifiers with individual classifiers. In: International Conference on Intelligent Systems Design and Applications. Workshop on Hybrid Learning for Artificial Neural Networks: Architectures and Applications, pp. 199-202. IEEE Computer Society, Los Alamitos (2009)
- Ramos-Jiménez, G., del Campo-Ávila, J., Morales-Bueno, R.: ML-CIDIM: Multiple layers of multiple classifier systems based on CIDIM. In: Ślęzak, D., Yao, J., Peters, J.F., Ziarko, W.P., Hu, X. (eds.) RSFDGrC 2005. LNCS (LNAI), vol. 3642, pp. 138-146. Springer, Heidelberg (2005)
- Freund, Y.: Boosting a weak learning algorithm by majority. Information and Computation 121, 256-285 (1995)
- Breiman, L.: Bagging predictors. Machine Learning 24(2), 123-140 (1996)
- Aslam, J.A., Decatur, S.E.: General bounds on statistical query learning and PAC learning with noise via hypothesis boosting. Information and Computation 141, 85-118 (1998)
- Bauer, E., Kohavi, R.: An empirical comparison of voting classification algorithms: bagging, boosting, and variants. Machine Learning 36, 105-142 (1999)
- Freund, Y., Schapire, R.E.: Experiments with a new boosting algorithm. In: Proceedings of the 13th International Conference on Machine Learning (ICML-1996), pp. 146-148 (1996)
- Aha, D.W., Kibler, D., Albert, M.K.: Instance-based learning algorithms. Machine Learning 6, 37-66 (1991)
- Quinlan, J.R.: C4.5: Programs for Machine Learning. Morgan Kaufmann, San Francisco (1993)
- John, G.H., Langley, P.: Estimating continuous distributions in Bayesian classifiers. In: Proceedings of the Eleventh Annual Conference on Uncertainty in Artificial Intelligence (UAI-1995), pp. 338-345. Morgan Kaufmann, San Francisco (1995)
- Gill, P.E., Murray, W., Wright, M.H.: Practical Optimization. Academic Press, London (1981)
- Kohonen, T.: Self-Organizing Maps. Springer-Verlag, New York (1997)
- Hall, M., Frank, E., Holmes, G., Pfahringer, B., Reutemann, P., Witten, I.H.: The WEKA data mining software: an update. SIGKDD Explorations 11, 10-18 (2009)