ABSTRACT
This paper gives an overview of how visualization techniques can help improve an evolutionary algorithm that trains artificial neural networks. Kohonen's self-organizing maps (SOM) are used for multidimensional scaling and for projecting high-dimensional search spaces onto a low-dimensional grid. The SOM technique used here makes the evolution process easy and intuitive to visualize.
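To make the projection step concrete, the following is a minimal sketch of a self-organizing map in NumPy, in the spirit of the SOM projection described above: high-dimensional points (e.g., candidate solutions from the search space) are mapped to the grid position of their best-matching prototype. All names and parameter choices (grid size, learning rate, neighborhood width, decay schedule) are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Fit a 2-D grid of prototype vectors to high-dimensional data."""
    rng = np.random.default_rng(seed)
    h, w = grid
    dim = data.shape[1]
    weights = rng.random((h, w, dim))  # prototype vectors, one per grid cell
    # Grid coordinates of every unit, used by the neighborhood function.
    coords = np.stack(
        np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1
    )
    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            t = step / n_steps
            lr = lr0 * (1.0 - t)              # linearly decaying learning rate
            sigma = sigma0 * (1.0 - t) + 0.5  # shrinking neighborhood radius
            # Best-matching unit: cell whose prototype is closest to x.
            d = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), (h, w))
            # Gaussian neighborhood around the BMU on the grid.
            g = np.exp(-np.sum((coords - bmu) ** 2, axis=-1) / (2 * sigma**2))
            weights += lr * g[..., None] * (x - weights)
            step += 1
    return weights

def project(data, weights):
    """Map each data point to the grid position of its best-matching unit."""
    h, w, _ = weights.shape
    flat = weights.reshape(h * w, -1)
    idx = np.argmin(np.linalg.norm(flat[None] - data[:, None], axis=-1), axis=1)
    return np.stack(np.unravel_index(idx, (h, w)), axis=1)
```

Plotting the projected grid positions of a population across generations would then give the kind of intuitive picture of the evolution process that the abstract describes.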