AN ATTEMPT FOR 2-LAYER PERCEPTRON HIGH PERFORMANCE IN CLASSIFYING SHIFTED MONOCHROME 60-BY-80-IMAGES VIA TRAINING WITH PIXEL-DISTORTED SHIFTED IMAGES ON THE PATTERN OF 26 ALPHABET LETTERS

V. V. Romanuke

Abstract


The object classification problem is considered, to which both the neocognitron and the multilayer perceptron may be applied. Since the neocognitron, although it solves almost any classification problem, performs too slowly and at too great a computational cost, a 2-layer perceptron is attempted for recognizing shifted monochrome images, even though it is fast only for pixel-distorted monochrome images. Given an original set of 26 monochrome 60-by-80 images of the English alphabet letters, the task is to clarify whether the 2-layer perceptron is capable of ensuring high performance in classifying shifted monochrome images. It is shown that the 2-layer perceptron performs as a good classifier of shifted monochrome images when, during training, its input is fed with training samples of shifted images that are additionally pixel-distorted. This may require more passes of the training samples through the 2-layer perceptron, but the total training time is nevertheless shorter than when the 2-layer perceptron is trained only on pixel-distortion-free shifted monochrome images.
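The training scheme summarized in the abstract can be sketched as follows. This is an illustrative sketch only: the random binary "letter" prototypes, the shift range, the distortion rate, and the network sizes are hypothetical placeholders, not the paper's actual data or hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)
H, W, CLASSES = 60, 80, 26  # monochrome 60-by-80 images, 26 alphabet letters

# Hypothetical stand-ins for the 26 letter images: random binary patterns.
prototypes = (rng.random((CLASSES, H, W)) < 0.2).astype(float)

def shift(img, dy, dx):
    """Shift a 60-by-80 image by (dy, dx) pixels, zero-padding the vacated area."""
    out = np.zeros_like(img)
    out[max(dy, 0):H + min(dy, 0), max(dx, 0):W + min(dx, 0)] = \
        img[max(-dy, 0):H + min(-dy, 0), max(-dx, 0):W + min(-dx, 0)]
    return out

def distort(img, q, gen):
    """Pixel distortion: flip each pixel independently with probability q."""
    flips = (gen.random(img.shape) < q).astype(float)
    return np.abs(img - flips)

def make_batch(n, max_shift=5, q=0.1):
    """Training samples: shifted letter images that are additionally pixel-distorted."""
    labels = rng.integers(0, CLASSES, n)
    X = np.empty((n, H * W))
    for i, c in enumerate(labels):
        dy, dx = rng.integers(-max_shift, max_shift + 1, 2)
        X[i] = distort(shift(prototypes[c], dy, dx), q, rng).ravel()
    return X, labels

# 2-layer perceptron: one hidden tanh layer, softmax output, trained by SGD.
HIDDEN, LR, BATCH = 40, 0.05, 32
W1 = rng.normal(0.0, 0.05, (H * W, HIDDEN)); b1 = np.zeros(HIDDEN)
W2 = rng.normal(0.0, 0.05, (HIDDEN, CLASSES)); b2 = np.zeros(CLASSES)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

for step in range(40):
    X, y = make_batch(BATCH)
    h = np.tanh(X @ W1 + b1)
    p = softmax(h @ W2 + b2)
    g = p.copy(); g[np.arange(BATCH), y] -= 1.0  # cross-entropy gradient w.r.t. logits
    dh = (g @ W2.T) * (1.0 - h ** 2)
    W2 -= LR * h.T @ g / BATCH; b2 -= LR * g.mean(0)
    W1 -= LR * X.T @ dh / BATCH; b1 -= LR * dh.mean(0)
```

Feeding the perceptron such distorted shifted samples, rather than pixel-distortion-free ones, is the training regime whose benefit the abstract reports.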

Keywords


object classification, neocognitron, perceptron, pixel distortion, shift, monochrome images, shifted monochrome images.

References


Arulampalam, G. A generalized feedforward neural network architecture for classification and regression / G. Arulampalam, A. Bouzerdoum // Neural Networks. – 2003. – Volume 16, Issues 5–6. – P. 561–568.

Hagan, M. T. Neural Networks for Control / M. T. Hagan, H. B. Demuth // Proceedings of the 1999 American Control Conference, San Diego, CA. – 1999. – P. 1642–1656.

Benardos, P. G. Optimizing feedforward artificial neural network architecture / P. G. Benardos, G.-C. Vosniakos // Engineering Applications of Artificial Intelligence. – 2007. – Volume 20, Issue 3. – P. 365–382.

Hagan, M. T. Training feedforward networks with the Marquardt algorithm / M. T. Hagan, M. B. Menhaj // IEEE Transactions on Neural Networks. – 1994. – Volume 5, Issue 6. – P. 989–993.

Yu, C. An efficient hidden layer training method for the multilayer perceptron / C. Yu, M. T. Manry, J. Li, P. L. Narasimha // Neurocomputing. – 2006. – Volume 70, Issues 1–3. – P. 525–535.

Fukushima, K. Neocognitron: A self-organizing neural network model for a mechanism of pattern recognition unaffected by shift in position / K. Fukushima // Biological Cybernetics. – 1980. – Volume 36, Issue 4. – P. 193–202.

Fukushima, K. Neocognitron: A hierarchical neural network capable of visual pattern recognition / K. Fukushima // Neural Networks. – 1988. – Volume 1, Issue 2. – P. 119–130.

Hagiwara, K. Upper bound of the expected training error of neural network regression for a Gaussian noise sequence / K. Hagiwara, T. Hayasaka, N. Toda, S. Usui, K. Kuno // Neural Networks. – 2001. – Volume 14, Issue 10. – P. 1419–1429.



DOI: https://doi.org/10.15588/1607-3274-2013-2-18



Copyright (c) 2014 V. V. Romanuke

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Address of the journal editorial office:
Editorial office of the journal «Radio Electronics, Computer Science, Control»,
Zaporizhzhya National Technical University, 
Zhukovskiy street, 64, Zaporizhzhya, 69063, Ukraine. 
Telephone: +38-061-769-82-96 (Editing and Publishing Department).
E-mail: rvv@zntu.edu.ua

Reference to the journal is obligatory in case of complete or partial use of its materials.