THE INSTANCE INDIVIDUAL INFORMATIVITY EVALUATION FOR THE SAMPLING IN NEURAL NETWORK MODEL SYNTHESIS

S. A. Subbotin

Abstract


The problem of developing mathematical support to automate sampling in the construction of diagnostic and recognition models from precedents is solved. The object of study is the process of building diagnostic and recognition neural network models from precedents. The subject of study is sampling methods for building neural network models from precedents. The purpose of the work is to increase the speed and quality of forming selected training samples for building neural network models from precedents. A training sample selection method is proposed which, for a given initial sample of precedents and a given partition of the feature space, determines weights characterizing the usefulness of terms and features. Based on these weight values it characterizes the individual absolute and relative informativity of instances relative to the centers and boundaries of feature intervals. This makes it possible to automate sample analysis and its division into subsamples and, as a consequence, to reduce the dimensionality of the training data, which in turn reduces training time while providing acceptable accuracy of the neural model. Software implementing the proposed indicators has been developed, and experiments studying their properties have been conducted. The experimental results allow the proposed indicators to be recommended for practical use and make it possible to determine effective conditions for their application.
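Since the abstract only outlines the flow of the approach (feature-space partition, term and feature weights, individual instance informativity relative to interval centers and boundaries, subsample selection), the Python sketch below illustrates one possible reading of that flow. It is not the paper's method: the equal-width interval partition, the specific weighting of distances to centers and boundaries, and the per-class selection fraction are assumptions introduced solely for illustration.

import numpy as np

def interval_partition(X, n_intervals=5):
    # Equal-width partition of every feature's range; the number of
    # intervals is a hypothetical choice, not taken from the paper.
    mins, maxs = X.min(axis=0), X.max(axis=0)
    return [np.linspace(mins[j], maxs[j], n_intervals + 1) for j in range(X.shape[1])]

def instance_informativity(X, boundaries):
    # Score each instance by the closeness of its feature values to the
    # interval centers and boundaries (an assumed surrogate for the
    # paper's absolute/relative informativity indicators).
    n, m = X.shape
    scores = np.zeros(n)
    for j in range(m):
        b = boundaries[j]
        centers = (b[:-1] + b[1:]) / 2.0
        width = b[1] - b[0] + 1e-12
        d_center = np.min(np.abs(X[:, [j]] - centers), axis=1)
        d_boundary = np.min(np.abs(X[:, [j]] - b), axis=1)
        scores += (1.0 - d_boundary / width) + 0.5 * (1.0 - d_center / width)
    return scores / m

def select_subsample(X, y, fraction=0.3):
    # Keep the most informative fraction of instances within each class,
    # so every class stays represented in the reduced training set.
    boundaries = interval_partition(X)
    scores = instance_informativity(X, boundaries)
    selected = []
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        k = max(1, int(fraction * len(idx)))
        selected.extend(idx[np.argsort(scores[idx])[::-1][:k]])
    return np.sort(np.array(selected))

# Usage: sel = select_subsample(X, y); the reduced set X[sel], y[sel]
# is then used to train the neural network model.

The per-class selection step is a design choice made here only to keep class proportions roughly intact in the reduced sample; the paper's actual indicators and selection rule should be consulted for the formal definitions.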


Keywords


sample, instance selection, data reduction, neural network, data dimensionality reduction.

DOI: https://doi.org/10.15588/1607-3274-2014-2-10



Copyright (c) 2015 S. A. Subbotin

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
