MODIFICATION AND PARALLELIZATION OF GENETIC ALGORITHM FOR SYNTHESIS OF ARTIFICIAL NEURAL NETWORKS

Authors

  • S. D. Leoshchenko National University “Zaporizhzhia Polytechnic”, Zaporizhzhia, Ukraine
  • A. O. Oliinyk National University “Zaporizhzhia Polytechnic”, Zaporizhzhia, Ukraine
  • S. A. Subbotin National University “Zaporizhzhia Polytechnic”, Zaporizhzhia, Ukraine
  • V. A. Lytvyn National University “Zaporizhzhia Polytechnic”, Zaporizhzhia, Ukraine
  • V. V. Shkarupylo National University of Life and Environmental Sciences, Ukraine

DOI:

https://doi.org/10.15588/1607-3274-2019-4-7

Keywords:

Data sample, synthesis, artificial neural network, genetic algorithm, neuroevolution, mutation.

Abstract

Context. The problem of automating the synthesis of artificial neural networks (ANN) for further use in diagnosis, forecasting and pattern recognition is solved. The object of the study is the process of ANN synthesis using a modified genetic algorithm.
Objective. The goals of the work are to reduce the synthesis time and to improve the accuracy of the resulting neural network.
Method. A method for the synthesis of artificial neural networks based on a modified genetic algorithm, which can be implemented both sequentially and in parallel on MIMD and SIMD systems, is proposed. The use of a high mutation probability increases diversity within the population and prevents premature convergence of the method. Selecting a new best specimen, as opposed to completely restarting the algorithm, significantly saves system resources and ensures escape from areas of local extrema. The new criteria for adaptive selection of mutations, firstly, do not limit the number of hidden neurons and, secondly, prevent unbounded growth of the network. Uniform crossover significantly increases efficiency and allows other crossover operators to be emulated without difficulty; moreover, it increases the flexibility of the genetic algorithm. The parallel approach significantly reduces the number of iterations and speeds up the synthesis of artificial neural networks.
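The crossover and mutation operators described above can be sketched in a few lines. This is an illustrative sketch only, assuming a genome encoded as a flat list of connection weights; the function names, the 0.3 mutation rate and the Gaussian perturbation are assumptions, not the authors' implementation:

```python
import random

def uniform_crossover(parent_a, parent_b):
    """Uniform crossover: each gene is inherited from either parent
    with equal probability, independently of its position, which lets
    this single operator emulate many fixed-point crossover schemes."""
    return [random.choice(pair) for pair in zip(parent_a, parent_b)]

def mutate(genome, rate=0.3, scale=0.5):
    """A deliberately high mutation rate keeps diversity in the
    population and helps the search escape local extrema without
    a full restart of the algorithm."""
    return [g + random.gauss(0.0, scale) if random.random() < rate else g
            for g in genome]
```

With a high rate, offspring differ noticeably from their parents, which is what counteracts premature convergence.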
Results. Software which implements the proposed method of synthesis of artificial neural networks has been developed; it allows networks to be synthesized both sequentially and in parallel on the cores of a CPU or GPU.
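Since each individual's fitness is evaluated independently, population evaluation parallelizes naturally across workers. The sketch below is a hedged illustration of that pattern: the fitness function is hypothetical, and a thread pool stands in for the CPU-core or GPU back ends; for CPU-bound fitness a process pool (`multiprocessing.Pool`) would give true core-level parallelism:

```python
from concurrent.futures import ThreadPoolExecutor

def fitness(genome):
    # Hypothetical fitness: in the ANN-synthesis setting this would
    # decode the genome into a network and score it on a data sample.
    return -sum(g * g for g in genome)

def evaluate_population(population, workers=4):
    """Individuals are independent, so evaluation is embarrassingly
    parallel; results come back in population order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fitness, population))
```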
Conclusions. The experiments have confirmed the efficiency of the proposed method of synthesis of artificial neural networks and allow us to recommend it for practical use in the processing of data sets for further diagnosis, prediction or pattern recognition. Prospects for further research may include using the genetic information of several parents to form a new individual and modifying the synthesis method for recurrent network architectures for big data processing.

Author Biographies

S. D. Leoshchenko, National University “Zaporizhzhia Polytechnic”, Zaporizhzhia

Postgraduate student of the Department of Software Tools

A. O. Oliinyk, National University “Zaporizhzhia Polytechnic”, Zaporizhzhia

PhD, Associate Professor, Associate Professor of the Department of Software Tools

S. A. Subbotin, National University “Zaporizhzhia Polytechnic”, Zaporizhzhia

Dr. Sc., Professor, Head of the Department of Software Tools

V. A. Lytvyn, National University “Zaporizhzhia Polytechnic”, Zaporizhzhia

Postgraduate student of the Department of Software Tools

V. V. Shkarupylo, National University of Life and Environmental Sciences

PhD, Associate Professor, Associate Professor of Computer Systems and Networks

References

Mocanu D., Mocanu E., Stone P., Nguyen P., Gibescu M., Liotta A. Scalable training of artificial neural networks with adaptive sparse connectivity inspired by network science, Nature Communications, 2018, No. 9, pp. 1–17. DOI:10.1038/s41467-018-04316-3.

Goodfellow I., Bengio Y., Courville A. Deep Learning. Massachusetts, MIT Press, 2016, 800 p. (Adaptive Computation and Machine Learning series).

Cao W., Wang X., Ming Z., Gao J. A review on neural networks with random weights, Neurocomputing, 2018, Vol. 275, pp. 278–287. DOI: 10.1016/j.neucom.2017.08.040.

Kieffer B., Babaie M., Kalra S. et al. Convolutional neural networks for histopathology image classification: Training vs. using pre-trained networks, Conference on Image Processing Theory, Tools and Applications (IPTA 2017) : Seventh International conference, 28 November – 1 December 2017 : proceedings. Montreal, IEEE, 2017, pp. 1–6. DOI: 10.1109/IPTA.2017.8310149.

Oliinyk A. Production rules extraction based on negative selection, Radio Electronics, Computer Science, Control, 2016, No. 1, pp. 40–49. DOI: 10.15588/1607-3274-2016-1-5.

Oliinyk A. A., Skrupsky S. Yu., Shkarupylo V. V., Subbotin S. A. The model for estimation of computer system used resources while extracting production rules based on parallel computations, Radio Electronics, Computer Science, Control, 2017, No. 1, pp. 142–152. DOI: 10.15588/1607-3274-2017-1-16.

Kolpakova T., Oliinyk A., Lovkin V. Integrated method of extraction, formalization and aggregation of competitive agents expert evaluations in group, Radio Electronics, Computer Science, Control, 2017, No. 2, pp. 100–108. DOI:10.15588/1607-3274-2017-2-11.

Yadav J., Rani A., Singh V., Murari B. Prospects and limitations of non-invasive blood glucose monitoring using near-infrared spectroscopy, Biomedical Signal Processing and Control, 2015, Vol. 18, pp. 214–227. DOI: 10.1016/j.bspc.2015.01.005.

Lymariev I. O., Subbotin S. A., Oliinyk A. A., Drokin I. V. Methods of large-scale signals transformation for diagnosis in neural network models, Radio Electronics, Computer Science, Control, 2018, No. 4, pp. 113–121. DOI:10.15588/1607-3274-2018-4-11.

Smith-Miles K., Gupta J. Neural Networks in Business: Techniques and Applications for the Operations Researcher, Computers and Operations Research, 2000, Vol. 27, No. 11, pp. 1023–1044. DOI: 10.1016/S0305-0548(99)00141-0.

Van Tuc N. Approximation contexts in addressing graph data structures. Wollongong, University of Wollongong Thesis Collection, 2015, 230 p.

Handa A., Patraucean V. Backpropagation in convolutional LSTMs, University of Cambridge, 2015, pp. 1–5.

Boden M. A guide to recurrent neural networks and backpropagation, Halmstad University, 2001, pp. 1–10.

Guo J. BackPropagation Through Time, The Harbin Institute of Technology, 2013, pp. 1–6.

Yue B., Fu J., Liang J. Residual Recurrent Neural Networks for Learning Sequential Representations, Information, 2018, Vol. 9, Issue 56, pp. 1–14.

Eluyode O. S., Akomolafe D. T. Comparative study of biological and artificial neural networks, European Journal of Applied Engineering and Scientific Research, 2013, Vol. 2, No. 1, pp. 36–46.

Neural Networks: Is Your Brain Like A Computer? [Electronic resource]. Access mode: https://towardsdatascience.com/neural-networks-is-your-brain-like-a-computer-d76fb65824bf.

Jiang J., Trundle P., Ren J. Medical Imaging Analysis with Artificial Neural Networks, Computerized medical imaging and graphics: the official journal of the Computerized Medical Imaging Society, 2010, Vol. 34, No. 8, pp. 617–631.

Tahmasebi P., Hezarkhani A. A hybrid neural networks-fuzzy logic-genetic algorithm for grade estimation, Computers & Geosciences, 2012, Vol. 42, pp. 18–27.

Oliinyk A., Skrupsky S., Subbotin S., Blagodariov O., Gofman Ye. Parallel computing system resources planning for neuro-fuzzy models synthesis and big data processing, Radio Electronics, Computer Science, Control, 2016, No. 4, pp. 61–69. DOI: 10.15588/1607-3274-2016-4-8.

Oliinyk A. A., Subbotin S. A., Skrupsky S. Yu., Lovkin V.M., Zaiko T. A. Information Technology of Diagnosis Model Synthesis Based on Parallel Computing, Radio Electronics, Computer Science, Control, 2017, No. 3, pp. 139–151. DOI: 10.15588/1607-3274-2017-3-16.

Oliinyk A. A., Subbotin S., Lovkin V., Blagodariov O., Zaiko T. The System of Criteria for Feature Informativeness Estimation in Pattern Recognition, Radio Electronics, Computer Science, Control, 2017, No. 4, pp. 85–96. DOI: 10.15588/1607-3274-2017-4-10.

Oliinyk A., Subbotin S., Leoshchenko S., Ilyashenko M., Myronova N., Mastinovsky Y. Additional training of neuro-fuzzy diagnostic models, Radio Electronics, Computer Science, Control, 2018, No. 3, pp. 113–119. DOI: 10.15588/1607-3274-2018-3-12.

Ding S., Li H., Su C., Yu J., Jin F. Evolutionary artificial neural networks: A review, Artificial Intelligence Review, 2013, Vol. 39, No. 3, pp. 251–260.

Castillo P. A., Arenas M. G., Castillo-Valdivieso J. J., Merelo J. J., Prieto A., Romero G. Artificial Neural Networks Design using Evolutionary Algorithms, Advances in Soft Computing, 2003, pp. 43–52.

Davoian K. Advancing evolution of artificial neural networks through behavioral adaptation. Münster, Universität Münster, 2011, 131 p.

Nissen V. A Brief Introduction to Evolutionary Algorithms from the Perspective of Management Science, Innovative Research Methodologies in Management, 2018, pp. 165–210.

Soltoggio A., Stanley K. O., Risi S. Born to learn: The inspiration, progress, and future of evolved plastic artificial neural networks, Neural Networks, 2018, Vol. 108, pp. 48–67. DOI: 10.1016/j.neunet.2018.07.013.

Molfetas A. Multiple control levels in structured genetic algorithms. A thesis submitted for the degree of Doctor of Philosophy. Sydney, University of Western Sydney, 2006, 302 p.

AI 101: Intro to Evolutionary Algorithms [Electronic resource]. Access mode: https://www.evolv.ai/blog/ai-101-intro-to-evolutionary-algorithms/

Angeline P. J., Saunders G. M., Pollack J. B. An evolutionary algorithm that constructs recurrent neural networks, IEEE Transactions on Neural Networks, 1994, Vol. 5, Issue 1, pp. 54–65.

Baxter J. The evolution of learning algorithms for artificial neural networks, Complex Systems, 1992, pp. 313–326.

Belew R., McInerney J., Schraudolph N. Evolving networks: using the genetic algorithm with connectionist learning, CSE technical report CS90-174, 1991, pp. 511–547.

Kearney W. T. Using Genetic Algorithms to Evolve Artificial Neural Networks, Honors Theses, 2016, pp. 1–21.

Shabash B., Wiese K. EvoNN: a customizable evolutionary neural network with heterogeneous activation functions, The Genetic and Evolutionary Computation Conference Companion : A Recombination of the 27th International Conference on Genetic Algorithms (ICGA) and the 23rd Annual Genetic Programming Conference (GP), 15–19 July 2018 : proceedings. New York, ACM, 2018, pp. 1449–1456. DOI: 10.1145/3205651.3208282.

Rempis C., Pasemann F. An Interactively Constrained Neuro-Evolution Approach for Behavior Control of Complex Robots, Variants of Evolutionary Algorithms for Real-World Applications, 2012, pp. 305–341.

Baldominos A., Sáez Y., Isasi P. On the Automated, Evolutionary Design of Neural Networks-Past, Present, and Future, Neural Computing and Applications, 2019, pp. 1–45. DOI: 10.1007/s00521-019-04160-6.

Baldominos A., Sáez Y., Isasi P. Evolutionary Design of Convolutional Neural Networks for Human Activity Recognition in Sensor-Rich Environments, Sensors, 2018, Vol. 18, No. 4, pp. 1–24. DOI:10.3390/s18041288.

Schrum J., Miikkulainen R. Solving Multiple Isolated, Interleaved, and Blended Tasks through Modular Neuroevolution, Evolutionary Computation, 2016, Vol. 24, No. 3, pp. 1–29. DOI: 10.1162/EVCO_a_00181.

Gruau F. Genetic synthesis of Boolean neural networks with a cell rewriting developmental process, COGANN-92: International Workshop on Combinations of Genetic Algorithms and Neural Networks, 1992, pp. 55–74.

Gruau F., Whitley D. Adding Learning to the Cellular Development of Neural Networks: Evolution and the Baldwin Effect, Evolutionary Computation, 1993, Vol. 1, No. 3, pp. 213–233. DOI: 10.1162/evco.1993.1.3.213.

Bayer J., Wierstra D., Togelius J., Schmidhuber J. Evolving Memory Cell Structures for Sequence Learning, Artificial Neural Networks, ICANN 2009. ICANN 2009. Lecture Notes in Computer Science, 2009, Vol. 5769, pp. 755–764.

Moriarty D. E., Miikkulainen R. Hierarchical evolution of neural networks, Evolutionary Computation Proceedings, IEEE International Conference, 4–9 May 1998. Anchorage, IEEE, 1998, pp. 428–433. DOI: 10.1109/ICEC.1998.699793.

He H. Self-Adaptive Systems for Machine Intelligence. New York, Wiley-Interscience, 2011, 248 p.

Greer B., Hakonen H., Lahdelma R., Miikkulainen R. Numerical optimization with neuroevolution, Proceedings of the 2002 Congress on Evolutionary Computation. CEC’02 (Cat. No.02TH8600), 2002, pp. 396–401. DOI:10.1109/CEC.2002.1006267.

Braylan A., Hollenbeck M., Meyerson E. et al. Reuse of Neural Modules for General Video Game Playing, Conference on Artificial Intelligence : Thirtieth AAAI conference on artificial intelligence, Phoenix, Arizona, 12–17 February 2016 : proceedings. Arizona, AAAI 2016, pp. 353–359.

Enforced Subpopulations (ESP) neuroevolution algorithm for balancing inverted double pendulum [Electronic resource]. Access mode: http://blog.otoro.net/2015/03/10/esp-algorithm-for-double-pendulum.

Stanley K. O., Miikkulainen R. Evolving Neural Networks through Augmenting Topologies, The MIT Press Journals, 2002, Vol. 10, No. 2, pp. 99–127.

Manning T., Walsh P. P. Automatic Task Decomposition for the NeuroEvolution of Augmenting Topologies (NEAT) Algorithm, Evolutionary Computation, Machine Learning and Data Mining in Bioinformatics. EvoBIO 2012. Lecture Notes in Computer Science, 2012, Vol. 7246. pp. 1–12.

Whiteson S., Whiteson D. Stochastic optimization for collision selection in high energy physics, Conference on Artificial Intelligence : Twenty-second AAAI conference on artificial intelligence, Vancouver, British Columbia, 22–26 July 2007, proceedings. California, AAAI Press, 2007, Vol. 2, pp. 1819–1825.

Tsoy Y. R. Evolutionary Algorithms Design: State of the Art and Future Perspectives, IEEE East-West Design and Test Workshop (EWDTW’06), 2006, pp. 375–379.

Behjat A., Chidambaran S., Chowdhury S. Adaptive Genomic Evolution of Neural Network Topologies (AGENT) for State-to-Action Mapping in Autonomous Agents, Conference on Robotics and Automation (ICRA 2019), IEEE International Conference, 20–24 May 2019, proceedings. Montreal, IEEE, 2019, pp. 1–7. arXiv:1903.07107.

Pagliuca P., Milano N., Nolfi S. Maximizing adaptive power in neuroevolution, PLoS ONE, 2018, Vol. 13, No. 7, pp. 1–27. DOI: 10.1371/journal.pone.0198788.

Petroski Such F., Madhavan V., Conti E., Lehman J., Stanley K., Clune J. Deep Neuroevolution: Genetic Algorithms Are a Competitive Alternative for Training Deep Neural Networks for Reinforcement Learning, Technical report 1712.06567, 2017, pp. 1–16. arXiv:1712.06567.

Capcarrere M., Freitas A. A., Bentley P. J. et al. Advances in Artificial Life, Conference on Artificial Life, Eighth European Conference, 5–9 September 2005, proceedings. Canterbury, Springer, 2005, 949 p.

McQuesten P. H. Cultural Enhancement of Neuroevolution. Austin, The University of Texas, 2002, 123 p.

Tsoy Y., Spitsyn V. Using genetic algorithm with adaptive mutation mechanism for neural networks design and training, The 9th Russian-Korean International Symposium on Science and Technology, 2005. KORUS, 2005, pp. 709–714.

Marzullo A., Stamile C., Terracina G., Calimeri F., Huffel S. A tensor-based mutation operator for Neuroevolution of Augmenting Topologies (NEAT), IEEE Congress on Evolutionary Computation (CEC), 2017, pp. 681–687. DOI:10.1109/CEC.2017.7969376.

Papadimitriou F. Mathematical modelling of land use and landscape complexity with ultrametric topology, Journal of Land Use Science, 2011, Vol. 8, No. 2, pp. 1–21.

Papadimitriou F. Artificial Intelligence in modelling the complexity of Mediterranean landscape transformations, Computers and Electronics in Agriculture, 2012, Vol. 81, pp. 87–96.

Skillicorn D. Taxonomy for computer architectures, Computer, 1988, Vol. 21, No. 11, pp. 46–57.

Physical Unclonable Functions Data Set [Electronic resource]. Access mode: https://archive.ics.uci.edu/ml/datasets/Physical+Unclonable+Functions.

Aseeri A. O., Zhuang Y., Alkatheiri M. S. A Machine Learning-Based Security Vulnerability Study on XOR PUFs for Resource-Constraint Internet of Things, 2018 IEEE International Congress on Internet of Things (ICIOT), 2018, pp. 49–56. DOI: 10.1109/ICIOT.2018.00014.

Intel Turbo Boost Technology 2.0 [Electronic resource]. Access mode: https://www.intel.com/content/www/us/en/architecture-and-technology/turbo-boost/turbo-boost-technology.html.

Oliinyk A. A., Skrupsky S. Yu., Shkarupylo V. V., Blagodariov O. Parallel multiagent method of big data reduction for pattern recognition, Radio Electronics, Computer Science, Control, 2017, No. 2, pp. 82–92. DOI: 10.15588/1607-3274-2017-2-9.

Oliinyk A., Subbotin S., Lovkin V., Ilyashenko M., Blagodariov O. Parallel method of big data reduction based on stochastic programming approach, Radio Electronics, Computer Science, Control, 2018, No. 2, pp. 60–72. DOI: 10.15588/1607-3274-2018-2-7.

Omelianenko I. Performance evaluation of NEAT algorithm’s implementation in GO programming language, 2018, pp. 1–6. DOI: 10.13140/RG.2.2.28676.83844.

Kadish D. Clustering sensory inputs using NeuroEvolution of Augmenting Topologies, The Genetic and Evolutionary Computation Conference Companion : A Recombination of the 27th International Conference on Genetic Algorithms (ICGA) and the 23rd Annual Genetic Programming Conference (GP), 15–19 July 2018, proceedings. New York, ACM, 2018, pp. 1–2. DOI: 10.1145/3205651.3205771.

Published

2019-11-25

How to Cite

Leoshchenko, S. D., Oliinyk, A. O., Subbotin, S. A., Lytvyn, V. A., & Shkarupylo, V. V. (2019). MODIFICATION AND PARALLELIZATION OF GENETIC ALGORITHM FOR SYNTHESIS OF ARTIFICIAL NEURAL NETWORKS. Radio Electronics, Computer Science, Control, (4), 68–82. https://doi.org/10.15588/1607-3274-2019-4-7

Issue

Section

Neuroinformatics and intelligent systems