THE METHOD OF STRUCTURAL ADJUSTMENT OF NEURAL NETWORK MODELS TO ENSURE INTERPRETATION
DOI: https://doi.org/10.15588/1607-3274-2021-3-8

Keywords: interpretation, topology, structural adjustment, neuroevolution, neural networks.

Abstract
Context. This paper considers the problem of structurally modifying pre-synthesized models based on artificial neural networks to ensure interpretability when working with big data. The object of the study is the process of structural modification of artificial neural networks using adaptive mechanisms.
Objective. The objective of the work is to develop a method for the structural modification of neural networks that increases their speed and reduces resource consumption when processing big data.
Method. A method of structural adjustment of neural networks based on adaptive mechanisms borrowed from neuroevolutionary synthesis methods is proposed. First, the method evaluates the existing structure of an artificial neural network using a system of indicators based on the structural features of the neuromodel. The resulting indicator values are then compared with criterion thresholds to choose the type of structural change. The candidate changes are mutation operators drawn from the group of neuroevolutionary methods that modify both the topology and the weights of a neural network. The method reduces resource consumption during the operation of neuromodels by accelerating the processing of big data, which broadens the practical applicability of artificial neural networks.
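The indicator-then-mutate loop described above can be sketched roughly as follows. This is an illustrative approximation, not the authors' implementation: the indicator names (`density`, `mean_abs_weight`), the criterion thresholds, and the three TWEANN-style mutation operators are all assumptions chosen for the sketch.

```python
import random

# Hypothetical sketch of the structural-adjustment loop: evaluate
# structural indicators, compare them with criterion thresholds,
# then apply a neuroevolution-style mutation to the topology/weights.
# Connections are (source, target, weight) tuples.

def structure_indicators(connections, n_neurons):
    """Compute simple structural indicators of a network topology."""
    n_conn = len(connections)
    density = n_conn / max(1, n_neurons * (n_neurons - 1))
    mean_abs_weight = sum(abs(w) for (_, _, w) in connections) / max(1, n_conn)
    return {"density": density, "mean_abs_weight": mean_abs_weight}

def choose_adjustment(indicators, density_max=0.5, weight_min=0.05):
    """Compare indicator values with criterion thresholds (assumed here)
    to select the type of structural change."""
    if indicators["density"] > density_max:
        return "remove_connection"   # over-connected: prune a link
    if indicators["mean_abs_weight"] < weight_min:
        return "perturb_weights"     # weak links overall: re-weight
    return "add_connection"          # otherwise grow the topology

def apply_adjustment(connections, n_neurons, kind, rng):
    """Apply one mutation operator of the chosen kind."""
    conns = list(connections)
    if kind == "remove_connection" and conns:
        # prune the connection with the smallest absolute weight
        conns.remove(min(conns, key=lambda c: abs(c[2])))
    elif kind == "perturb_weights":
        conns = [(i, j, w + rng.gauss(0, 0.1)) for (i, j, w) in conns]
    elif kind == "add_connection":
        i, j = rng.randrange(n_neurons), rng.randrange(n_neurons)
        if i != j:
            conns.append((i, j, rng.uniform(-1.0, 1.0)))
    return conns

rng = random.Random(42)
conns = [(0, 1, 0.9), (1, 2, -0.4), (0, 2, 0.02)]
kind = choose_adjustment(structure_indicators(conns, n_neurons=3))
adjusted = apply_adjustment(conns, 3, kind, rng)
```

In a real setting the adjusted structure would be re-evaluated after each mutation, so the loop repeats until the indicators fall within their criterion ranges.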
Results. The developed method was implemented and investigated using a recurrent artificial network of the long short-term memory type applied to a classification problem. Applying the method accelerated the neuromodel on the test sample by 25.05%, depending on the computing resources used.
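As a side note on methodology, a relative speed-up such as the reported 25.05% is computed from inference times measured before and after structural adjustment. The sketch below is not the authors' benchmark; the timing helper and the example figures (4.0 s and 3.0 s) are hypothetical.

```python
import time

def percent_speedup(t_before, t_after):
    """Relative reduction in inference time, in percent."""
    return (t_before - t_after) / t_before * 100.0

def time_inference(model_fn, batches, repeats=3):
    """Median wall-clock time of model_fn over the test batches."""
    timings = []
    for _ in range(repeats):
        start = time.perf_counter()
        for batch in batches:
            model_fn(batch)
        timings.append(time.perf_counter() - start)
    return sorted(timings)[len(timings) // 2]

# With hypothetical measured times of 4.0 s before and 3.0 s after:
speedup = percent_speedup(4.0, 3.0)
print(round(speedup, 2))  # 25.0
```

Taking the median over repeated runs reduces the influence of transient load on the machine, which matters because the reported gain varies with the computing resources used.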
Conclusions. The conducted experiments confirmed the operability of the proposed mathematical software and allow us to recommend it for practical use in the structural adjustment of pre-synthesized neuromodels for subsequently solving problems of diagnosis, forecasting, evaluation, and pattern recognition on big data. Further research may focus on finer tuning of the indicator system to identify connections that encode noisy data, in order to further improve the accuracy of models based on neural networks.
License
Copyright (c) 2021 S. D. Leoshchenko, A. O. Oliinyk, S. A. Subbotin, Ye. O. Gofman, O. V. Korniienko
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.