IMPLEMENTATION OF THE INDICATOR SYSTEM IN MODELING OF COMPLEX TECHNICAL SYSTEMS
Keywords: complexity assessment, indicator system, modeling, neuromodel, sampling, training, error, gradient.
Context. The problem of determining the optimal topology of a neuromodel, which is characterized by a high level of logical transparency in modeling complex technical systems, is considered. The object of research is the process of applying an indicator system to simplify and select the topology of neuromodels.
Objective. The objective of the work is to develop and apply a system of indicators that determines the level of complexity of the modeling problem and gradually selects the optimal, logically transparent topology of the neuromodel.
Method. A method is proposed for selecting an optimal, logically transparent neural network topology for modeling complex technical systems using a system of corresponding indicators. First, the method determines the overall level of complexity of the modeling task and, based on the obtained estimate, selects the approach for further optimization of the neuromodel. Then, using the task data and the characteristics of the input data, the method produces the optimal structure of the neural model for further modeling of the system. The method reduces training time and increases the level of logical transparency of neuromodels, which significantly expands their practical use, without resorting to neuroevolution methods, which may not be justified for resource-intensive tasks.
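The two stages described above (complexity assessment from indicators, then topology selection) could be sketched as follows. This is a minimal illustrative sketch: the indicator definitions, threshold values, and function names are assumptions for demonstration, not the authors' published formulas.

```python
import numpy as np

def complexity_indicators(X, y):
    """Compute simple task-complexity indicators from a training sample.
    Indicator choices here are illustrative assumptions."""
    n_samples, n_features = X.shape
    # Indicator 1: dimensionality ratio (features per training sample)
    dim_ratio = n_features / n_samples
    # Indicator 2: mean absolute feature-target correlation
    corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                     for j in range(n_features)])
    mean_corr = float(np.nanmean(corr))
    # Indicator 3: coefficient of variation of the target
    target_cv = float(np.std(y) / (abs(np.mean(y)) + 1e-12))
    return {"dim_ratio": dim_ratio, "mean_corr": mean_corr,
            "target_cv": target_cv}

def select_topology(indicators, n_features):
    """Map indicator values to a compact (logically transparent) topology,
    given as a tuple of hidden-layer sizes. Thresholds are illustrative."""
    # Low-complexity task: strong linear structure and ample data
    # justify a single small hidden layer; otherwise grow moderately.
    if indicators["mean_corr"] > 0.5 and indicators["dim_ratio"] < 0.1:
        return (max(2, n_features // 2),)
    return (n_features, max(2, n_features // 2))
```

A usage sketch: `select_topology(complexity_indicators(X, y), X.shape[1])` returns the hidden-layer sizes to instantiate before training, keeping the node count low when the indicators suggest a simple task.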
Results. The developed method was implemented and investigated in solving the problem of modeling the dynamics of pitting processes in steel alloys. Applying the developed method reduced the training time of the model by 22%, depending on the computing resources used. The method also increased the level of logical transparency of the model by reducing the number of computing nodes by 50%, which likewise indicates faster and more efficient use of resources.
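The reported gains are relative reductions against a baseline model. As a small arithmetic check, the sketch below reproduces the stated percentages; the absolute baseline figures (64 nodes, 500 s) are invented example numbers chosen to be consistent with the reported 50% and 22% reductions, not values from the paper.

```python
def relative_reduction(before, after):
    """Percentage reduction of a metric relative to its baseline value."""
    return 100.0 * (before - after) / before

# Example: a baseline with 64 computing nodes trained in 500 s,
# versus the optimized topology with 32 nodes trained in 390 s.
nodes_cut = relative_reduction(64, 32)    # 50.0 -> 50% fewer nodes
time_cut = relative_reduction(500, 390)   # 22.0 -> 22% shorter training
```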
Conclusions. The conducted experiments confirmed the operability of the proposed mathematical support and allow us to recommend it for practical use in designing neuromodel topologies for subsequent modeling, diagnosis, and evaluation problems. Prospects for further research include developing methods for the structural optimization of previously synthesized models and new methods for feature selection.
Copyright (c) 2021 С. Д. Леощенко , С. О. Субботін , А. О. Олійник , О. Е. Нарівський
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.