IMPLEMENTATION OF THE INDICATOR SYSTEM IN MODELING OF COMPLEX TECHNICAL SYSTEMS

Authors

  • S. D. Leoshchenko National University “Zaporizhzhia Polytechnic”, Zaporizhzhia, Ukraine.
  • S. A. Subbotin National University “Zaporizhzhia Polytechnic”, Zaporizhzhia, Ukraine.
  • A. O. Oliinyk National University “Zaporizhzhia Polytechnic”, Zaporizhzhia, Ukraine.
  • O. E. Narivs’kiy Ukrspecmash ltd., Berdyansk, Ukraine.

DOI:

https://doi.org/10.15588/1607-3274-2021-1-12

Keywords:

complexity assessment, indicator system, modeling, neuromodel, sampling, training, error, gradient.

Abstract

Context. The problem of determining the optimal topology of a neuromodel, which is characterized by a high level of logical transparency in modeling complex technical systems, is considered. The object of research is the process of applying an indicator system to simplify and select the topology of neuromodels.

The objective of the work is to develop and apply a system of indicators for determining the level of complexity of the modeling problem and gradually selecting the optimal logically transparent topology of the neuromodel.

Method. A method is proposed for selecting an optimal, logically transparent neural network topology for modeling complex technical systems using a system of corresponding indicators. First, the method determines the overall level of complexity of the modeling task and, based on the obtained estimate, selects the approach for further optimization of the neuromodel. Then, using the task data and the characteristics of the input data, the method yields an optimal structure of the neural model for further modeling of the system. The method reduces training time and increases the level of logical transparency of neuromodels, which significantly expands the practical use of such models, without resorting to neuroevolution methods, whose resource intensity may not be justified for such tasks.
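The staged selection described above can be sketched as follows. This is an illustrative sketch only: the specific indicators (feature count, sample size, mean feature-target correlation), the threshold, and the layer-sizing rule are assumptions made for the example, not the indicator system proposed in the paper.

```python
import numpy as np

def complexity_indicators(X, y):
    """Compute simple complexity indicators of a modeling task
    (hypothetical choices: dimensionality, sample size, and the mean
    absolute correlation of each feature with the target)."""
    n_samples, n_features = X.shape
    corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                     for j in range(n_features)])
    return {"n_features": n_features,
            "n_samples": n_samples,
            "mean_corr": float(np.nanmean(corr))}

def select_topology(ind):
    """Map the complexity estimate to a compact (logically transparent)
    feed-forward topology. The 0.8 threshold and the sizing rule are
    illustrative assumptions."""
    if ind["mean_corr"] > 0.8:              # near-linear task: minimal hidden layer
        hidden = max(1, ind["n_features"] // 2)
    else:                                   # harder task: modestly larger hidden layer
        hidden = ind["n_features"]
    return (ind["n_features"], hidden, 1)   # (input, hidden, output) sizes

# Synthetic regression task for demonstration
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = X @ rng.normal(size=6) + 0.1 * rng.normal(size=200)

ind = complexity_indicators(X, y)
topology = select_topology(ind)
print(topology)
```

The point of the staging is that the complexity estimate is computed once, before any training, so the topology can be fixed cheaply instead of being searched for by resource-intensive neuroevolution.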

Results. The developed method is implemented and investigated in solving the problem of modeling the dynamics of pitting processes in steel alloys. The method reduced the training time of the model by 22%, depending on the computing resources used. It also increased the level of logical transparency of the model by reducing the number of computing nodes by 50%, which likewise indicates faster and more efficient use of resources.

Conclusions. The conducted experiments confirmed the operability of the proposed mathematical support and allow us to recommend it for use in practice in the design of topologies of neuromodels for further solving modeling, diagnosis and evaluation problems. Prospects for further research may consist in the development of methods for structural optimization of previously synthesized models and the development of new methods for feature selection.

Author Biographies

S. D. Leoshchenko, National University “Zaporizhzhia Polytechnic”, Zaporizhzhia, Ukraine.

Post-graduate student of the Department of Software Tools.

S. A. Subbotin, National University “Zaporizhzhia Polytechnic”, Zaporizhzhia, Ukraine.

Dr. Sc., Professor, Head of the Department of Software Tools.

A. O. Oliinyk, National University “Zaporizhzhia Polytechnic”, Zaporizhzhia, Ukraine.

PhD, Associate Professor, Associate Professor of the Department of Software Tools.

O. E. Narivs’kiy, Ukrspecmash ltd., Berdyansk, Ukraine.

Dr. Sc., Professor, Technical Director.

References

Goldberg Y. Neural Network Methods in Natural Language Processing (Synthesis Lectures on Human Language Technologies). California, Morgan & Claypool Publishers, 2017, 310 p.

Aggarwal C.C. Neural Networks and Deep Learning: A Textbook . Berlin, Springer, 2018, 520 p.

Valigi N. Zero to AI, A non-technical, hype-free guide to prospering in the AI era. New York, Manning Publications, 2020, 264 p.

Artasanchez A. Artificial Intelligence with Python, Your complete guide to building intelligent apps using Python 3.x. Birmingham, Packt Publishing, 2020, 618 p.

Oliinyk A., Subbotin S., Leoshchenko S., Ilyashenko M., Myronova N., Mastinovsky Y. Additional training of neurofuzzy diagnostic models, Radio Electronics, Computer Science, Control, 2018, No. 3, pp. 113–119. DOI: 10.15588/1607-3274-2018-3-12.

Leoshchenko S., Oliinyk A., Subbotin S., Zaiko T. Using Modern Architectures of Recurrent Neural Networks for Technical Diagnosis of Complex Systems, 2018 International Scientific-Practical Conference Problems of Infocommunications. Science and Technology (PIC S&T), Kharkiv, 9–12 October 2018, proceedings. Kharkiv, IEEE, 2018, pp. 411–416. DOI: 10.1109/INFOCOMMST.2018.8632015

Iba H. Evolutionary Approach to Machine Learning and Deep Neural Networks, Neuro-Evolution and Gene Regulatory Networks. New York, Springer, 2018, 258 p.

Omelianenko I. Hands-On Neuroevolution with Python, Build high-performing artificial neural network architectures using neuroevolution-based algorithms. Birmingham, Packt Publishing, 2019, 368 p.

Blokdyk G. Neuroevolution of augmenting topologies, Second Edition. Ohio, 5STARCooks, 2018, 128 p.

How is Contrast Encoded in Deep Neural Networks? [Electronic resource]. Access mode: https://arxiv.org/abs/1809.01438

Senge P. M. The Fifth Discipline, The Art & Practice of The Learning Organization. New York, Doubleday, 2006, 445 p.

Weaver W. Science and complexity, American Scientist, 1948, Vol. 36, No. 4, pp. 536–544.

Flood R.L. Dealing with Complexity. Berlin, Springer, 1993, 296 p.

Spiegelhalter D. The Art of Statistics, How to Learn from Data. New York, Basic Books, 2019, 448 p.

Bruce P. Practical Statistics for Data Scientists, 50 Essential Concepts, 1st Edition. California, O’Reilly Media, 2017, 318 p.

Finch W. H. Exploratory Factor Analysis, 1st Edition. California, SAGE Publications, 2019, 144 p.

Rencher A. C. Methods of Multivariate Analysis, 3rd Edition. New Jersey, John Wiley & Sons, 2012, 800 p.

Dean A. Design and Analysis of Experiments (Springer Texts in Statistics), 2nd Edition. Berlin, Springer, 2017, 865 p.

Sewak M. Deep Reinforcement Learning, Frontiers of Artificial Intelligence. Berlin, Springer, 2020, 220 p.

Calix R. A. Deep Learning Algorithms, Transformers, GANs, encoders, CNNs, RNNs, and more. Traverse City, Independently published, 2020, 428 p.

Narivs’kyi O. E. Corrosion fracture of platelike heat exchangers, Materials Science, 2005, Vol. 41, No. 1, pp. 122–128.

Narivs’kyi O. E. Micromechanism of corrosion fracture of the plates of heat exchangers, Materials Science, 2007, Vol. 43, No. 1, pp. 124–132.

Kuhn M. Feature Engineering and Selection, A Practical Approach for Predictive Models (Chapman & Hall/CRC Data Science Series). London, Chapman and Hall (CRC Press), 2019, 310 p.

Oliinyk A., Subbotin S., Lovkin V., Ilyashenko M., Blagodariov O. Parallel method of big data reduction based on stochastic programming approach, Radio Electronics, Computer Science, Control, 2018, No. 2, pp. 60–72.

Oliinyk A., Leoshchenko S., Lovkin V., Subbotin S., Zaiko T. Parallel data reduction method for complex technical objects and processes, Dependable Systems, Services and Technologies (DESSERT’2018), The 9th IEEE International Conference, Kyiv, 24–27 May 2018: proceedings. Los Alamitos, IEEE, 2018, pp. 528–531.

Published

2021-03-27

How to Cite

Leoshchenko, S. D., Subbotin, S. A., Oliinyk, A. O., & Narivs’kiy, O. E. (2021). IMPLEMENTATION OF THE INDICATOR SYSTEM IN MODELING OF COMPLEX TECHNICAL SYSTEMS. Radio Electronics, Computer Science, Control, 1(1), 117–126. https://doi.org/10.15588/1607-3274-2021-1-12

Issue

Section

Neuroinformatics and intelligent systems
