STOCHASTIC PSEUDOSPIN NEURAL NETWORK WITH TRIDIAGONAL SYNAPTIC CONNECTIONS

Authors

  • R. M. Peleshchak, Lviv Polytechnic National University, Lviv, Ukraine.
  • V. V. Lytvyn, Lviv Polytechnic National University, Ukraine.
  • O. I. Cherniak, Drohobych Ivan Franko State Pedagogical University, Drohobych, Ukraine.
  • I. R. Peleshchak, Lviv Polytechnic National University, Lviv, Ukraine.
  • M. V. Doroshenko, Drohobych Ivan Franko State Pedagogical University, Drohobych, Ukraine.

DOI:

https://doi.org/10.15588/1607-3274-2021-2-12

Keywords:

neural network, synaptic connections, connection matrix, tridiagonalization.

Abstract

Context. Reducing the computational time of diagnosing and recognizing distorted images with a fully connected stochastic pseudospin neural network requires thinning out the synaptic connections between neurons. This is achieved by tridiagonalizing the matrix of synaptic connections without losing the interaction between all neurons in the network.

Objective. To create an architecture of a stochastic pseudospin neural network with tridiagonal synaptic connections, without losing the interaction between all the neurons in the layer, in order to reduce its learning time.

Method. The paper uses the Householder method, a method of compressing input images based on tridiagonalization of the matrix of synaptic connections, and the computer mathematics system MATLAB to convert a fully connected neural network into a tridiagonal form with hidden synaptic connections between all neurons.
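The Householder reduction named in the Method can be sketched in a few lines. This is an illustrative example only (the paper's own implementation is in MATLAB and is not reproduced here): for a symmetric matrix of synaptic weights, Householder reduction to Hessenberg form produces a tridiagonal matrix T = QᵀWQ that represents the same interactions in a rotated basis.

```python
import numpy as np
from scipy.linalg import hessenberg

# Illustrative sketch (not the authors' MATLAB code): for a symmetric
# weight matrix W, Householder reduction to Hessenberg form yields a
# tridiagonal matrix T with W = Q T Q^T, preserving the spectrum.
rng = np.random.default_rng(0)
N = 6
A = rng.standard_normal((N, N))
W = (A + A.T) / 2                      # symmetric "synaptic" matrix

T, Q = hessenberg(W, calc_q=True)      # Householder reduction
T[np.abs(T) < 1e-12] = 0.0             # suppress numerical noise

assert np.allclose(Q @ T @ Q.T, W)     # same interactions, new basis
# All entries outside the three central diagonals vanish:
assert np.allclose(T, np.tril(np.triu(T, -1), 1))
```

Because the transformation is orthogonal, no interaction between neurons is lost; the deleted off-band connections are absorbed into the renormalized tridiagonal ones.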

Results. We developed a model of a stochastic neural network architecture with sparse renormalized synaptic connections that take into account the deleted synaptic connections. Based on the transformation of the synaptic connection matrix of a fully connected neural network into a Hessenberg matrix with tridiagonal synaptic connections, we proposed a renormalized local Hebb rule. Using the computer mathematics system Wolfram Mathematica 11.3, we calculated, as a function of the number of neurons N, the relative tuning time of synaptic connections (per iteration) in a stochastic pseudospin neural network with a tridiagonal connection matrix, relative to the tuning time of synaptic connections (per iteration) in a fully connected neural network.
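A local Hebb rule restricted to a tridiagonal band can be sketched as follows. The renormalization coefficients derived in the paper are not reproduced here; the `renorm` factor below is a hypothetical placeholder, and the rest is the classical Hebb rule masked to the band |i−j| ≤ 1.

```python
import numpy as np

# Sketch of a local Hebb update restricted to a tridiagonal band.
# The paper's renormalization coefficients are NOT reproduced here;
# `renorm` is a hypothetical placeholder factor.
def hebb_tridiagonal(patterns: np.ndarray, renorm: float = 1.0) -> np.ndarray:
    """patterns: (P, N) array of +/-1 pseudospin states."""
    P, N = patterns.shape
    W = (patterns.T @ patterns) / N         # classical Hebb rule
    np.fill_diagonal(W, 0.0)                # no self-connections
    band = np.abs(np.subtract.outer(np.arange(N), np.arange(N))) <= 1
    return renorm * np.where(band, W, 0.0)  # keep only |i - j| <= 1

xi = np.sign(np.random.default_rng(1).standard_normal((3, 8)))
W = hebb_tridiagonal(xi)
assert np.allclose(W, W.T)                  # symmetric and tridiagonal
```

Only the weights on the main diagonal's two neighbors are tuned, which is what reduces the per-iteration cost relative to a fully connected layer.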

Conclusions. We found that, as the number of neurons increases, the tuning time of synaptic connections (per iteration) in a stochastic pseudospin neural network with a tridiagonal connection matrix, relative to the tuning time of synaptic connections (per iteration) in a fully connected neural network, decreases according to a hyperbolic law. Depending on the direction of the pseudospin neurons, we proposed a classification of renormalized neural networks into those with a ferromagnetic structure, an antiferromagnetic structure, and a dipole-glass structure.
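The hyperbolic trend can be made plausible by a simple weight count (a back-of-envelope estimate, not the paper's exact timing formula): a tridiagonal matrix stores 3N − 2 connections versus N² in a fully connected layer, so the per-iteration tuning-time ratio falls roughly as 3/N.

```python
# Back-of-envelope count (not the paper's exact timing formula):
# a tridiagonal connection matrix has 3N - 2 nonzero weights, a fully
# connected one has N**2, so the per-iteration tuning-time ratio
# decreases roughly hyperbolically, ~3/N.
def tuning_time_ratio(N: int) -> float:
    tridiagonal = 3 * N - 2        # main diagonal + two adjacent ones
    fully_connected = N * N
    return tridiagonal / fully_connected

for N in (10, 100, 1000):
    print(N, tuning_time_ratio(N))
```

For N = 100 the ratio is already below 3%, consistent with a hyperbolic decrease in relative tuning time.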

Author Biographies

R. M. Peleshchak, Lviv Polytechnic National University, Lviv, Ukraine.

Doctor of Physical and Mathematical Sciences, Professor of the Department of Information Systems and Networks.

V. V. Lytvyn, Lviv Polytechnic National University, Ukraine.

Dr. Sc., Professor, Head of the Department of Information Systems and Networks.

O. I. Cherniak, Drohobych Ivan Franko State Pedagogical University, Drohobych, Ukraine.

Postgraduate student of the Department of Mathematics.

I. R. Peleshchak, Lviv Polytechnic National University, Lviv, Ukraine.

Postgraduate student of the Department of Information Systems and Networks.

M. V. Doroshenko, Drohobych Ivan Franko State Pedagogical University, Drohobych, Ukraine.

PhD, Associate Professor of the Department of Informatics and Information Systems.

References

Haykin S. Neural Networks and Learning Machines. Hamilton, Pearson, 2008, 936 p.

Datta B. N. Numerical Linear Algebra and Applications, Illinois: Second Ed. Society for Industrial and Applied Mathematics (SIAM), 2010, 554 p. DOI: 10.1137/1.9780898717655

Burden R. L., Faires J. D. Numerical Analysis. Boston, Cengage Learning, 2010, 888 p.

Korniienko O. V., Subbotin S. O. Zghortkova neiromerezha dlia vyiavlennia shakhraiskykh operatsii z kredytnymy kartkamy [Convolutional neural network for detecting fraudulent credit card transactions], Automation of technological and business processes, 2019, Vol. 11, Issue 3, pp. 65–74. DOI: 10.15673/atbp.v11i3.1503

Ismoilov N., Jang S. A Comparison of Regularization Techniques in Deep Neural Networks, Symmetry, 2018, Vol. 10, Issue 11, P. 648. DOI: 10.3390/sym10110648

Belharbi S., Herault R., Chatelain C. et al. Deep Neural Networks Regularization for Structured Output Prediction, Neurocomputing, 2018, Vol. 281, pp. 169–177. DOI: 10.1016/j.neucom.2017.12.002

Slyadnikov E. E. Fizicheskaya model i assotsiativnaya pamyat dipolnoy sistemyi mikrotrubochki tsitoskeleta [Physical model and associative memory of the dipole system of a cytoskeleton microtubule], Zhurnal tehnicheskoy fiziki, 2007, Vol. 77, Issue 7, pp. 77–86.

Penrose R. Shadows of the Mind: A Search for the Missing Science of Consciousness. Oxford, Oxford University Press, 1996, 480 p.

Weigend A. S., Rumelhart D. E., Huberman B. A. Generalization by weight-elimination with application to forecasting, Advances in neural information processing systems, 1991, Vol. 3, pp. 875–882. DOI: 10.1109/ijcnn.1991.155287

Yang J., Ma J., Berryman M. et al. A structure optimization algorithm of neural networks for large-scale data sets, Fuzzy Systems: First International Conference, China, 6–11 July 2014: proceedings. Beijing, IEEE, 2014, pp. 956–961. DOI: 10.1109/fuzz-ieee.2014.6891662

Luo R., Tian F., Qin T. et al. Neural Architecture Optimization, Neural Information Processing Systems: Thirty-second Conference, Canada, 3–8 December 2018: proceedings. Montreal, NIPS, 2018, pp. 1–12.

Lee S., Kim J., Kang H. et al. Genetic Algorithm Based Deep Learning Neural Network Structure and Hyperparameter Optimization, Appl. Sci, 2021, Vol. 11, Issue 2, P. 744. DOI: 10.3390/app11020744

Wu W. Neural network structure optimization based on improved genetic algorithm, International Conference on Advanced Computational Intelligence: Fifth International Conference, China, 18–20 October 2012: proceedings. Nanjing, ICACI, 2012, pp. 893–895. DOI: 10.1109/ICACI.2012.6463299

Lytvyn V., Peleshchak I., Peleshchak R. The compression of the input images in neural network that using method diagonalization the matrices of synaptic weight connections, Conference on Advanced Information and Communication Technologies: Second International Conference, Ukraine, 4–7 July 2017: proceedings. Lviv, AICT, 2017, pp. 66–70. DOI: 10.1109/AIACT.2017.8020067

Attaway S. Matlab: A Practical Introduction to Programming and Problem Solving. Oxford, Butterworth-Heinemann, 2019, 626 p. DOI: 10.1016/b978-0-12-815479-3.00003-9

Ejov A., Shumskiy A. Neyrokompyuting i ego primeneniya v ekonomike i biznese [Neurocomputing and its applications in economics and business]. Moscow, MIFI, 1998, 224 p.

Hameroff S. R. Quantum coherence in microtubules: A neural basis for emergent consciousness, Journal of consciousness studies, 1994, Vol. 1, Issue 1, pp. 91–118.

Published

2021-07-07

How to Cite

Peleshchak, R. M., Lytvyn, V. V., Cherniak, O. I., Peleshchak, I. R., & Doroshenko, M. V. (2021). STOCHASTIC PSEUDOSPIN NEURAL NETWORK WITH TRIDIAGONAL SYNAPTIC CONNECTIONS. Radio Electronics, Computer Science, Control, (2), 114–122. https://doi.org/10.15588/1607-3274-2021-2-12

Issue

Section

Neuroinformatics and intelligent systems