STOCHASTIC PSEUDOSPIN NEURAL NETWORK WITH TRIDIAGONAL SYNAPTIC CONNECTIONS
Keywords: neural network, synaptic connections, connection matrix, tridiagonalization.
Context. To reduce computation time in problems of diagnosing and recognizing distorted images with a fully connected stochastic pseudospin neural network, the synaptic connections between neurons must be thinned out. This problem is solved by tridiagonalizing the matrix of synaptic connections without losing the interaction between all neurons in the network.
Objective. To create an architecture of a stochastic pseudospin neural network with tridiagonal synaptic connections that reduces its learning time without losing the interaction between all the neurons in the layer.
Method. The paper uses the Householder method, a method of compressing input images based on the tridiagonalization of the matrix of synaptic connections, and the computer mathematics system MATLAB to convert a fully connected neural network into a tridiagonal form with hidden synaptic connections between all neurons.
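The Householder reduction named in the Method can be sketched as follows. This is a minimal NumPy illustration (not the paper's MATLAB implementation) of reducing a symmetric synaptic weight matrix to tridiagonal, i.e. symmetric Hessenberg, form by orthogonal similarity transforms; the variable names are hypothetical.

```python
import numpy as np

def householder_tridiagonalize(W):
    """Reduce a symmetric matrix W to tridiagonal form T = Q^T W Q
    by successive Householder reflections. Orthogonal similarity
    transforms preserve the eigenvalue spectrum, so no interaction
    encoded in W is lost, only re-expressed."""
    A = np.array(W, dtype=float)
    n = A.shape[0]
    for k in range(n - 2):
        # Zero the entries below the subdiagonal in column k.
        x = A[k + 1:, k]
        v = x.copy()
        v[0] += np.sign(x[0] or 1.0) * np.linalg.norm(x)
        norm_v = np.linalg.norm(v)
        if norm_v < 1e-12:
            continue  # column already in the desired form
        v /= norm_v
        H = np.eye(n)
        H[k + 1:, k + 1:] -= 2.0 * np.outer(v, v)  # Householder reflector
        A = H @ A @ H  # similarity transform keeps A symmetric
    return A

# Example: a small symmetric "synaptic weight" matrix.
W = np.array([[4., 1., 2., 2.],
              [1., 2., 0., 1.],
              [2., 0., 3., 2.],
              [2., 1., 2., 1.]])
T = householder_tridiagonalize(W)
```

After the reduction, every entry of `T` outside the three central diagonals is numerically zero, while the spectrum of `W` is preserved.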
Results. We developed a model of a stochastic neural network architecture with sparse renormalized synaptic connections that takes the deleted synaptic connections into account. Based on the transformation of the synaptic connection matrix of a fully connected neural network into a Hessenberg matrix with tridiagonal synaptic connections, we proposed a renormalized local Hebb rule. Using the computer mathematics system Wolfram Mathematica 11.3, we calculated, as a function of the number of neurons N, the tuning time of synaptic connections (per iteration) in a stochastic pseudospin neural network with a tridiagonal connection matrix, relative to the tuning time of synaptic connections (per iteration) in a fully connected neural network.
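One simple way to see why such a ratio shrinks with N: a tridiagonal connection matrix has only 3N − 2 nonzero weights, against N² in the fully connected case. The sketch below uses this entry count as an illustrative proxy for per-iteration tuning cost; it is an assumption for intuition, not the paper's exact timing model.

```python
def relative_tuning_time(N):
    """Ratio of synaptic weights updated per iteration:
    tridiagonal matrix (3N - 2 nonzero entries) versus a
    fully connected matrix (N^2 entries). Illustrative
    proxy only, not the paper's measured timing."""
    return (3 * N - 2) / N**2

# The ratio behaves like 3/N for large N, i.e. hyperbolically.
for N in (10, 100, 1000):
    print(N, relative_tuning_time(N))
```

For large N the ratio is approximately 3/N, which is consistent with the hyperbolic decrease reported in the Conclusions.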
Conclusions. We found that, as the number of neurons grows, the tuning time of synaptic connections (per iteration) in a stochastic pseudospin neural network with a tridiagonal connection matrix, relative to the tuning time of synaptic connections (per iteration) in a fully connected neural network, decreases according to a hyperbolic law. Depending on the orientation of the pseudospin neurons, we proposed a classification of the renormalized neural network into a ferromagnetic structure, an antiferromagnetic structure, and a dipole glass.
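The three-way classification above can be sketched with a toy criterion: the mean nearest-neighbour correlation of a chain of pseudospins s_i ∈ {−1, +1}. The thresholding rule here is a hypothetical illustration chosen for this sketch; the paper's own classification criterion may differ.

```python
import numpy as np

def classify_spin_structure(s, tol=0.9):
    """Classify a chain of pseudospins s_i in {-1, +1} by the mean
    nearest-neighbour correlation q = <s_i * s_{i+1}>.
    q near +1 -> aligned spins (ferromagnetic),
    q near -1 -> alternating spins (antiferromagnetic),
    otherwise -> disordered (dipole glass).
    Hypothetical criterion for illustration only."""
    s = np.asarray(s, dtype=float)
    q = np.mean(s[:-1] * s[1:])
    if q >= tol:
        return "ferromagnetic"
    if q <= -tol:
        return "antiferromagnetic"
    return "dipole glass"

print(classify_spin_structure([1, 1, 1, 1, 1, 1]))      # aligned chain
print(classify_spin_structure([1, -1, 1, -1, 1, -1]))   # alternating chain
print(classify_spin_structure([1, 1, -1, 1, -1, -1, 1]))  # disordered chain
```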
Copyright (c) 2021 R. M. Peleshchak, V. V. Lytvyn, O. I. Cherniak, I. R. Peleshchak, M. V. Doroshenko
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.