NEURAL ORDINARY DIFFERENTIAL EQUATIONS FOR TIME SERIES RECONSTRUCTION

Authors

  • D. V. Androsov, National Technical University of Ukraine “Igor Sikorsky Kyiv Polytechnic Institute”, Kyiv, Ukraine

DOI:

https://doi.org/10.15588/1607-3274-2023-4-7

Keywords:

neural ordinary differential equations, deep neural networks, variational autoencoders, recurrent neural networks, long short-term memory networks

Abstract

Context. Neural Ordinary Differential Equations are a family of deep neural networks that leverage numerical methods to solve the problem of time series reconstruction given a small number of unevenly distributed samples.
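
The core idea described above can be sketched as follows. This is a minimal, hypothetical illustration (not the paper's implementation): the derivative dz/dt of the hidden state is parameterized by a small two-layer network with random, untrained weights, and the state is advanced with a fixed-step fourth-order Runge-Kutta solver over an uneven time grid.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, HIDDEN = 2, 16  # illustrative dimensions, chosen arbitrarily
W1, b1 = rng.normal(0, 0.1, (HIDDEN, DIM)), np.zeros(HIDDEN)
W2, b2 = rng.normal(0, 0.1, (DIM, HIDDEN)), np.zeros(DIM)

def f(z, t):
    """Learned vector field dz/dt = f(z, t); weights here are random."""
    return W2 @ np.tanh(W1 @ z + b1) + b2

def odeint_rk4(z0, ts):
    """Integrate z through the time grid ts with classic RK4 steps."""
    zs = [z0]
    for t0, t1 in zip(ts[:-1], ts[1:]):
        h, z = t1 - t0, zs[-1]
        k1 = f(z, t0)
        k2 = f(z + 0.5 * h * k1, t0 + 0.5 * h)
        k3 = f(z + 0.5 * h * k2, t0 + 0.5 * h)
        k4 = f(z + h * k3, t1)
        zs.append(z + h / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4))
    return np.stack(zs)

# Unevenly spaced observation times are handled naturally:
# the solution is simply evaluated on the given irregular grid.
ts = np.array([0.0, 0.3, 0.35, 1.0, 2.2])
trajectory = odeint_rk4(np.array([1.0, -1.0]), ts)
print(trajectory.shape)  # (5, 2): one state per requested time stamp
```

In practice the solver step is wrapped in an adjoint-based autodiff framework (e.g. torchdiffeq) so the vector field's weights can be trained; the sketch only shows why irregular sampling poses no difficulty for this model family.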

Objective. The goal of this research is the synthesis of a deep neural network able to solve the input signal reconstruction and time series extrapolation tasks.

Method. The proposed method exhibits the benefits of solving the time series extrapolation task over the forecasting one. A model that implements an encoder-decoder architecture with differential equation solving in the latent space is proposed. This approach was shown to demonstrate strong performance on the time series reconstruction task given a small percentage of noisy and unevenly distributed input signals. The proposed Latent Ordinary Differential Equations Variational Autoencoder (LODE-VAE) model was benchmarked on synthetic non-stationary data with added white noise, sampled at random intervals between observations.
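
The encoder-decoder pipeline described above can be illustrated schematically. All dimensions and weights below are hypothetical and untrained: an RNN encoder consumes the sparse observations in reverse time order and produces the mean and log-variance of the initial latent state z0; the latent trajectory is then obtained by Euler-integrating a learned vector field, and a linear decoder maps each latent state back to signal space.

```python
import numpy as np

rng = np.random.default_rng(1)
OBS, LATENT, HIDDEN = 1, 4, 8  # illustrative sizes, not the paper's

Wh = rng.normal(0, 0.1, (HIDDEN, HIDDEN + OBS + 1))  # RNN cell (value + time)
Wq = rng.normal(0, 0.1, (2 * LATENT, HIDDEN))        # hidden -> (mu, logvar)
Wf = rng.normal(0, 0.1, (LATENT, LATENT))            # latent vector field
Wd = rng.normal(0, 0.1, (OBS, LATENT))               # linear decoder

def encode(xs, ts):
    """Run the recognition RNN backwards over (value, time) pairs."""
    h = np.zeros(HIDDEN)
    for x, t in zip(xs[::-1], ts[::-1]):
        h = np.tanh(Wh @ np.concatenate([h, [x, t]]))
    mu, logvar = np.split(Wq @ h, 2)
    return mu, logvar

def decode(z0, ts):
    """Euler-integrate the latent ODE, then decode each latent state."""
    zs, z = [z0], z0
    for t0, t1 in zip(ts[:-1], ts[1:]):
        z = z + (t1 - t0) * np.tanh(Wf @ z)
        zs.append(z)
    return np.array([Wd @ z for z in zs])

ts = np.array([0.0, 0.4, 0.5, 1.3])                  # uneven observation grid
xs = np.sin(ts) + rng.normal(0, 0.05, ts.size)       # noisy synthetic signal
mu, logvar = encode(xs, ts)
z0 = mu + np.exp(0.5 * logvar) * rng.normal(size=LATENT)  # reparameterization
recon = decode(z0, ts)
print(recon.shape)  # (4, 1): one reconstructed value per time stamp
```

Extrapolation follows from the same machinery: since the decoder only needs a time grid, passing time stamps beyond the last observation to `decode` continues the latent trajectory into the future.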

Results. The proposed method was implemented as a deep neural network that solves the time series extrapolation task.

Conclusions. The conducted experiments confirmed that the proposed model solves the given task effectively; it is recommended for real-world problems that require reconstructing the dynamics of non-stationary processes. Prospects for further research include the computational optimization of the proposed models, as well as additional experiments with different baselines, e.g., Generative Adversarial Networks or attention networks.

Author Biography

D. V. Androsov, National Technical University of Ukraine “Igor Sikorsky Kyiv Polytechnic Institute”, Kyiv, Ukraine

Post-graduate student of the Institute for Applied System Analysis

References

Bidyuk P. I., Romanenko V. D., Timoshchuk O. L. Time series analysis. Kyiv, Polytechnika, 2013, 230 p. (In Ukrainian)

Parfenenko Y. V., Shendryk V. V., Kholiavka Y. P., Pavlenko P. M. Comparison of short-term forecasting methods of electricity consumption in microgrids, Radio Electronics, Computer Science, Control, 2023, № 1, pp. 14–23. DOI: https://doi.org/10.15588/1607-3274-2023-1-2

Terence C. M. Chapter 3 – ARMA Models for Stationary Time Series, Applied Time Series Analysis. A Practical Guide to Modeling and Forecasting. Cambridge, Academic Press, 2019, Ch. 3, pp. 31–56. DOI: https://doi.org/10.1016/B978-0-12-813117-6.00003-X.

Terence. C. M. Chapter 4 – ARIMA Models for Nonstationary Time Series, Applied Time Series Analysis. A Practical Guide to Modeling and Forecasting. Cambridge, Academic Press, 2019, Ch. 4, pp. 57–69. DOI: https://doi.org/10.1016/B978-0-12-813117-6.00004-1.

Charles A., Darné O. The accuracy of asymmetric GARCH model estimation, International Economics, 2019, Vol. 157, pp. 179–202. DOI: https://doi.org/10.1016/j.inteco.2018.11.001

Bidyuk P., Prosyankina-Zharova T., Terentiev O. Modeling nonlinear nonstationary processes in macroeconomy and finances, Advances in Computer Science for Engineering and Education, 2019, Vol. 754, pp. 735–745. DOI: https://doi.org/10.1007/978-3-319-91008-6_72.

Kumar A. S., Anandarao S. Volatility spillover in cryptocurrency markets: Some evidences from GARCH and wavelet analysis, Physica A: Statistical Mechanics and its Applications, 2019, Vol. 524, pp. 448–458. DOI: https://doi.org/10.1016/j.physa.2019.04.154.

Geron A. Hands-On Machine Learning with Scikit-Learn and TensorFlow. Sebastopol: O’Reilly Media Inc., 2017, 760 p.

Goodfellow I., Bengio Y., Courville A. Deep Learning. Cambridge, The MIT Press, 2016, 802 p.

Whittle P. Hypothesis Testing in Time Series Analysis. Stockholm, Almquist and Wicksell, 1951, 187 p.

Box G. E. P., Jenkins G. M. Time Series Analysis: Forecasting and Control. San Francisco, Holden-Day, 1976, 575 p.

Hochreiter S., Schmidhuber J. Long short-term memory, Neural computation, 1997, Vol. 9, № 8, pp. 1735–1780.

Cho K., Merrienboer B. van, Gulcehre C., Bahdanau D., Bougares F., Schwenk H., Bengio Y. Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation. EMNLP 2014: Conference on Empirical Methods in Natural Language Processing, Doha, 25–29 October 2014: proceedings. Doha, Association for Computational Linguistics, 2014, pp. 1724–1734. DOI: https://doi.org/10.48550/arXiv.1406.1078

Chen R. T. Q., Rubanova Y., Bettencourt J., Duvenaud D. Neural ordinary differential equations [Electronic resource]. Advances in Neural Information Processing Systems 31: Annual Conference on Neural Information Processing Systems, Vancouver, 3–8 December 2018: proceedings. Access mode: https://proceedings.neurips.cc/paper_files/paper/2018/file/69386f6bb1dfed68692a24c8686939b9-Paper.pdf

Lu J., Deng K., Zhang X., Liu G., Guan Y. Neural-ODE for pharmacokinetics modeling and its advantage to alternative machine learning models in predicting new dosing regimens, iScience, 2021, Vol. 24, Issue 7, pp. 1–13. DOI: https://doi.org/10.1016/j.isci.2021.102804.

De Brouwer E., Simm J., Arany A., Moreau Y. GRU-ODE-Bayes: Continuous modeling of sporadically-observed time series [Electronic resource], Advances in Neural Information Processing Systems 32: Annual Conference on Neural Information Processing Systems, Vancouver, 8–14 December 2019: proceedings. Access mode: https://proceedings.neurips.cc/paper_files/paper/2019/file/455cb2657aaa59e32fad80cb0b65b9dc-Paper.pdf

Published

2023-12-24

How to Cite

Androsov, D. V. (2023). NEURAL ORDINARY DIFFERENTIAL EQUATIONS FOR TIME SERIES RECONSTRUCTION. Radio Electronics, Computer Science, Control, (4), 69. https://doi.org/10.15588/1607-3274-2023-4-7

Section

Neuroinformatics and intelligent systems