COMBINED NEWTON’S THIRD-ORDER CONVERGENCE METHOD FOR MINIMIZING ONE-VARIABLE FUNCTIONS
DOI: https://doi.org/10.15588/1607-3274-2021-2-5
Keywords: unimodal function, Brent method, combined Newton minimization method, method speed.
Abstract
Context. The article addresses the practical problem of numerically minimizing slowly computed unimodal functions of one variable. An analysis of existing first- and second-order minimization methods showed that they can quickly solve such problems for functions whose values are cheap to compute. For slowly computed functions, however, these methods yield slow algorithms, so the development of fast methods for minimizing such functions remains an urgent problem.
Objective. To develop a combined Newtonian method with third-order convergence for minimizing predominantly slowly computed unimodal functions, and to build a database of test problems, including smooth, monotone and partially constant functions, for testing the method and comparing its effectiveness with other known methods.
Method. A technique and an algorithm are presented for the fast minimization of a unimodal function of one variable by a combined numerical Newtonian method with third-order convergence. The method is capable of recognizing strictly unimodal, monotone and constant functions, as well as functions with partial or complete sections of a flat minimum.
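To make the idea of such a combined scheme concrete, the sketch below shows one way a safeguarded third-order Newton-type minimizer can be organized: the cubically convergent Weerakoon-Fernando variant of Newton's method (see the Weerakoon and Fernando reference below) is applied to the equation f'(x) = 0, and the method falls back to a bracket-shrinking step whenever the Newton step fails or leaves the search interval. This is only an illustrative Python sketch under simple assumptions (central-difference derivatives, a golden-ratio fallback step); it is not the authors' implementation.

import math

def minimize_unimodal(f, a, b, tol=1e-8, max_iter=100, h=1e-5):
    # Illustrative sketch only, not the method from the article.
    # First and second derivatives are approximated by central differences.
    def d1(x):
        return (f(x + h) - f(x - h)) / (2.0 * h)

    def d2(x):
        return (f(x + h) - 2.0 * f(x) + f(x - h)) / (h * h)

    shrink = (math.sqrt(5.0) - 1.0) / 2.0  # golden-ratio factor for fallback steps
    x = 0.5 * (a + b)
    for _ in range(max_iter):
        g, gp = d1(x), d2(x)
        x_new, accepted = x, False
        if gp > 0.0:
            y = x - g / gp                   # ordinary Newton predictor for f'(x) = 0
            denom = gp + d2(y)
            if denom > 0.0:
                x_new = x - 2.0 * g / denom  # third-order (Weerakoon-Fernando) corrector
                accepted = a < x_new < b
        if not accepted:
            # Fallback: take a golden-ratio interior point on the side where f'
            # indicates the minimum lies, so the bracket always shrinks.
            x_new = b - shrink * (b - a) if g > 0.0 else a + shrink * (b - a)
        # Update the bracket [a, b] using the sign of f' at the new point.
        if d1(x_new) > 0.0:
            b = x_new
        else:
            a = x_new
        if abs(x_new - x) < tol or (b - a) < tol:
            return x_new
        x = x_new
    return x

For example, minimize_unimodal(lambda t: (t - 2.0) ** 2 + 1.0, 0.0, 5.0) returns a value close to 2.0; each iteration costs a handful of function evaluations, which is the quantity the article seeks to reduce for slowly computed functions.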
Results. The proposed method is compared with other methods, including the fast Brent method. A total of 6954 problems were solved with the combined Newtonian method; it proved faster than the other methods in 95.5% of the problems, while Brent’s method was faster in only 4.5% of them. Overall, the analysis of the results showed that the combined method ran 1.64 times faster than Brent’s method.
Conclusions. A combined Newtonian method with third-order convergence is proposed for minimizing predominantly slowly computed unimodal functions of one variable. A database of problems, including smooth, monotone and partially constant functions, was developed to test the method and compare its effectiveness with other known methods. It is shown that the proposed method outperforms other methods, including the fast Brent method.
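For illustration only, the snippet below gives hypothetical one-variable examples of the three function classes named above (smooth, monotone, and partially constant); these are not taken from the article's test database.

import math

# Hypothetical representatives of the three test-function classes mentioned above.
test_functions = {
    "smooth": lambda x: (x - 1.5) ** 2 + 0.3,                # strictly unimodal, differentiable
    "monotone": lambda x: math.exp(-x),                      # no interior minimum on a finite interval
    "partially constant": lambda x: max(abs(x) - 1.0, 0.0),  # flat minimum section on [-1, 1]
}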
References
Sahu D. R., Ansari Q. H., Yao J. C. A unified hybrid iterative method for hierarchical minimization problems, Journal of Computational and Applied Mathematics, 2013, Vol. 253 (1), pp. 208–221. https://doi.org/10.1016/j.cam.2013.04.018.
Rao S. S. Engineering optimization: theory and practice, John Wiley & Sons, 2009. https://doi.org/10.1002/9781119454816.
Gill P. E., Murray W., Wright M. H. Practical Optimization, Emerald Group Publishing Limited, 2019. https://doi.org/10.1137/1.9781611975604.
Aoki M. Introduction to optimization techniques: Fundamentals and Applications of Nonlinear Programming. London, Macmillan, 1971.
Bai Z., Tao M. Rigorous convergence analysis of alternating variable minimization with multiplier methods for quadratic programming problems with equality constraints, BIT Numerical Mathematics, 2016, 56, pp. 399–422. https://doi.org/10.1007/s10543-015-0563-z.
Shor N. Z. Minimization Methods for Non-Differentiable Functions, Springer Berlin Heidelberg, Softcover reprint of the original 1st ed. 1985, 2011. https://doi.org/10.1007/978-3-642-82118-9.
Himmelblau D. M. Applied Nonlinear Programming. New York, McGraw-Hill, 1972.
Hassin R., Reuven H. Asymptotic analysis of dichotomous search with search and travel costs, European Journal of Operational Research, 1992, Vol. 58(1), pp. 78–89. https://doi.org/10.1016/0377-2217(92)90237-4.
Glad T., Goldstein A. Optimization of functions whose values are subject to small errors, BIT Numerical Mathematics, 1977, Vol. 17, pp. 160–169. https://doi.org/10.1007/bf01932287.
Bazaraa M. S., Shetty C. M. Nonlinear Programming, Theory and Algorithms, New York, Wiley, 1979.
Chen D., Zhou Y., Song L. Fixed point algorithm based on adapted metric method for convex minimization problem with application to image deblurring, Advances in Computational Mathematics, 2016, 42, pp. 1287–1310. https://doi.org/10.1007/s10444-016-9462-3.
Ruszczynski A. Nonlinear Optimization, Princeton University Press, 2006. https://doi.org/10.1515/9781400841059.
Bian W., Chen X., Ye Y. Complexity analysis of interior point algorithms for non-Lipschitz and nonconvex minimization, Mathematical Programming, 2015, 149, pp. 301–327. https://doi.org/10.1007/s10107-014-0753-5.
Cai J., Han D., Chen C., Chen S. Application of the golden section search algorithm in the nonlinear isoconversional calculations to the determination of the activation energy from nonisothermal kinetic conversion data, Solid State Sciences, 2010, Vol. 12 (5), pp. 829–833. https://doi.org/10.1021/ef7006672.
Kodnyanko V. A. Economical dichotomous search for minimizing one-variable functions, Radio Electronics, Computer Science, Control, 2019, No. 3, pp. 34–39. https://doi.org/10.15588/1607-3274-2019-3-4.
Weerakoon S., Fernando T. G. I. A variant of Newton’s method with accelerated third-order convergence, Applied Mathematics Letters, 2000, Vol. 13, pp. 87–93. https://doi.org/10.1016/S0893-9659(00)00100-2.
Brent R. P. Algorithms for Minimization Without Derivatives, Dover, 2002. https://doi.org/10.2307/2005713.
Deng Y., Glimm J., Yu Q., Eisenberg M. Global minimization for problems with multiple local minima, Applied Mathematics Letters, 1993, Vol. 6 (2), pp. 89–90. https://doi.org/10.1016/0893-9659(93)90019-j.
Gerald C. F., Wheatley P. O. Applied Numerical Analysis, fifth ed., Addison-Wesley Pub. Co., MA, 1994. https://doi.org/10.2307/2007813.
Atkinson K. An Introduction to Numerical Analysis, 2nd Edition, John Wiley & Sons, 1989.
Guimier A. Modélisation d’algorithmes d’optimisation à stratégie aléatoire, Calcolo, 1998, 23, pp. 21–43. https://doi.org/10.1007/bf02576906.
Salgueiro da Silva M. A. A novel method for robust minimisation of univariate functions with quadratic convergence, Journal of Computational and Applied Mathematics, 2007, Vol. 200 (1), pp. 168–177. https://doi.org/10.1016/j.cam.2005.12.020.
Archetti F. Evaluation of random gradient techniques for unconstrained optimization, Calcolo, 1975, 12, pp. 83–94. https://doi.org/10.1007/bf02576717.
License
Copyright (c) 2021 V. A. Kodnyanko, O. A. Grigorieva, L. V. Strok
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Creative Commons Licensing Notifications in the Copyright Notices
The journal allows the authors to hold the copyright without restrictions and to retain publishing rights without restrictions.
The journal allows readers to read, download, copy, distribute, print, search, or link to the full texts of its articles.
The journal allows reuse and remixing of its content in accordance with the Creative Commons license CC BY-SA.
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License CC BY-SA that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work.