TWO ALGORITHMS FOR GLOBAL OPTIMIZATION OF ONE-VARIABLE FUNCTIONS BASED ON THE SMALLEST ESTIMATE DISTANCES BETWEEN EXTREMES AND THEIR NUMBER

Authors

  • V. A. Kodnyanko, Siberian Federal University, Krasnoyarsk, Russian Federation

DOI:

https://doi.org/10.15588/1607-3274-2020-2-4

Keywords:

Function of one variable, local minimum of a function, global minimum of a function, global optimization, Brent's method, algorithm performance.

Abstract

Context. Managerial decision-making is often associated with solving one-dimensional global optimization problems. The most important property of global optimization methods is their speed, which is determined by the number of calls to the objective function during optimization.

Objective. Development of high-performance global optimization algorithms for functions of one variable, based on conditions that reduce the problem to a form that makes it practically possible to obtain a solution with a given accuracy.

Method. Two algorithms for conditional global optimization of a function of one variable are considered. The first is based on an estimate of the smallest distance between neighboring local extrema; it finds the global minimum of the objective function and, if necessary, all of its local extrema. The second is suitable for finding the global minimum of a function when the number of local extrema in the uncertainty interval is known in advance. Both algorithms rely on segmentation of the initial uncertainty segment. A local extremum on a segment is localized by three or four points. An approach is proposed that, in most cases, localizes the extremum with only three points, which saves objective-function evaluations and thereby increases the speed of the algorithm.
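The segmentation idea behind the first algorithm can be illustrated with a minimal sketch. Assuming a lower bound `d_min` on the distance between neighboring local extrema, the uncertainty segment is split into subsegments no wider than `d_min`, so each subsegment contains at most one interior minimum; each is then refined by a local one-dimensional search. Golden-section search is used here as a simple stand-in for the paper's three/four-point localization; all function names are illustrative, not the authors' implementation:

```python
import math

def golden_section_min(f, a, b, tol=1e-6):
    """Locate the minimum of a unimodal f on [a, b] by golden-section search."""
    invphi = (math.sqrt(5) - 1) / 2
    x1, x2 = b - invphi * (b - a), a + invphi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:
            # Minimum lies in [a, x2]: shrink from the right.
            b, x2, f2 = x2, x1, f1
            x1 = b - invphi * (b - a)
            f1 = f(x1)
        else:
            # Minimum lies in [x1, b]: shrink from the left.
            a, x1, f1 = x1, x2, f2
            x2 = a + invphi * (b - a)
            f2 = f(x2)
    return (a + b) / 2

def global_min_by_segmentation(f, a, b, d_min, tol=1e-6):
    """Split [a, b] into segments no wider than d_min (so each holds at most
    one local minimum), refine each locally, and return the best point found."""
    n = max(1, math.ceil((b - a) / d_min))
    h = (b - a) / n
    best_x, best_f = a, f(a)
    for i in range(n):
        x = golden_section_min(f, a + i * h, a + (i + 1) * h, tol)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f
```

On the classical test function sin(x) + sin(10x/3) over [2.7, 7.5], this sketch recovers the known global minimum near x ≈ 5.1457.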

Results. The results of solving optimization problems and data on the effectiveness of the proposed algorithms are presented. A comparative analysis of the speed of the developed algorithms against well-known algorithms is carried out on test problems used in world practice to assess the effectiveness of global optimization algorithms. Examples of the practical use of the algorithms are given. Analysis of the data shows that, by the number of calls to the objective function, the algorithms in sequential computing mode run several times faster than the modern high-speed algorithms with which they were compared.

Conclusions. The data presented indicate the efficiency and high speed of the proposed algorithms. Their speed would be even higher if the stated algorithmization ideas were extended to parallel computation. This suggests that the proposed algorithms can find practical application in the global optimization of functions for the considered classes of problems.

Author Biography

V. A. Kodnyanko, Siberian Federal University, Krasnoyarsk

Dr. Sc., Professor, Polytechnic Institute

References

Bryson A. E., Ho Y. Applied Optimal Control: Optimization, Estimation and Control. Hemisphere Publishing Corporation, Washington DC, 1975, 477 p.

Tan Y., Nešić D., Mareels M. Y., Astolfi A. On global extremum seeking in the presence of local extrema, Automatica, 2009, Vol. 45 (1), pp. 245–251. https://doi.org/10.1016/j.automatica.2008.06.010

Bizon N. Energy optimization of fuel cell system by using global extremum seeking algorithm, Applied Energy, 2017, Vol. 206, pp. 458–474. https://doi.org/10.1016/j.apenergy.2017.08.097

Bertsekas D. P., Tsitsiklis J. N. Parallel and Distributed Computation: Numerical Methods. New Jersey, Prentice Hall, 1989, 94 p.

Bomze I. M., Csendes T., Horst R., Pardalos P. M. Developments in Global Optimization. London, Kluwer, 1997, 348 p.

Kvasov D. E., Sergeev Ya. D. Methods of Lipschitz global optimization in control problems, Automation and telemechanics, 2013, No. 9, pp. 3–19.

Sergeev Ya. D., Kvasov D. E. Adaptive diagonal curves and their software implementation, Bulletin of Nizhny Novgorod State University: Mathematical Modeling and Optimal Control, 2001, Vol. 2, No. 24, pp. 300–317.

Sergeev Ya. D., Kvasov D. E. Diagonal methods of global optimization. Moscow, FIZMATLIT, 2008, 276 p.

Pizzuti C., Sergeyev Ya. D. Comparison of two partition strategies in diagonal global optimization algorithms: Technical report 9, Institute of Systems and Informatics. Italy, Rende, 2001, 16 p.

Dennis J. E. Jr., Torczon V. Direct Search Methods on Parallel Machines, SIAM Journal Optimization, 1991, pp. 448–474.

Nemhauster G. L., Pruul E. A., Rushmeier R. A. Branch-and-bound and Parallel Computation: a Historical Note, Operations Research Letters, 1988, No. 7, pp. 65–69.

Orlyanskaya I. V., Perunova Y. N., Zavriev S. K. An Implementation of the Parallel Control Algorithm for Global Optimization, Proceedings of the 3rd Moscow International Conference on the Research of Operations, 2001, pp. 91–92.

Ali M., Törn A., Viitanen S. Stochastic Global Optimization: Problem, Classes and Solution Techniques, Journal of Global Optimization, 1999, No 14, pp. 437–447.

Moccus J. Application of Bayesian Approach to Numerical Methods of Global and Stochastic Optimization, Journal of Global Optimization, 1994, Vol. 4, No. 4, pp. 347–356.

Hansen P., Jaumard B. Lipschitz optimization, Handbook of global optimization, 1995, Vol. 1, pp. 407–493.

Kodnyanko V. A. Optimization of unimodal functions by the parabolic predictor method, Bulletin of the Astrakhan State Technical University. Series: Management, Computing and Informatics, 2018, No. 3, pp. 101–108. DOI: 10.24143/2072-9502-2018-3-101-108

Brent R. P. Algorithms for Minimization Without Derivatives. Dover, 2002, 195 p.

Brent R. P. Errata for Algorithms for Minimization without Derivatives, Prentice-Hall edition, 1973. Available from: https://mathspeople.anu.edu.au/~brent/pub/pub011_errata.html

Burkardt J. V. Brent algorithms for minimization without derivatives, 2008. Available from: http://people.sc.fsu.edu/~jburkardt/f77_src/brent/brent.html

Rheinboldt W. C., Burkardt J. V. Algorithm 596: a program for a locally parameterized, 2008. Available from: http://people.sc.fsu.edu/~jburkardt/f77_src/brent/brent.f

Rao S. S. Engineering optimization: theory and practice. John Wiley & Sons, Inc., Hoboken, New Jersey, 2009, 813 p.

Amdahl G. The Validity of Single Processor Approach to Achieving Large Scale Computing Capabilities, Spring Joint Computer Conference, 1967, Vol. 30, pp. 483–485.

Nocedal J. Theory of algorithms for unconstrained optimization, Acta Numerica, Cambridge University Press, Cambridge, 1991, pp. 199–242.

Birkhoff S., de Boor C. R. Piecewise polynomial interpolation and approximation, Garabedian, General Motors Symposium, 1964, Elsevier, New York and Amsterdam, 1965, pp. 164–190.

Epps B. P., Truscott T. T., Techet A. H. Evaluating derivatives of experimental data using smoothing splines, 3rd Mathematical Methods in Engineering International Symposium MME’10, Coimbra, Portugal, 2010, pp. 21–24.

Kodnyanko V., Kurzakov A. Quality of Dynamics of Gas-static Thrust Bearing with Movable Carrying Circle on Elastic Suspension, Tribology in Industry, 2019, Vol. 41, No. 2, pp. 237–241. DOI: 10.24874/ti.2019.41.02.09

Kurosh A. Higher Algebra. Moscow, Mir Publishers, 1984, 432 p.

Camargo Brunetto M. A. O., Claudio D. M., Trevisan V. An algebraic algorithm to isolate complex polynomial zeros using Sturm sequences, Computers & Mathematics with Applications, 2000, Vol. 39 (3–4), pp. 95–105. https://doi.org/10.1016/S0898-1221(99)00336-3

Vandergraft J. S. Certification of Algorithm 3: Solution of polynomial equations by Bairstow-Hitchcock method, Communications of the ACM, 1961, Vol. 4, No. 2, pp. 105–117.

How to Cite

Kodnyanko, V. A. (2020). TWO ALGORITHMS FOR GLOBAL OPTIMIZATION OF ONE-VARIABLE FUNCTIONS BASED ON THE SMALLEST ESTIMATE DISTANCES BETWEEN EXTREMES AND THEIR NUMBER. Radio Electronics, Computer Science, Control, (2), 36–43. https://doi.org/10.15588/1607-3274-2020-2-4

Section

Mathematical and computer modelling