Izbor parametara kod gradijentnih metoda za probleme optimizacije bez ograničenja
Choice of parameters in gradient methods for unconstrained optimization problems
Author
Đorđević, Snežana
Mentor
Krejić, Nataša
Committee members
Lužanin, Zorana
Krejić, Nataša
Uzelac, Zorica
Abstract
The problem under consideration is unconstrained optimization. Many different methods exist for solving unconstrained optimization problems; the research presented here is motivated by the need for fast-converging methods. The main goal is a systematization of known results, together with a theoretical and numerical analysis of the possibilities of introducing a parameter into gradient methods. First, the problem of minimizing a convex function of several variables is considered. This problem is solved without computing the Hessian matrix, which is particularly relevant for large-scale systems, as well as for optimization problems in which neither the exact value of the objective function nor the exact value of its gradient is available. Part of the motivation also comes from problems in which the objective function is the result of simulations. The numerical results presented in Chapter 6 show that introducing a certain parameter can be useful, i.e., it accelerates the optimization method in question. In addition, a new hybrid conjugate gradient method is presented, in which the conjugate gradient parameter is a convex combination of two known conjugate gradient parameters.

Chapter 1 describes the motivation and the basic concepts needed to follow the remaining chapters. Chapter 2 gives an overview of some first-order and second-order gradient methods. Chapter 4 contains an overview of the basic concepts and some results on conjugate gradient methods. These chapters review known results, while the original contributions are presented in Chapters 3, 5 and 6. Chapter 3 describes a modification of a particular method that uses a multiplicative parameter chosen at random; linear convergence of the resulting method is proved. Chapter 5 contains original results on conjugate gradient methods: a new hybrid conjugate gradient method is presented, constructed as a convex combination of two known conjugate gradient methods. Chapter 6 reports the results of numerical experiments, performed on a set of test functions, for the methods from Chapters 3 and 5. All of the algorithms considered were implemented in MATHEMATICA; the comparison criterion is CPU time.
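To make the Chapter 3 idea concrete, the following is a minimal sketch of a gradient step whose length is scaled by a randomly chosen multiplicative parameter. It is an illustration only: the sampling interval, the Armijo backtracking rule, and the test function are assumptions, not the method analyzed in the thesis.

```python
import numpy as np

def random_scaled_gradient_step(f, grad, x, rng, lo=0.5, hi=1.5,
                                rho=0.5, c=1e-4):
    """One steepest-descent step whose length is scaled by a random
    multiplicative parameter theta ~ U[lo, hi]. The interval [lo, hi]
    and the Armijo backtracking are illustrative assumptions."""
    g = grad(x)
    d = -g                       # steepest-descent direction
    theta = rng.uniform(lo, hi)  # randomly chosen multiplicative parameter
    alpha = 1.0
    # Armijo backtracking on the scaled step theta * alpha * d
    while f(x + theta * alpha * d) > f(x) + c * theta * alpha * g.dot(d):
        alpha *= rho
    return x + theta * alpha * d

# Illustrative usage on a simple convex quadratic
f = lambda x: 0.5 * x.dot(x)
grad = lambda x: x
rng = np.random.default_rng(0)
x = np.array([3.0, -2.0])
for _ in range(25):
    x = random_scaled_gradient_step(f, grad, x, rng)
print(np.linalg.norm(x))  # close to 0: iterates approach the minimizer
```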
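Similarly, the hybrid conjugate gradient construction from Chapter 5 can be sketched as below. The abstract does not say which two conjugate gradient parameters are combined; the Fletcher-Reeves and Polak-Ribiere-Polyak choices and the fixed mixing weight theta are illustrative assumptions only.

```python
import numpy as np

def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-8, max_iter=500):
    """Conjugate gradient iteration whose parameter is the convex
    combination beta = (1 - theta)*beta_FR + theta*beta_PRP.
    The FR/PRP pair and the fixed weight theta are assumptions for
    illustration; a hybrid method may also choose theta adaptively."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (illustrative choice)
        alpha, rho, c = 1.0, 0.5, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta_fr = g_new.dot(g_new) / g.dot(g)       # Fletcher-Reeves
        beta_prp = g_new.dot(g_new - g) / g.dot(g)  # Polak-Ribiere-Polyak
        beta = (1 - theta) * beta_fr + theta * beta_prp  # convex combination
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:  # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Illustrative usage on a two-dimensional convex quadratic
A = np.array([[3.0, 0.5], [0.5, 1.0]])
f = lambda x: 0.5 * x.dot(A @ x)
grad = lambda x: A @ x
print(hybrid_cg(f, grad, np.array([2.0, -1.0])))  # close to the origin
```

With theta = 0 the iteration reduces to the first parameter and with theta = 1 to the second; a convex combination of this kind is typically used to trade off the stronger convergence theory of one parameter against the better practical behavior of the other.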