Overview of basic data about the dissertation

Choice of parameters in gradient methods for the unconstrained optimization problems

dc.contributor.advisor: Krejić, Nataša
dc.contributor.other: Lužanin, Zorana
dc.contributor.other: Krejić, Nataša
dc.contributor.other: Uzelac, Zorica
dc.creator: Đorđević, Snežana
dc.date.accessioned: 2015-12-29T11:17:01Z
dc.date.available: 2015-12-29T11:17:01Z
dc.date.available: 2020-07-03T13:42:56Z
dc.date.issued: 2015-05-22
dc.identifier.uri: https://nardus.mpn.gov.rs/handle/123456789/1698
dc.identifier.uri: http://www.cris.uns.ac.rs/DownloadFileServlet/Disertacija144351680479795.pdf?controlNumber=(BISIS)94106&fileName=144351680479795.pdf&id=4370&source=NaRDuS&language=srsr
dc.identifier.uri: http://www.cris.uns.ac.rs/record.jsf?recordId=94106&source=NaRDuS&language=srsr
dc.identifier.uri: http://www.cris.uns.ac.rs/DownloadFileServlet/IzvestajKomisije142435019637094.pdf?controlNumber=(BISIS)94106&fileName=142435019637094.pdf&id=3308&source=NaRDuS&language=srsr
dc.description.abstract: The problem considered is unconstrained optimization. Many diverse methods exist for solving unconstrained optimization problems; the research here is motivated by the need for methods that converge quickly. The goal is a systematization of known results, together with a theoretical and numerical analysis of the possibility of introducing a parameter into gradient methods. First, the problem of minimizing a convex function of several variables is considered. This problem is solved here without computing the Hessian matrix, which is particularly relevant for large-scale systems, as well as for optimization problems in which neither the exact value of the objective function nor the exact value of its gradient is available. Part of the motivation also lies in the existence of problems in which the objective function is the result of simulations. The numerical results presented in Chapter 6 show that introducing a certain parameter can be useful, i.e., it accelerates a given optimization method. A new hybrid conjugate gradient method is also presented, in which the conjugate gradient parameter is a convex combination of two known conjugate gradient parameters. The first chapter describes the motivation and the basic notions needed to follow the remaining chapters. The second chapter gives an overview of some first- and second-order gradient methods. The fourth chapter contains an overview of basic notions and some results concerning conjugate gradient methods. These chapters review known results, while the original contribution is presented in the third, fifth and sixth chapters. The third chapter describes a modification of a particular method that uses a randomly chosen multiplicative parameter; linear convergence of the resulting method is proved. The fifth chapter contains original results on conjugate gradient methods.
Namely, this chapter presents a new hybrid conjugate gradient method that is a convex combination of two known conjugate gradient methods. The sixth chapter gives the results of numerical experiments, performed on a certain set of test functions, for the methods from the third and fifth chapters. All of the algorithms considered were implemented in the package MATHEMATICA. The comparison criterion is CPU time. [sr]
dc.description.abstract: The problem under consideration is an unconstrained optimization problem. Many different methods exist for solving optimization problems; the investigation here is motivated by the need for methods that converge fast. The main goal is the systematization of some known results, together with a theoretical and numerical analysis of the possibility of introducing parameters into gradient methods. First, the minimization problem is considered in which the objective function is a convex function of several variables. This problem is solved here without computing the Hessian, which is particularly important for large-scale systems, and also for optimization problems in which exact values of the objective function and its gradient are unavailable. Partially, this investigation is motivated by the existence of problems where the objective function is the result of simulations. Numerical results, presented in Chapter 6, show that the introduction of a parameter is useful, i.e., it results in the acceleration of the known optimization method. Further, a new hybrid conjugate gradient method is presented, in which the conjugate gradient parameter is a convex combination of two known conjugate gradient parameters. The first chapter gives the motivation and the basic concepts necessary for the remaining chapters. The second chapter contains a survey of some first-order and second-order gradient methods. The fourth chapter contains a survey of basic concepts and results concerning conjugate gradient methods. The first, second and fourth chapters review known results, while the original results are presented in Chapters 3, 5 and 6. The third chapter presents a modification of an unconstrained optimization method in which a randomly chosen multiplicative parameter is used.
Linear convergence of this modification is also proved. The fifth chapter contains original results on conjugate gradient methods: namely, a new hybrid conjugate gradient method is presented, which is a convex combination of two known conjugate gradient methods. The sixth chapter presents numerical results, obtained on a set of test functions, for the methods of Chapters 3 and 5. All of the algorithms considered were implemented in Mathematica. The comparison criterion is CPU time. [en]
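The hybrid conjugate gradient scheme described in the abstract, where the CG parameter is a convex combination of two known parameters, can be sketched as follows. This is a minimal illustration, not the thesis's method: the record does not name which two parameters are combined, so Hestenes–Stiefel and Dai–Yuan are assumed here, together with an Armijo backtracking line search; the function name `hybrid_cg` and its defaults are likewise illustrative.

```python
import numpy as np

def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-8, max_iter=2000):
    """Nonlinear CG with the hybrid parameter
    beta = (1 - theta) * beta_HS + theta * beta_DY,  theta in [0, 1].
    HS/DY are illustrative choices of the two 'known' CG parameters."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        dty = d.dot(y)  # shared denominator of beta_HS and beta_DY
        if abs(dty) < 1e-16:
            beta = 0.0
        else:
            beta_hs = g_new.dot(y) / dty
            beta_dy = g_new.dot(g_new) / dty
            beta = (1.0 - theta) * beta_hs + theta * beta_dy
        d = -g_new + beta * d
        if g_new.dot(d) >= 0.0:  # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x
```

With `theta = 0` the iteration reduces to the pure HS method and with `theta = 1` to the pure DY method; intermediate values blend the two, which is the convexity property the abstract refers to.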
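The third-chapter idea, a gradient method whose step is scaled by a randomly chosen multiplicative parameter, can be illustrated by the following sketch. The record does not state the thesis's actual scheme, distribution or interval, so everything here is an assumption: the parameter is drawn uniformly from `[low, high]`, applied to a fixed base step length, and the interval is chosen so the scaled step stays in the stable range for a gradient step.

```python
import numpy as np

def randomly_scaled_gradient(grad, x0, base_step, low=0.5, high=1.5,
                             tol=1e-8, max_iter=20000, seed=0):
    """Gradient descent whose step length base_step is multiplied, at each
    iteration, by a parameter theta_k ~ Uniform[low, high] (illustrative
    distribution; linear convergence holds when every scaled step stays
    below 2/L for an L-smooth strongly convex objective)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        theta = rng.uniform(low, high)  # random multiplicative parameter
        x = x - theta * base_step * g
    return x
```

The point of the sketch is only the mechanism: randomness enters multiplicatively in the step, while the descent direction remains the negative gradient.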
dc.language: sr (Latin script)
dc.publisher: Универзитет у Новом Саду, Природно-математички факултет [sr]
dc.rights: openAccess [en]
dc.rights.uri: https://creativecommons.org/licenses/by-nc-nd/4.0/
dc.source: Универзитет у Новом Саду [sr]
dc.subject: Nelinearna optimizacija [sr]
dc.subject: Nonlinear optimization [en]
dc.subject: metodi linijskog pretraživanja [sr]
dc.subject: monotono linijsko pretraživanje [sr]
dc.subject: aproksimacija matrice hesijana [sr]
dc.subject: slučajno izabrani parametar [sr]
dc.subject: hibridni metod konjugovanih gradijenata [sr]
dc.subject: uslov konjugacije [sr]
dc.subject: line search methods [en]
dc.subject: monotone line search [en]
dc.subject: approximation of Hessian [en]
dc.subject: randomly chosen parameter [en]
dc.subject: hybrid conjugate gradient method [en]
dc.subject: conjugation condition [en]
dc.title: Izbor parametara kod gradijentnih metoda za probleme optimizacije bez ograničenja [sr]
dc.title: Choice of parameters in gradient methods for the unconstrained optimization problems [en]
dc.type: doctoralThesis [en]
dc.rights.license: BY-NC-ND
dcterms.abstract: Крејић Наташа; Лужанин Зорана; Крејић Наташа; Узелац Зорица; Ђорђевић Снежана; Избор параметара код градијентних метода за проблеме оптимизације без ограничења;
dc.identifier.fulltext: http://nardus.mpn.gov.rs/bitstream/id/38089/Disertacija.pdf
dc.identifier.fulltext: https://nardus.mpn.gov.rs/bitstream/id/38089/Disertacija.pdf
dc.identifier.fulltext: http://nardus.mpn.gov.rs/bitstream/id/38090/IzvestajKomisije.pdf
dc.identifier.fulltext: https://nardus.mpn.gov.rs/bitstream/id/38090/IzvestajKomisije.pdf
dc.identifier.rcub: https://hdl.handle.net/21.15107/rcub_nardus_1698


Documents for the doctoral dissertation


