Annals of Operations Research 103, 33–47, 2001.
© 2001 Kluwer Academic Publishers. Manufactured in The Netherlands.

An Efficient Hybrid Conjugate Gradient Method for Unconstrained Optimization*

Y.H. DAI** and Y. YUAN    {dyh,yyx}@lsec.cc.ac.cn
State Key Laboratory of Scientific and Engineering Computing, Institute of Computational Mathematics and Scientific/Engineering Computing, Academy of Mathematics and System Sciences, Chinese Academy of Sciences, P.O. Box 2719, Beijing 100080, P.R. China

* Research partly supported by Chinese National Natural Sciences Foundation Grants 19731010 and 19801033.
** Corresponding author.

Abstract. Recently, we proposed a nonlinear conjugate gradient method which produces a descent search direction at every iteration and converges globally provided that the line search satisfies the weak Wolfe conditions. In this paper, we study methods related to this new nonlinear conjugate gradient method. Specifically, if the size of the scalar $\beta_k$ relative to the one in the new method belongs to some interval, then the corresponding methods are proved to be globally convergent; otherwise, we are able to construct a convex quadratic example showing that the methods need not converge. Numerical experiments are made for two combinations of the new method and the Hestenes–Stiefel conjugate gradient method. The initial results show that one of the hybrid methods is especially efficient for the given test problems.

Keywords: unconstrained optimization, conjugate gradient method, line search, descent property, global convergence

AMS subject classification: 49M37, 65K05, 90C30

1. Introduction

In [6], we proposed a nonlinear conjugate gradient method. An important property of the method is that it produces a descent search direction at every iteration and converges globally provided that the line search satisfies the weak Wolfe conditions. The purpose of this paper is to study some methods related to the new nonlinear conjugate gradient method and to find efficient algorithms among them.

Consider the following unconstrained optimization problem,

    \min f(x), \quad x \in \mathbb{R}^n,                                      (1.1)

where $f$ is smooth and its gradient $g$ is available. Conjugate gradient methods are very efficient for solving (1.1), especially when the dimension $n$ is large, and have the following form:

    x_{k+1} = x_k + \alpha_k d_k,                                             (1.2)

    d_k = \begin{cases} -g_k, & k = 1, \\ -g_k + \beta_k d_{k-1}, & k \ge 2, \end{cases}    (1.3)

where $g_k = \nabla f(x_k)$, $\alpha_k > 0$ is a steplength obtained by a line search, and $\beta_k$ is a scalar. The formula for $\beta_k$ should be chosen so that the method reduces to the linear conjugate gradient method when $f$ is a strictly convex quadratic and the line search is exact. Well-known formulae for $\beta_k$ are the Fletcher–Reeves (FR), conjugate descent (CD), Polak–Ribière–Polyak (PRP), and Hestenes–Stiefel (HS) formulae (see [7,8,10,14,15]), given respectively by

    \beta_k^{FR}  = \frac{\|g_k\|^2}{\|g_{k-1}\|^2},                          (1.4)

    \beta_k^{PRP} = \frac{g_k^T y_{k-1}}{\|g_{k-1}\|^2},                      (1.5)

    \beta_k^{CD}  = -\frac{\|g_k\|^2}{d_{k-1}^T g_{k-1}},                     (1.6)

    \beta_k^{HS}  = \frac{g_k^T y_{k-1}}{d_{k-1}^T y_{k-1}},                  (1.7)

where $y_{k-1} = g_k - g_{k-1}$ and $\|\cdot\|$ denotes the Euclidean norm.

In the convergence analyses and implementations of conjugate gradient methods, one often requires the line search to be exact or to satisfy the strong Wolfe conditions, namely,

    f(x_k) - f(x_k + \alpha_k d_k) \ge -\delta \alpha_k g_k^T d_k,            (1.8)

    |g(x_k + \alpha_k d_k)^T d_k| \le -\sigma g_k^T d_k,                      (1.9)

where $0 < \delta < \sigma < 1$ (in the latter case, we call the line search a strong Wolfe line search). For example, the FR method is shown to be globally convergent under strong Wolfe line searches with $\sigma \le 1/2$ [1,3,12]. If $\sigma > 1/2$, the FR method may fail by producing an ascent search direction [3]. The PRP method with exact line searches may cycle without approaching any stationary point; see Powell's counter-example [17].

Recently, in [6], we proposed a nonlinear conjugate gradient method in which

    \beta_k^{DY} = \frac{\|g_k\|^2}{d_{k-1}^T y_{k-1}}.                       (1.10)

The method is proved to produce a descent search direction at every iteration and to converge globally provided that the line search satisfies the weak Wolfe conditions, namely, (1.8) and

    g(x_k + \alpha_k d_k)^T d_k \ge \sigma g_k^T d_k,                         (1.11)

where also $0 < \delta < \sigma < 1$ (in this case, we call the line search a weak Wolfe line search). Other nice properties of the method can be found in [2,5]. In this paper, we call the method defined by (1.2) and (1.3), with $\beta_k$ computed by (1.10), the method (1.10).

Although one would be satisfied with its global convergence properties, the FR method sometimes performs much worse than the PRP method in real computations. Powell [16] observed one major cause of the inefficient behavior of the FR method with exact line searches: if a very small step is generated away from the solution, then the subsequent steps are likely to be very short as well. In contrast, the PRP method with exact line searches can recover from this situation. Gilbert and Nocedal [9] extended Powell's analyses to the case of inexact line searches. For the method (1.10) with strong Wolfe line searches, we can deduce the same drawback as for the FR method. In fact, by multiplying (1.3) with $g_k$ and using (1.10), we have that

    g_k^T d_k = \frac{g_{k-1}^T d_{k-1}}{d_{k-1}^T y_{k-1}} \|g_k\|^2,        (1.12)

which with (1.10) gives an equivalent formula to (1.10):

    \beta_k^{DY} = \frac{g_k^T d_k}{g_{k-1}^T d_{k-1}}.                       (1.13)

On the other hand, writing (1.3) as $d_k + g_k = \beta_k d_{k-1}$ and squaring it, we get

    \|d_k\|^2 = -\|g_k\|^2 - 2 g_k^T d_k + \beta_k^2 \|d_{k-1}\|^2.           (1.14)

Dividing (1.14) by $(g_k^T d_k)^2$ and substituting (1.12) and (1.13), we can obtain

    \frac{\|d_k\|^2}{(g_k^T d_k)^2} = \frac{\|d_{k-1}\|^2}{(g_{k-1}^T d_{k-1})^2} + \frac{1 - l_{k-1}^2}{\|g_k\|^2},    (1.15)

where $l_{k-1}$ is given by

    l_{k-1} = \frac{g_k^T d_{k-1}}{g_{k-1}^T d_{k-1}}.                        (1.16)

Denote by $\theta_k$ the angle between $-g_k$ and $d_k$, namely,

    \cos\theta_k = \frac{-g_k^T d_k}{\|g_k\| \, \|d_k\|}.                     (1.17)

Then it follows from (1.15) that

    \cos^{-2}\theta_k = \frac{\|g_k\|^2}{\|g_{k-1}\|^2} \cos^{-2}\theta_{k-1} + 1 - l_{k-1}^2.    (1.18)

In the case of strong Wolfe line searches, we have by (1.9) that $|l_{k-1}| \le \sigma$. Suppose that at the $(k-1)$th iteration an unfortunate search direction is generated, such that $\cos\theta_{k-1} \approx 0$, and that $x_k \approx x_{k-1}$. Then $g_k \approx g_{k-1}$. It follows from this, (1.18) and $|l_{k-1}| \le \sigma$ that $\cos\theta_k \approx 0$.