METHODS FOR NON-LINEAR LEAST SQUARES PROBLEMS
2nd Edition, April 2004

K. Madsen, H.B. Nielsen, O. Tingleff
Informatics and Mathematical Modelling (IMM)
Technical University of Denmark

CONTENTS
1. Introduction and Definitions
2. Descent Methods
   2.1. The Steepest Descent Method
   2.2. Newton's Method
   2.3. Line Search
   2.4. Trust Region and Damped Methods
3. Non-Linear Least Squares Problems
   3.1. The Gauss–Newton Method
   3.2. The Levenberg–Marquardt Method
   3.3. Powell's Dog Leg Method
   3.4. A Hybrid Method: L–M and Quasi–Newton
   3.5. A Secant Version of the L–M Method
   3.6. A Secant Version of the Dog Leg Method
   3.7. Final Remarks
Appendix
References
Index

1. INTRODUCTION AND DEFINITIONS

In this booklet we consider the following problem.

Definition 1.1. Least Squares Problem.
Find $x^*$, a local minimizer for 1)
$$F(x) = \frac{1}{2}\sum_{i=1}^{m}\bigl(f_i(x)\bigr)^2,$$
where $f_i:\mathbb{R}^n \mapsto \mathbb{R}$, $i=1,\ldots,m$, are given functions, and $m \ge n$.

1) The factor $\frac12$ in the definition of $F(x)$ has no effect on $x^*$. It is introduced for convenience, see page 18.

Example 1.1. An important source of least squares problems is data fitting. As an example, consider the data points $(t_1,y_1),\ldots,(t_m,y_m)$ shown below.

[Figure 1.1. Data points $\{(t_i,y_i)\}$ (marked by +) and model $M(x,t)$ (marked by full line); $t$ on the abscissa, $y$ on the ordinate.]

Further, we are given a fitting model,
$$M(x,t) = x_3 e^{x_1 t} + x_4 e^{x_2 t}.$$
The model depends on the parameters $x = [x_1, x_2, x_3, x_4]^\top$. We assume that there exists an $x^\dagger$ so that
$$y_i = M(x^\dagger, t_i) + \varepsilon_i,$$
where the $\{\varepsilon_i\}$ are (measurement) errors on the data ordinates, assumed to behave like "white noise".

For any choice of $x$ we can compute the residuals
$$f_i(x) = y_i - M(x,t_i) = y_i - x_3 e^{x_1 t_i} - x_4 e^{x_2 t_i}, \qquad i=1,\ldots,m.$$
For a least squares fit the parameters are determined as the minimizer $x^*$ of the sum of squared residuals. This is seen to be a problem of the form in Definition 1.1 with $n=4$. The graph of $M(x^*,t)$ is shown by the full line in Figure 1.1.

A least squares problem is a special variant of the more general problem: given a function $F:\mathbb{R}^n \mapsto \mathbb{R}$, find an argument of $F$ that gives the minimum value of this so-called objective function or cost function.

Definition 1.2. Global Minimizer.
Given $F:\mathbb{R}^n \mapsto \mathbb{R}$, find
$$x^+ = \operatorname*{argmin}_{x}\,\{F(x)\}.$$

This problem is very hard to solve in general, and we only present methods for solving the simpler problem of finding a local minimizer for $F$: an argument vector which gives a minimum value of $F$ inside a certain region whose size is given by $\delta$, where $\delta$ is a small, positive number.

Definition 1.3. Local Minimizer.
Given $F:\mathbb{R}^n \mapsto \mathbb{R}$, find $x^*$ so that
$$F(x^*) \le F(x) \quad \text{for } \|x - x^*\| < \delta.$$

In the remainder of this introduction we shall discuss some basic concepts in optimization, and Chapter 2 is a brief review of methods for finding a local minimizer for general cost functions. For more details we refer to Frandsen et al (2004). In Chapter 3 we give methods that are specially tuned for least squares problems.

We assume that the cost function $F$ is differentiable and so smooth that the following Taylor expansion is valid, 2)
$$F(x+h) = F(x) + h^\top g + \tfrac12\, h^\top H h + O(\|h\|^3), \tag{1.4a}$$
where $g$ is the gradient,
$$g \equiv F'(x) = \begin{bmatrix} \dfrac{\partial F}{\partial x_1}(x) \\ \vdots \\ \dfrac{\partial F}{\partial x_n}(x) \end{bmatrix}, \tag{1.4b}$$
and $H$ is the Hessian,
$$H \equiv F''(x) = \left[\dfrac{\partial^2 F}{\partial x_i\,\partial x_j}(x)\right]. \tag{1.4c}$$

2) Unless otherwise specified, $\|\cdot\|$ denotes the 2-norm, $\|h\| = \sqrt{h_1^2 + \cdots + h_n^2}$.

If $x^*$ is a local minimizer and $\|h\|$ is sufficiently small, then we cannot find a point $x^* + h$ with a smaller $F$-value. Combining this observation with (1.4a) we get

Theorem 1.5. Necessary condition for a local minimizer.
If $x^*$ is a local minimizer, then $g^* \equiv F'(x^*) = 0$.

We use a special name for arguments that satisfy the necessary condition:

Definition 1.6. Stationary Point.
If $g_s \equiv F'(x_s) = 0$, then $x_s$ is said to be a stationary point for $F$.

Thus, a local minimizer is also a stationary point, but so is a local maximizer. A stationary point which is neither a local maximizer nor a local minimizer is called a saddle point. In order to determine whether a given stationary point is a local minimizer or not, we need to include the second order term in the Taylor series (1.4a). Inserting $x_s$ we see that
$$F(x_s + h) = F(x_s) + \tfrac12\, h^\top H_s h + O(\|h\|^3) \quad \text{with } H_s = F''(x_s). \tag{1.7}$$
From the definition (1.4c) of the Hessian it follows that any $H$ is a symmetric matrix. If we request that $H_s$ is positive definite, then its eigenvalues are greater than some number $\delta > 0$ (see Appendix A), and
$$h^\top H_s h > \delta\,\|h\|^2.$$
This shows that for $\|h\|$ sufficiently small the third term on the right-hand side of (1.7) will be dominated by the second. This term is positive, so that we get

Theorem 1.8. Sufficient condition for a local minimizer.
Assume that $x_s$ is a stationary point and that $F''(x_s)$ is positive definite. Then $x_s$ is a local minimizer.

If $H_s$ is negative definite, then $x_s$ is a local maximizer. If $H_s$ is indefinite (ie it has both positive and negative eigenvalues), then $x_s$ is a saddle point.

2. DESCENT METHODS

All methods for non-linear optimization are iterative: from a starting point $x_0$ the method produces a series of vectors $x_1, x_2, \ldots$, which (hopefully) converges to $x^*$, a local minimizer for the given function, see Definition 1.3. Most methods have measures which enforce the descending condition
$$F(x_{k+1}) < F(x_k). \tag{2.1}$$
This prevents convergence to a maximizer and also makes it less probable that we converge to a saddle point.
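The residual set-up of Example 1.1 can be sketched numerically. The sketch below is ours, not from the booklet: the helper names `M`, `residuals`, `F` and the parameter vector used to generate synthetic data are assumptions. It illustrates that $F$ vanishes when the data are noise-free and evaluated at the generating parameters, and is positive at any other parameter choice.

```python
import numpy as np

def M(x, t):
    """Fitting model M(x, t) = x3*exp(x1*t) + x4*exp(x2*t) from Example 1.1.
    The text indexes parameters from 1; Python arrays are 0-based."""
    return x[2] * np.exp(x[0] * t) + x[3] * np.exp(x[1] * t)

def residuals(x, t, y):
    """Residuals f_i(x) = y_i - M(x, t_i)."""
    return y - M(x, t)

def F(x, t, y):
    """Cost F(x) = 1/2 * sum of squared residuals (Definition 1.1, here n = 4)."""
    f = residuals(x, t, y)
    return 0.5 * (f @ f)

# Noise-free synthetic data from a known parameter vector (our choice),
# so the generating parameters are the exact least squares minimizer.
x_gen = np.array([-4.0, -5.0, 4.0, -4.0])
t = np.linspace(0.0, 1.0, 30)
y = M(x_gen, t)

print(F(x_gen, t, y))            # 0.0: the residuals vanish at x_gen
print(F(x_gen + 0.1, t, y) > 0)  # True: perturbing the parameters raises F
```

With noisy ordinates ($\varepsilon_i \ne 0$) the minimum value of $F$ is positive and the minimizer $x^*$ differs slightly from the generating parameters.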
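Theorem 1.8 and the remarks after it amount to a recipe: classify a stationary point by the signs of the Hessian's eigenvalues. A minimal sketch of that recipe, with our own helper name `classify` and an assumed tolerance for "numerically zero":

```python
import numpy as np

def classify(H, tol=1e-10):
    """Classify a stationary point x_s from its symmetric Hessian H = F''(x_s)."""
    eig = np.linalg.eigvalsh(H)          # real eigenvalues of a symmetric matrix
    if np.all(eig > tol):
        return "local minimizer"         # positive definite (Theorem 1.8)
    if np.all(eig < -tol):
        return "local maximizer"         # negative definite
    if np.any(eig > tol) and np.any(eig < -tol):
        return "saddle point"            # indefinite
    return "inconclusive"                # semidefinite: higher order terms decide

# F(x) = x1^2 + x2^2 has Hessian diag(2, 2) at its stationary point (0, 0);
# F(x) = x1^2 - x2^2 has Hessian diag(2, -2) at its stationary point (0, 0).
print(classify(np.diag([2.0, 2.0])))   # local minimizer
print(classify(np.diag([2.0, -2.0])))  # saddle point
```

The semidefinite case is genuinely inconclusive at second order, which is why Theorem 1.8 requires strict positive definiteness.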