A One-Pass Sequential Monte Carlo Method for Bayesian Analysis of Massive Datasets

Suhrid Balakrishnan† and David Madigan‡
†Department of Computer Science, ‡Department of Statistics
Rutgers University, Piscataway, NJ 08854
August 28, 2004
Second Workshop on Monte Carlo Methods

Practicalities of Bayesian Analysis

• In all but trivial cases, the analytical posterior is unavailable.
• A sequential setup is appealing, but most priors are not conjugate.
• Approximations (Normal/Laplace) may not be feasible.
• MCMC is typically employed. However, MCMC needs to lap repeatedly through the dataset (#laps ≥ length of the chain).
• What if your dataset is too large for this to be feasible?

Problem formulation

• Goal: to compute the expected value of h(θ),

    E(h(\theta) \mid x_1, \ldots, x_N) = \int h(\theta)\, f(\theta \mid x_1, \ldots, x_N)\, d\theta.    (1)

• f(θ|x) is the posterior density of the parameters given the observed data x = x_1, ..., x_N. The Monte Carlo approximation of this expected value, based on M samples θ^1, ..., θ^M from the posterior, is (1/M) \sum_{i=1}^{M} h(\theta^i).
• However, massive data (and model complexity) make it hard to sample from f(θ|x).
• Main ideas: use importance sampling, and set the problem up in a data-sequential manner, i.e. particle filtering [Ridgeway and Madigan, 2002, Chopin, 2002a].

Importance sampling

We cannot sample from the "target density" f(θ|x), but we can sample from a "sampling density" g(θ). Then:

    \int h(\theta)\, f(\theta \mid x_1, \ldots, x_N)\, d\theta
      = \int h(\theta)\, \frac{f(\theta \mid x)}{g(\theta)}\, g(\theta)\, d\theta    (2)
      = \lim_{M \to \infty} \frac{1}{M} \sum_{i=1}^{M} w^i\, h(\theta^i),    (3)

where θ^i is a draw from g(θ) and w^i = f(θ^i|x) / g(θ^i). Since the expected value of w^i under g(θ) is 1, we need only compute the weights up to a constant of proportionality and then normalize:

    \int h(\theta)\, f(\theta \mid x_1, \ldots, x_N)\, d\theta
      = \lim_{M \to \infty} \frac{\sum_{i=1}^{M} w^i\, h(\theta^i)}{\sum_{i=1}^{M} w^i}.    (4)

Sequential formulation

Let the sampling distribution be g(θ) = f(θ | x_1, ..., x_n), where n ≪ N.

Note: we are partitioning the dataset into two disjoint pieces, a manageable portion D_{1:n} = x_1, ..., x_n and the remainder of the data, D_{n+1:N} = x_{n+1}, ..., x_N. The importance weights then simplify to the likelihood of the remaining observations evaluated at each particle:

    w^i = f(\theta^i \mid x) / g(\theta^i) = f(\theta^i \mid D_{1:N}) / f(\theta^i \mid D_{1:n})    (5)
        = \frac{f(D_{1:N} \mid \theta^i)\, f(\theta^i)}{f(D_{1:N})} \cdot \frac{f(D_{1:n})}{f(D_{1:n} \mid \theta^i)\, f(\theta^i)}    (6)
        \propto f(D_{n+1:N} \mid \theta^i) = \prod_{x_j \in D_{n+1:N}} f(x_j \mid \theta^i).    (7)

Formulation contd.

1. Load as much data into memory as possible to form D_{1:n}.
2. Draw M times from f(θ | D_{1:n}) via Monte Carlo or Markov chain Monte Carlo.
3. Iterate through the remaining observations (those that comprise D_{n+1:N}). For each observation x_j, update the (log-)weights on all of the draws from f(θ | D_{1:n}); a code sketch of this loop follows the slide.

       Set j = n + 1.
       While j ≤ N:
           for i in 1, ..., M do
               w^i ← w^i × f(x_j | θ^i)
           j ← j + 1

• Are we done?
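To make step 3 concrete, here is a minimal Python sketch of the one-pass weight update (not from the original slides). The particles array, the data_stream iterable over D_{n+1:N}, and the model-specific log_likelihood(x_j, theta_i) function are assumed inputs; the sketch accumulates log-weights, which is how the weight update in step 3 is usually done in practice for numerical stability.

```python
import numpy as np

def one_pass_weights(particles, data_stream, log_likelihood):
    """Accumulate importance weights for each particle in a single pass
    over the remaining data D_{n+1:N}, following Eq. (7).

    particles      : length-M sequence of draws from f(theta | D_{1:n})
    data_stream    : iterable over the observations x_{n+1}, ..., x_N
    log_likelihood : assumed model-specific function, (x_j, theta_i) -> log f(x_j | theta_i)
    """
    log_w = np.zeros(len(particles))        # log-weights start at log(1) = 0
    for x_j in data_stream:                 # each remaining observation is read exactly once
        for i, theta_i in enumerate(particles):
            log_w[i] += log_likelihood(x_j, theta_i)
    w = np.exp(log_w - log_w.max())         # subtract the max before exponentiating
    return w / w.sum()                      # normalized weights, as used in Eq. (4)

def weighted_estimate(h, particles, weights):
    """Self-normalized importance-sampling estimate of E(h(theta) | x), Eq. (4)."""
    return sum(w_i * h(theta_i) for w_i, theta_i in zip(weights, particles))
```

The pass costs O(M · (N − n)) likelihood evaluations but reads each remaining observation only once, which is the one-pass property the title refers to.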
Degeneracy

• If all we do is re-weight existing particles, sample degeneracy quickly becomes an issue.

[Figure 1: Comparison of f(θ | D_{1:n}, D_{n+1:N}) (dashed) and f(θ | D_{1:n}) (solid)]

    \mathrm{Var}(\theta \mid D_{1:n}) = E\big(\mathrm{Var}(\theta \mid D_{1:n}, D_{n+1:N})\big) + \mathrm{Var}\big(E(\theta \mid D_{1:n}, D_{n+1:N})\big)

That is, the sampling density f(θ | D_{1:n}) is on average more diffuse than the full-data posterior, so only a few particles land where the target has appreciable mass.

Further illustrating degeneracy

• Images from "Tutorial on Particle Filters" by Keith Copsey.

Sequential Monte Carlo Ideas

• Fight sample degeneracy by resampling [Gordon et al., 1993, Kitagawa, 1996] and rejuvenating particles (we will make precise exactly when a little later...) using a move step [Gilks and Berzuini, 2001]. This is a single Metropolis-Hastings step, conditioned on all the data seen thus far, x′:

1. Draw a proposal θ′ from q(θ | θ^{i-1}).
2. Compute the acceptance probability

       \alpha(\theta', \theta^{i-1}) = \min\!\left(1,\ \frac{f(\theta' \mid x')\, q(\theta^{i-1} \mid \theta')}{f(\theta^{i-1} \mid x')\, q(\theta' \mid \theta^{i-1})}\right).    (8)

3. With probability α(θ′, θ^{i-1}) set θ^i = θ′; otherwise set θ^i = θ^{i-1}.

Remaining issues

1. When should this resample-move step be applied?
2. This is a very expensive step!

• 1. Monitor the Effective Sample Size (ESS) [Kong et al., 1994, Liu, 2001]. The ESS is the number of independent samples to which the current weighted particle set is roughly equivalent; a sketch of the ESS-monitored resample-move step follows below.
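As a rough illustration of the ESS-monitored resample-move step, here is a hedged Python sketch (not the authors' implementation). effective_sample_size uses the standard (Σw)² / Σw² estimate; resample_move performs multinomial resampling followed by a single Metropolis-Hastings move per particle as in Eq. (8). The callables log_post (log f(θ | x′), conditioning on all data seen so far), propose (a draw from q(· | θ)), and log_q (log q(· | ·)) are assumed, model-specific inputs.

```python
import numpy as np

def effective_sample_size(weights):
    """ESS = (sum w)^2 / sum(w^2); works for normalized or unnormalized weights."""
    w = np.asarray(weights, dtype=float)
    return w.sum() ** 2 / np.sum(w ** 2)

def resample_move(particles, weights, log_post, propose, log_q, rng=None):
    """One resample-move rejuvenation pass over the particle set.

    log_post : assumed callable, theta -> log f(theta | x') given all data seen so far
    propose  : assumed callable, theta -> theta', a draw from the proposal q(. | theta)
    log_q    : assumed callable, (theta_to, theta_from) -> log q(theta_to | theta_from)
    """
    rng = rng or np.random.default_rng()
    M = len(particles)
    w = np.asarray(weights, dtype=float)
    # Resample: draw M indices with probability proportional to the current weights.
    idx = rng.choice(M, size=M, p=w / w.sum())
    new_particles = []
    for theta in (particles[k] for k in idx):
        theta_prop = propose(theta)
        # Metropolis-Hastings acceptance ratio of Eq. (8), computed in log-space.
        log_alpha = (log_post(theta_prop) + log_q(theta, theta_prop)
                     - log_post(theta) - log_q(theta_prop, theta))
        new_particles.append(theta_prop if np.log(rng.uniform()) < log_alpha else theta)
    # After resampling and moving, all particles carry equal weight again.
    return new_particles, np.full(M, 1.0 / M)
```

A common trigger, in line with the ESS-monitoring bullet above, is to invoke resample_move only when the ESS drops below some fraction of M (e.g. M/2); the specific threshold is a tuning choice rather than something fixed by the slides. Because log_post must revisit all the data seen so far, the move step is indeed the expensive part flagged under "Remaining issues".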