Deconvolutional Networks
Matthew D. Zeiler, Graham W. Taylor, Rob Fergus
Dept. of Computer Science, Courant Institute, New York University

Overview
• "Generative" image model
• Convolutional form of sparse coding
• Pooling with latent variables (what/where)
 – Integrated into cost function (differentiable)
• Learn features for object recognition

Talk Overview
• Single layer
 – Convolutional sparse coding
 – Gaussian pooling
• Multiple layers
 – Multi-layer inference
 – Filter learning
• Related work
• Experiments

Recap: Sparse Coding (Patch-based)
• Over-complete linear decomposition of input y using dictionary D:
  C(y, D) = argmin_p (λ/2)||Dp − y||²₂ + |p|₁
• L1 regularization yields solutions with few non-zero elements
• Output is a sparse vector: p = [0, 0.3, 0, ..., 0.5, ..., 0.2, ..., 0]

Convolutional Networks [LeCun et al. 1998]
• Feed-forward:
 – Convolve input (learned filters)
 – Non-linearity
 – Pooling
• Supervised
• Encoder-only
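The patch-based sparse-coding recap above can be illustrated with a minimal ISTA solver. This is a sketch, not the authors' code: the function name `ista_sparse_code` and the fixed step size are assumptions for illustration; it minimizes the objective (λ/2)||Dp − y||²₂ + |p|₁ by alternating a gradient step on the quadratic reconstruction term with a soft-shrinkage step on the L1 term.

```python
import numpy as np

def ista_sparse_code(y, D, lam=0.1, n_steps=100):
    """Minimal ISTA sketch for  min_p (lam/2) ||D p - y||^2 + |p|_1."""
    # Step size 1/L, where L is the Lipschitz constant of the smooth term.
    L = lam * np.linalg.norm(D, 2) ** 2
    p = np.zeros(D.shape[1])
    for _ in range(n_steps):
        grad = lam * D.T @ (D @ p - y)        # gradient step on reconstruction term
        p = p - grad / L
        # Soft shrinkage: gradient step on the L1 sparsity term.
        p = np.sign(p) * np.maximum(np.abs(p) - 1.0 / L, 0.0)
    return p
```

With an identity dictionary the solver reduces to per-element soft thresholding of y, which makes the sparsifying effect of the L1 penalty easy to see.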
Deconvolutional Networks [Zeiler et al. CVPR '10, ICCV '11]
• Feed-back:
 – Unpool feature maps (using inferred latent variables)
 – Convolve unpooled maps (learned filters)
• Unsupervised:
 – Must reconstruct input
 – Sparsity constraint
• Decoder-only:
 – Have to infer features

Single Layer Architecture
[Figure: input image y reconstructed top-down through 1st-layer filters f, unpooled maps z_k,1, unpooling variables θ_k,1, and feature maps p_k,1 under an L0.5 sparsity penalty]
• Decomposition of input image
• Over-complete → per-element sparsity constraint

Gaussian Unpooling
• Each unpooling region N has its own 2D Gaussian, with (un)pooling variables θ: mean (µx, µy) and precision (Λx, Λy)
• Gaussian weights scaled by feature map activation
• Differentiable representation
• Feature map p = "what"; unpooling variables θ = "where"

Single Layer Cost Function
  C(y) = (λ/2)||F U_θ p − y||²₂ + |p|_α
  ŷ = Σ_k z_k * f_k,  z_k = U_θ p_k  (k = feature map index)
• Reconstruction error of the input image plus a per-element sparsity penalty on the feature maps
• Inferred for each image: feature maps p, Gaussian unpooling parameters θ
• Learned (shared across all images): filters F

Single Layer Inference
• Feature maps p (what):
 – Fix convolution filters f and pooling variables θ
 – Convolutional form of sparse coding
 – Use ISTA [Beck & Teboulle, SIAM J. Imaging Sciences 2009]:
  • Gradient step on reconstruction term (quadratic)
  • Gradient step on sparsity term (shrinkage)
  • Project to be non-negative
• Pooling variables θ (where):
 – Fix convolution filters f and feature maps p
 – Chain rule of derivatives to update mean & precision of each Gaussian pooling neighborhood

Single Layer Example
[Figure: 16 feature maps and their unpooled feature maps]
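The Gaussian unpooling step described above can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the function name `gaussian_unpool` and the array layout are assumptions. Each pooling region of the output receives a normalized 2D Gaussian placed at the inferred mean with the inferred precision ("where"), scaled by the pooled activation ("what"), so the whole operation is differentiable in both p and θ.

```python
import numpy as np

def gaussian_unpool(p, mu, prec, region=4):
    """Sketch of Gaussian unpooling (assumed interface).

    p:    pooled feature map, shape (H, W)           -- the "what"
    mu:   per-region Gaussian means, shape (H, W, 2)  -- the "where"
    prec: per-region precisions, shape (H, W, 2)
    Returns the unpooled map z of shape (H*region, W*region).
    """
    H, W = p.shape
    z = np.zeros((H * region, W * region))
    gx, gy = np.meshgrid(np.arange(region), np.arange(region), indexing="ij")
    for i in range(H):
        for j in range(W):
            # 2D axis-aligned Gaussian over this unpooling region.
            w = np.exp(-0.5 * (prec[i, j, 0] * (gx - mu[i, j, 0]) ** 2
                               + prec[i, j, 1] * (gy - mu[i, j, 1]) ** 2))
            w /= w.sum()  # normalize, so the activation's total mass is preserved
            z[i*region:(i+1)*region, j*region:(j+1)*region] = p[i, j] * w
    return z
```

Because the weights in each region are normalized, the unpooled map distributes exactly the pooled activation across the region; a high precision concentrates it near the mean, a low precision spreads it out.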
[Figure: input image, unpooling, reconstruction, filters, convolution & sum]

Effect of Pooling Variables
[Figure: pixel-space projections of sample feature map activations; filter coefficients]

Reconstruction Examples

Talk Overview
• Single layer
 – Convolutional sparse coding
 – Gaussian pooling
• Multiple layers
 – Multi-layer inference
 – Filter learning
• Related work
• Experiments

Stacking the Layers
• Take pooled maps as input to the next layer
• Joint inference over all layers
 – Only possible with differentiable pooling
• Objective is reconstruction error of the input image
 – Not of the layer below, as in most deep models
• Sparsity & pooling make the model non-linear
 – No explicit non-linearities (e.g. sigmoid)

Overall Architecture (2 layers)
[Figure: input image y reconstructed through Conv Layer 1/2 and Unpool Layer 1/2, with filters f, unpooling variables θ, feature maps p at each layer, and L0.5 sparsity penalties]

Multi-layer Joint Inference
• Consider layer 2 inference:
 – Want to minimize reconstruction error ||ŷ − y||²₂ of the input image, subject to sparsity
 – Update feature maps at the top (p2) and pooling variables (θ1, θ2) from both layers
• Update features (ISTA):
 1. Reconstruct input
 2. Compute error
 3. Forward propagate error
 4. Gradient step
 5. Shrinkage
• Update Gaussian pooling variables: chain rule to combine top-down with bottom-up error
• No explicit non-linearities between layers
 – But still get very non-linear behavior
[Figure: layer 1/2 feature maps p, unpooled features z, unpooling operators U_θ, P_θ, filter operators F1, F2 and their transposes, L0.5 sparsity]

Filter Learning
  ∂C/∂F = λ zᵀ(Fz − y) → 0
• Goal: update convolutional filters f
• Feature maps p & pooling variables θ fixed, from inference on all training images
• Over-constrained least-squares problem
• Use conjugate gradients
• Normalize filters to unit L2 length & project positive
• Learn filters layer-by-layer
 – Joint training doesn't seem to work

Two Layer Example
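The filter-learning step can be sketched in matrix form. This is an assumption-laden illustration, not the authors' code: `update_filters` is a hypothetical name, and writing the fixed features as an explicit matrix Z (rows = output pixels, columns = filter coefficients) stands in for the real convolutional operator. With z fixed, setting ∂C/∂F = 0 gives an over-constrained least-squares problem, solved here by conjugate gradients on the normal equations, followed by the positivity projection and unit-L2 normalization mentioned on the slide.

```python
import numpy as np

def update_filters(Z, y, n_cg=50):
    """Sketch of the filter update: solve  min_f ||Z f - y||^2
    by conjugate gradients on the normal equations  Z^T Z f = Z^T y,
    then project positive and normalize to unit L2 length."""
    A = Z.T @ Z
    b = Z.T @ y
    f = np.zeros(Z.shape[1])
    r = b - A @ f          # residual
    d = r.copy()           # search direction
    for _ in range(n_cg):
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad + 1e-12)
        f = f + alpha * d
        r_new = r - alpha * Ad
        beta = (r_new @ r_new) / (r @ r + 1e-12)
        d = r_new + beta * d
        r = r_new
    f = np.maximum(f, 0.0)            # project positive
    f /= np.linalg.norm(f) + 1e-12    # normalize to unit L2 length
    return f
```

Conjugate gradients only needs matrix-vector products with ZᵀZ, which is why it suits the real convolutional case, where Z is never formed explicitly.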