Digital Image Processing
Chapter 8: Image Compression
11 August 2006

Data vs Information
Information is the substance; data are the means by which information is conveyed. Image compression: reducing the amount of data required to represent a digital image while keeping as much of the information as possible.

Relative Data Redundancy and Compression Ratio
Compression ratio: C_R = n1 / n2, where n1 and n2 are the numbers of information-carrying units (e.g. bits) in the original and compressed representations.
Relative data redundancy: R_D = 1 - 1 / C_R

Types of data redundancy
1. Coding redundancy
2. Interpixel redundancy
3. Psychovisual redundancy

Coding Redundancy
Different coding methods yield different amounts of data needed to represent the same information.
Example, variable-length vs. fixed-length coding: fixed-length coding gives Lavg = 3 bits/symbol, while variable-length coding gives Lavg = 2.7 bits/symbol.
(Images from Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition.)

Variable-Length Coding
Concept: assign the longest code word to the symbol with the least probability of occurrence.

Interpixel Redundancy
Parts of an image are highly correlated; in other words, a given pixel can be predicted from its neighbors.

Run-Length Coding
Example: a binary image of size 343 x 1024 pixels requires 343 x 1024 x 1 = 351,232 bits uncompressed.
Run-length coding of line no. 100: (1,63)(0,87)(1,37)(0,5)(1,4)(0,556)(1,62)(0,210)
The whole image contains 12,166 runs; at 11 bits per run, the total is 133,826 bits.

Psychovisual Redundancy
The eye does not respond with equal sensitivity to all visual information.
[Figure: 8-bit grayscale image vs. 4-bit grayscale image, which shows false contours, vs. 4-bit IGS image.]

Improved Gray-Scale (IGS) Quantization

Pixel  Gray level  Sum       IGS code
i-1    N/A         00000000  N/A
i      01101100    01101100  0110
i+1    10001011    10010111  1001
i+2    10000111    10001110  1000
i+3    11110100    11110100  1111

Algorithm:
1. Add the least significant 4 bits of the previous value of Sum to the current 8-bit pixel. If the most significant 4 bits of the pixel are 1111, add 0000 instead. Keep the result in Sum.
2. Keep only the most significant 4 bits of Sum as the IGS code.

Fidelity Criteria: how good is the compression algorithm?
- Objective fidelity criteria: RMSE, PSNR
- Subjective fidelity criteria: human rating

Image Compression Models
f(x,y) -> Source encoder -> Channel encoder -> Channel (noise) -> Channel decoder -> Source decoder -> f^(x,y)
The source encoder reduces data redundancy; the channel encoder increases noise immunity.

Source Encoder and Decoder Models
Source encoder: Mapper (reduces interpixel redundancy) -> Quantizer (reduces psychovisual redundancy) -> Symbol encoder (reduces coding redundancy)
Source decoder: Symbol decoder -> Inverse mapper
Channel encoder and decoder: Hamming code, turbo code, ...

Information Theory
Measuring information: I(E) = log(1 / P(E)) = -log P(E)
Entropy (uncertainty), the average information per symbol: H = -Σ_j P(a_j) log P(a_j)

A Simple Information System: the Binary Symmetric Channel
Source: A = {a1, a2} = {0, 1} with probabilities z = [P(a1), P(a2)]
Destination: B = {b1, b2} = {0, 1} with probabilities v = [P(b1), P(b2)]
Pe = probability of error; each transmitted bit is flipped with probability Pe, so
P(b1) = P(a1)(1 - Pe) + (1 - P(a1))Pe
P(b2) = (1 - P(a1))(1 - Pe) + P(a1)Pe

Source entropy: H(z) = -P(a1) log2 P(a1) - P(a2) log2 P(a2)
Conditional entropies:
H(z|b1) = -P(a1|b1) log2 P(a1|b1) - P(a2|b1) log2 P(a2|b1)
H(z|b2) = -P(a1|b2) log2 P(a1|b2) - P(a2|b2) log2 P(a2|b2)
H(z|v) = P(b1) H(z|b1) + P(b2) H(z|b2)
Mutual information: I(z,v) = H(z) - H(z|v)
Capacity: C = max_z I(z,v)

Binary Symmetric Channel (closed form)
Let pe be the probability of error and pbs = P(a1), so that
z = [pbs, 1 - pbs]
v = [pbs(1 - pe) + (1 - pbs)pe, (1 - pbs)(1 - pe) + pbs pe]
With the binary entropy function H_bs(p) = -p log2 p - (1 - p) log2 (1 - p):
H(z) = H_bs(pbs)
I(z,v) = H_bs(pbs pe + (1 - pbs)(1 - pe)) - H_bs(pe)
C = 1 - H_bs(pe), achieved at pbs = 1/2.

Communication System Model
Two cases to be considered: noiseless and noisy.

Noiseless Coding Theorem
Problem: how to code data as compactly as possible?
Shannon's first theorem defines the minimum average code word length per source symbol that can be achieved.
Let the source be {A, z}, a zero-memory source with J symbols (zero memory: each outcome is independent of the other outcomes). The set of n-element source outputs (the n-th extension) is A' = {α1, α2, α3, ..., α_(J^n)}.
Example: for A = {0, 1} and n = 3, A' = {000, 001, 010, 011, 100, 101, 110, 111}.

Noiseless Coding Theorem (cont.)
The probability of each αj is P(αj) = P(aj1) P(aj2) ... P(ajn).
Entropy of the extended source: H(z') = -Σ_(i=1)^(J^n) P(αi) log P(αi) = n H(z)
Each code word length l(αi) can be chosen so that
log(1 / P(αi)) <= l(αi) < log(1 / P(αi)) + 1
Then the average code word length satisfies
Σ_i P(αi) log(1 / P(αi)) <= L'avg = Σ_i P(αi) l(αi) < Σ_i P(αi) log(1 / P(αi)) + 1

Noiseless Coding Theorem (cont.)
Since H(z') = n H(z), the bounds above give H(z') <= L'avg < H(z') + 1, so
H(z) <= L'avg / n < H(z) + 1/n, and in the limit lim_(n->inf) L'avg / n = H(z)
The minimum average code word length per source symbol cannot be lower than the entropy.
Coding efficiency: η = n H(z) / L'avg

Extension Coding Example
First extension: H = 0.918, Lavg = 1, η = 0.918 / 1 = 0.918
Second extension: H = 1.83, Lavg = 1.89, η = 1.83 / 1.89 = 0.97

Noisy Coding Theorem
Problem: how to code data as reliably as possible?
Example: repeat each code 3 times. Source data = {1, 0, 0, 1,
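The run-length scheme above can be sketched in a few lines. This is a minimal illustration, not the book's implementation; the 11-bits-per-run figure is taken to mean 1 bit for the pixel value plus 10 bits for a run length of up to 1024.

```python
# Run-length coding of one binary image row, as in the line-100 example
# above. Each run is stored as a (value, length) pair.

def run_length_encode(row):
    """Return a list of (value, run_length) pairs for a 1-D sequence."""
    runs = []
    for v in row:
        if runs and runs[-1][0] == v:
            runs[-1] = (v, runs[-1][1] + 1)
        else:
            runs.append((v, 1))
    return runs

# Rebuild line 100 from the runs given above and re-encode it.
line_100 = [(1, 63), (0, 87), (1, 37), (0, 5), (1, 4),
            (0, 556), (1, 62), (0, 210)]
row = [v for v, n in line_100 for _ in range(n)]
assert len(row) == 1024                      # the row spans the full width
assert run_length_encode(row) == line_100    # round-trips exactly
```

At 11 bits per run this row costs 8 * 11 = 88 bits instead of 1024.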
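The two-step IGS algorithm above can be checked directly against the worked table. A minimal sketch (the function name is mine):

```python
# Improved Gray-Scale (IGS) quantization: add the low 4 bits of the
# previous Sum to the current 8-bit pixel (or 0000 if the pixel's high
# 4 bits are 1111), then keep the high 4 bits of the new Sum.

def igs_quantize(pixels):
    """Map a sequence of 8-bit gray levels to 4-bit IGS codes."""
    total = 0  # the running "Sum" of the algorithm, starts at 00000000
    codes = []
    for p in pixels:
        carry = 0 if (p >> 4) == 0b1111 else (total & 0b1111)
        total = p + carry
        codes.append(total >> 4)
    return codes

# The pixels i .. i+3 from the table above:
pixels = [0b01101100, 0b10001011, 0b10000111, 0b11110100]
print([format(c, "04b") for c in igs_quantize(pixels)])
# prints ['0110', '1001', '1000', '1111']
```

The 1111 special case prevents overflow: any pixel whose high nibble is below 1111 can absorb the 4-bit carry without exceeding 8 bits.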
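The objective fidelity criteria mentioned above (RMSE, PSNR) reduce to two short formulas. A plain-Python sketch, assuming 8-bit images with peak value 255; a practical implementation would operate on NumPy arrays:

```python
# Objective fidelity: RMSE between the original f(x,y) and the
# decompressed approximation f^(x,y), and the PSNR it implies.
import math

def rmse(f, f_hat):
    """Root-mean-square error between two flattened images."""
    n = len(f)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(f, f_hat)) / n)

def psnr(f, f_hat, peak=255):
    """Peak signal-to-noise ratio in dB (infinite for identical images)."""
    e = rmse(f, f_hat)
    return float("inf") if e == 0 else 20 * math.log10(peak / e)

print(round(psnr([100, 200], [101, 199]), 2))  # an RMSE of 1 gives 48.13 dB
```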
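The closed-form binary symmetric channel quantities above translate directly into code. A small sketch (function names are mine):

```python
# Binary symmetric channel: binary entropy H_bs, mutual information
# I(z,v) for source probability p_bs = P(a1) and error probability pe,
# and capacity C = 1 - H_bs(pe).
import math

def h_bs(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information(p_bs, pe):
    # I(z,v) = H_bs(p_bs*pe + (1 - p_bs)*(1 - pe)) - H_bs(pe)
    return h_bs(p_bs * pe + (1 - p_bs) * (1 - pe)) - h_bs(pe)

def capacity(pe):
    # The maximum over p_bs occurs at p_bs = 1/2.
    return 1 - h_bs(pe)

print(capacity(0.0))  # 1.0: a noiseless channel carries 1 bit per use
print(capacity(0.5))  # 0.0: pure noise, no information gets through
```

Note that `mutual_information(0.5, pe)` equals `capacity(pe)` for any `pe`, confirming that the uniform source achieves the maximum.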
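The extension coding numbers above are consistent with a two-symbol zero-memory source with P(a1) = 2/3, P(a2) = 1/3; those probabilities are an assumption here (they reproduce H = 0.918 bits/symbol), as are the block code lengths 1, 2, 3, 3 for the second extension. A quick check:

```python
# Verify the extension coding example: entropy of the source, entropy of
# its second extension, and the coding efficiency eta = n*H(z) / L'avg.
import math
from itertools import product

def entropy(probs):
    """Entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

z = [2 / 3, 1 / 3]                 # assumed source probabilities
print(round(entropy(z), 3))        # 0.918 bits/symbol

# Second extension: blocks of n = 2 symbols, so H(z') = 2 * H(z).
z2 = [p * q for p, q in product(z, z)]   # probabilities 4/9, 2/9, 2/9, 1/9
print(round(entropy(z2), 3))       # 1.837 (the slide rounds to 1.83)

# Assumed code lengths 1, 2, 3, 3 for the blocks, most probable first:
lavg = sum(p * l for p, l in zip(sorted(z2, reverse=True), [1, 2, 3, 3]))
print(round(lavg, 2))              # 1.89 bits per block
print(round(2 * entropy(z) / lavg, 2))   # efficiency 0.97
```

Coding blocks of two symbols raises the efficiency from 0.918 (one code bit per symbol) to 0.97, illustrating why L'avg/n approaches H(z) as n grows.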