Elements of Information Theory, Second Edition
Solutions to Problems

Thomas M. Cover
Joy A. Thomas

September 22, 2006

COPYRIGHT 2006
Thomas Cover
Joy Thomas
All rights reserved

Contents

1  Introduction
2  Entropy, Relative Entropy and Mutual Information
3  The Asymptotic Equipartition Property
4  Entropy Rates of a Stochastic Process
5  Data Compression
6  Gambling and Data Compression

Preface

The problems in the book, "Elements of Information Theory, Second Edition", were chosen from the problems used during the course at Stanford. Most of the solutions here were prepared by the graders and instructors of the course. We would particularly like to thank Prof. John Gill, David Evans, Jim Roche, Laura Ekroot and Young Han Kim for their help in preparing these solutions.

Most of the problems in the book are straightforward, and we have included hints in the problem statement for the difficult problems. In some cases, the solutions include extra material of interest (for example, the problem on coin weighing on Pg. 12).

We would appreciate any comments, suggestions and corrections to this Solutions Manual.

Tom Cover                                    Joy Thomas
Durand 121, Information Systems Lab          Stratify
Stanford University                          701 N Shoreline Avenue
Stanford, CA 94305.                          Mountain View, CA 94043.
Ph. 415-723-4505                             Ph. 650-210-2722
FAX: 415-723-8473                            FAX: 650-988-2159
Email: cover@isl.stanford.edu                Email: jat@stratify.com

Chapter 1

Introduction

Chapter 2

Entropy, Relative Entropy and Mutual Information

1. Coin flips. A fair coin is flipped until the first head occurs. Let X denote the number of flips required.

   (a) Find the entropy H(X) in bits. The following expressions may be useful:

       \sum_{n=0}^{\infty} r^n = \frac{1}{1-r}, \qquad \sum_{n=0}^{\infty} n r^n = \frac{r}{(1-r)^2}.

   (b) A random variable X is drawn according to this distribution. Find an "efficient" sequence of yes-no questions of the form, "Is X contained in the set S?" Compare H(X) to the expected number of questions required to determine X.

   Solution:

   (a) The number X of tosses till the first head appears has the geometric distribution with parameter p = 1/2, where P(X = n) = p q^{n-1}, n \in \{1, 2, \ldots\}. Hence the entropy of X is

       H(X) = -\sum_{n=1}^{\infty} p q^{n-1} \log (p q^{n-1})
            = -\left[ \sum_{n=0}^{\infty} p q^n \log p + \sum_{n=0}^{\infty} n p q^n \log q \right]
            = \frac{-p \log p}{1-q} - \frac{p q \log q}{p^2}
            = \frac{-p \log p - q \log q}{p}
            = H(p)/p \text{ bits.}

       If p = 1/2, then H(X) = 2 bits.

   (b) Intuitively, it seems clear that the best questions are those that have equally likely chances of receiving a yes or a no answer. Consequently, one possible guess is that the most "efficient" series of questions is: Is X = 1? If not, is X = 2? If not, is X = 3? ... with a resulting expected number of questions equal to \sum_{n=1}^{\infty} n (1/2^n) = 2. This should reinforce the intuition that H(X) is a measure of the uncertainty of X. Indeed in this case, the entropy is exactly the same as the average number of questions needed to define X, and in general E(\# of questions) \geq H(X). This problem has an interpretation as a source coding problem. Let 0 = no, 1 = yes, X = Source, and Y = Encoded Source. Then the set of questions in the above procedure can be written as a collection of (X, Y) pairs: (1, 1), (2, 01), (3, 001), etc. In fact, this intuitively derived code is the optimal (Huffman) code minimizing the expected number of questions.
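   As a quick numerical sanity check (this snippet is not part of the original solution; the function names and the truncation at 200 terms are choices made here for illustration), the Python sketch below sums the entropy series for the geometric distribution directly, compares it with the closed form H(p)/p, and also computes the expected number of questions from part (b):

```python
# Numerical check of Problem 1 (illustrative, not from the solutions manual).
from math import log2

def geometric_entropy(p: float, terms: int = 200) -> float:
    """Truncated series for H(X) = -sum_n p*q^(n-1) * log2(p*q^(n-1))."""
    q = 1.0 - p
    return -sum(p * q**(n - 1) * log2(p * q**(n - 1)) for n in range(1, terms + 1))

def binary_entropy(p: float) -> float:
    """Binary entropy H(p) = -p log p - (1-p) log(1-p), in bits."""
    q = 1.0 - p
    return -p * log2(p) - q * log2(q)

p = 0.5
print(geometric_entropy(p))      # ~2.0 bits from the (truncated) series
print(binary_entropy(p) / p)     # 2.0 bits from the closed form H(p)/p

# Expected number of the yes/no questions "Is X = 1?", "Is X = 2?", ...
expected_questions = sum(n * p * (1 - p)**(n - 1) for n in range(1, 201))
print(expected_questions)        # ~2.0, matching H(X) exactly when p = 1/2
```

   Both the series and the closed form give 2 bits for p = 1/2, which coincides with the expected number of questions, as argued above.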
2. Entropy of functions. Let X be a random variable taking on a finite number of values. What is the (general) inequality relationship of H(X) and H(Y) if

   (a) Y = 2^X?
   (b) Y = \cos X?

   Solution: Let y = g(x). Then

       p(y) = \sum_{x: y = g(x)} p(x).

   Consider any set of x's that map onto a single y. For this set

       \sum_{x: y = g(x)} p(x) \log p(x) \leq \sum_{x: y = g(x)} p(x) \log p(y) = p(y) \log p(y),

   since log is a monotone increasing function and p(x) \leq \sum_{x: y = g(x)} p(x) = p(y). Extending this argument to the entire range of X (and Y), we obtain

       H(X) = -\sum_x p(x) \log p(x)
            = -\sum_y \sum_{x: y = g(x)} p(x) \log p(x)
            \geq -\sum_y p(y) \log p(y)
            = H(Y),

   with equality iff g is one-to-one with probability one.

   (a) Y = 2^X is one-to-one, and hence the entropy, which is just a function of the probabilities (and not the values of the random variable), does not change, i.e., H(X) = H(Y).

   (b) Y = \cos(X) is not necessarily one-to-one. Hence all that we can say is that H(X) \geq H(Y), with equality if cosine is one-to-one on the range of X.

3. Minimum entropy. What is the minimum value of H(p_1, \ldots, p_n) = H(p) as p ranges over the set of n-dimensional probability vectors? Find all p's which achieve this minimum.

   Solution: We wish to find all probability vectors p = (p_1, p_2, \ldots, p_n) which minimize

       H(p) = -\sum_i p_i \log p_i.

   Now -p_i \log p_i \geq 0, with equality iff p_i = 0 or 1. Hence the only possible probability vectors which minimize H(p) are those with p_i = 1 for some i and p_j = 0, j \neq i. There are n such vectors, i.e., (1, 0, \ldots, 0), (0, 1, 0, \ldots, 0), ..., (0, \ldots, 0, 1), and the minimum value of H(p) is 0.

4. Entropy of functions of a random variable. Let X be a discrete random variable. Show that the entropy of a function of X is less than or equal to the entropy of X by justifying the following steps:

       H(X, g(X)) \overset{(a)}{=} H(X) + H(g(X) \mid X)        (2.1)
                  \overset{(b)}{=} H(X);                        (2.2)
       H(X, g(X)) \overset{(c)}{=} H(g(X)) + H(X \mid g(X))     (2.3)
                  \overset{(d)}{\geq} H(g(X)).                  (2.4)

   Thus H(g(X)) \leq H(X).

   Solution: Entropy of functions of a random variable.

   (a) H(X, g(X)) = H(X) + H(g(X) \mid X) by the chain rule for entropies.

   (b) H(g(X) \mid X) = 0 since for any particular value of X, g(X) is fixed, and hence H(g(X) \mid X) = \sum_x p(x) H(g(X) \mid X = x) = \sum_x 0 = 0.

   (c) H(X, g(X)) = H(g(X)) + H(X \mid g(X)) again by the chain rule.

   (d) H(X \mid g(X)) \geq 0, with equality iff X is a function of g(X), i.e., g(\cdot) is one-to-one. Hence H(X, g(X)) \geq H(g(X)).

   Combining parts (b) and (d), we obtain H(X) \geq H(g(X)).

5. Zero conditional entropy. Show that if H(Y \mid X) = 0, then Y is a function of X, i.e., for all x with p(x) > 0, there is only one possible value of y with p(x, y) > 0.

   Solution: Zero Conditional Entropy. Assume that there exists an x, say x_0, and two different values of y, say y_1 and y_2, such that p(x_0, y_1) > 0 and p(x_0, y_2) > 0. Then p(x_0) \geq p(x_0, y_1) + p(x_0, y_2) > 0, and p(y_1 \mid x_0) and p(y_2 \mid x_0) are not equal to 0 or 1. But then

       H(Y \mid X) = -\sum_x p(x) \sum_y p(y \mid x) \log p(y \mid x)
                   \geq p(x_0) \left( -p(y_1 \mid x_0) \log p(y_1 \mid x_0) - p(y_2 \mid x_0) \log p(y_2 \mid x_0) \right)
                   > 0,

   since -t \log t \geq 0 for 0 \leq t \leq 1 and is strictly positive for t not equal to 0 or 1. This contradicts H(Y \mid X) = 0, so for every x with p(x) > 0 there is exactly one y with p(x, y) > 0, i.e., Y is a function of X.
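   The inequality H(g(X)) \leq H(X) from Problems 2 and 4 can also be checked numerically. The sketch below is not from the solutions manual; the example pmf and helper names are illustrative. It pushes a distribution through g and compares entropies for a one-to-one map (Y = 2^X) and a many-to-one map (Y = \cos X):

```python
# Numerical illustration of Problems 2 and 4: H(g(X)) <= H(X),
# with equality when g is one-to-one on the support of X.
from math import log2, cos

def entropy(pmf: dict) -> float:
    """Entropy in bits of a pmf given as {value: probability}."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def push_forward(pmf: dict, g) -> dict:
    """pmf of Y = g(X): p(y) = sum of p(x) over all x with g(x) = y."""
    out = {}
    for x, p in pmf.items():
        y = g(x)
        out[y] = out.get(y, 0.0) + p
    return out

p_x = {-1: 0.25, 0: 0.25, 1: 0.25, 2: 0.25}   # an arbitrary example distribution
print(entropy(p_x))                                            # H(X) = 2 bits
print(entropy(push_forward(p_x, lambda x: 2**x)))              # one-to-one: 2 bits
print(entropy(push_forward(p_x, lambda x: round(cos(x), 6))))  # cos(-1) = cos(1): 1.5 bits
```

   Here cosine collapses -1 and 1 into one value of Y, so H(Y) drops strictly below H(X), while the one-to-one map 2^X leaves the entropy unchanged.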
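   Similarly, Problem 5 can be illustrated by computing H(Y \mid X) directly from a joint pmf. This is a minimal sketch with made-up joint distributions, not part of the original solution: the conditional entropy comes out to zero exactly when each x with positive probability carries a single y.

```python
# Illustration of Problem 5: H(Y|X) = 0 iff Y is a function of X.
from math import log2

def conditional_entropy(joint: dict) -> float:
    """H(Y|X) in bits for a joint pmf given as {(x, y): probability}."""
    p_x = {}
    for (x, _), p in joint.items():
        p_x[x] = p_x.get(x, 0.0) + p
    h = 0.0
    for (x, _), p in joint.items():
        if p > 0:
            h -= p * log2(p / p_x[x])
    return h

# Y is a deterministic function of X: H(Y|X) = 0.
functional = {(0, 0): 0.5, (1, 1): 0.3, (2, 0): 0.2}
print(conditional_entropy(functional))       # 0.0

# Two y values share the same x, as in the proof's contradiction: H(Y|X) > 0.
non_functional = {(0, 0): 0.25, (0, 1): 0.25, (1, 1): 0.5}
print(conditional_entropy(non_functional))   # 0.5 bits (> 0)
```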