EEE315 Information Theory and Coding
Assignment 1
Date Performed: 2011.11.4
Date Submitted: 2011.11.5

1 Introduction
Information theory answers two fundamental questions in communication theory: what is the ultimate data compression, and what is the ultimate transmission rate of communication? These two aspects can be regarded as the entropy H and the channel capacity C. In the early 1940s, Shannon showed that random processes have an irreducible complexity below which the signal cannot be compressed, and he named this quantity entropy. He also argued that if the entropy of the source is less than the capacity of the channel, then asymptotically error-free communication can be achieved.

Shannon's information content
Shannon's information content (SIC) is also called self-information. In information theory, it is a measure of the information content contained in a single event. By definition, the amount of SIC contained in a probabilistic event depends only on the probability of that event, and SIC has an inverse relationship with probability. The natural measure of the uncertainty of an event X is its probability, denoted p_x. The information content of an event is therefore defined as

Info{X} = -\log p_x

The measure of information has some intuitive properties:
1. The information contained in an event ought to be defined in terms of some measure of the uncertainty of the event.
2. Less certain events ought to contain more information than more certain events.
3. The information of unrelated events taken as a single event should equal the sum of the information of the individual events.
The unit of SIC is "bits" if base 2 is used for the logarithm, and "nats" if the natural logarithm is used.

Entropy
The entropy quantifies the expected value of the information contained in a message. The entropy can be viewed as:
1. A measure of the minimum cost needed to send some form of information.
2. The "amount of surprise" of the information, measured in bits.
3. How much energy it is worth spending to carry the information, which translates to the minimum number of bits needed to code the information.
The entropy of a discrete source is defined as

H(X) = -\sum_{x} p(x) \log_2 p(x)

It can be viewed from a number of perspectives:
1. The average SIC of X.
2. The amount of information gained once its value is known.
3. The average number of binary questions needed to find out its value, which lies in [H(X), H(X)+1].
Entropy is quantified in nats or bits. If the source is continuous, the (differential) entropy can be written as

h(X) = -\int p(x) \log p(x) \, dx

Mutual Information
The mutual information of two random variables is a quantity that measures the mutual dependence of the two random variables. It is the reduction in the uncertainty of one random variable due to the knowledge of the other. The most common unit of measurement of mutual information is the bit, obtained when logarithms to base 2 are used.
Consider two random variables X and Y. The mutual information I(X;Y) is the relative entropy between the joint distribution P(X,Y) and the product distribution P(X)P(Y):

I(X;Y) = \sum_{x} \sum_{y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)}

In the case of continuous random variables, the mutual information is

I(X;Y) = \int \int p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)} \, dx \, dy

where p(x) and p(y) are the marginal probability density functions of X and Y respectively.
Mutual information can be equivalently expressed as

I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) = H(X) + H(Y) - H(X,Y)

The relationship between mutual information and the various entropies is shown in Fig. 1.

Fig. 1 Individual (H(X), H(Y)), joint (H(X,Y)) and conditional entropies for a pair of correlated subsystems X, Y with mutual information I(X;Y).
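To make the discrete definition above concrete, the following short MATLAB sketch (an illustrative addition, not part of the assignment code; the joint matrix Pxy is an arbitrary example with no zero entries) computes I(X;Y) directly from a joint probability matrix and cross-checks it against the identity I(X;Y) = H(X) + H(Y) - H(X,Y).

% Example joint distribution of (X,Y); rows index x, columns index y.
Pxy = [0.3 0.2; 0.1 0.4];              % entries must sum to 1
Px  = sum(Pxy, 2);                     % marginal of X (column vector)
Py  = sum(Pxy, 1);                     % marginal of Y (row vector)

% I(X;Y) = sum_x sum_y p(x,y) * log2( p(x,y) / (p(x)*p(y)) )
terms = Pxy .* log2(Pxy ./ (Px * Py)); % Px*Py is the outer product of the marginals
I = sum(terms(:));

% Cross-check using I(X;Y) = H(X) + H(Y) - H(X,Y)
Hx  = -sum(Px  .* log2(Px));
Hy  = -sum(Py  .* log2(Py));
Hxy = -sum(Pxy(:) .* log2(Pxy(:)));
I_check = Hx + Hy - Hxy;

For this example both expressions give about 0.12 bits; the agreement simply restates the relationship between mutual information and the entropies shown in Fig. 1.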
Question 1
The code for the first question is as follows:

function [entropy] = findEntropy(array)
% Entropy (in bits) of a discrete source whose symbol probabilities
% are stored in 'array'. All probabilities are assumed to be nonzero.
L = length(array);
entropy = 0.0;
for i = 1:L
    entropy = entropy + array(i)*log2(1/array(i));
end
disp('The entropy for the source is');
disp(entropy);
end

The running result is shown as follows:
[Screenshot of the MATLAB command-window output.]

Question 2
By the definition of channel capacity, the information channel capacity of a discrete memoryless channel is

C = \max_{p(x)} I(X;Y)

where the maximum is taken over all possible input distributions p(x).
The mutual information between X and Y is

I(X;Y) = \sum_{x} \sum_{y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)}
       = \sum_{x} p(x) \sum_{y} p(y|x) \log_2 \frac{p(y|x)}{p(y)}
       = P(X=0)P(Y=0|X=0) \log_2 \frac{P(Y=0|X=0)}{P(Y=0)} + P(X=1)P(Y=0|X=1) \log_2 \frac{P(Y=0|X=1)}{P(Y=0)}
       + P(X=0)P(Y=1|X=0) \log_2 \frac{P(Y=1|X=0)}{P(Y=1)} + P(X=1)P(Y=1|X=1) \log_2 \frac{P(Y=1|X=1)}{P(Y=1)}

Fig. 2 illustrates that

P(Y=1|X=0) = P(Y=0|X=1) = p,   P(Y=0|X=0) = P(Y=1|X=1) = 1 - p

These equations reveal that the mutual information is maximized by the uniformly distributed input alphabet, P(X=0) = P(X=1) = 0.5. Therefore the capacity of the BSC is

C = \max_{p(x)} I(X;Y) = 1 + p \log_2 p + (1-p) \log_2 (1-p) = 1 - H(p)

where H(p) is the binary entropy function.

Fig. 2 The binary symmetric channel with crossover probability p.

The MATLAB code for this question is

p = 0:0.001:1;
% Note: at p = 0 and p = 1 the products 0*log2(0) evaluate to NaN in MATLAB,
% so those two endpoints are not plotted; with the convention 0*log2(0) = 0
% the capacity there is 1.
C = 1 + (1-p).*log2(1-p) + p.*log2(p);
plot(p, C)
xlabel('Cross probability p');
ylabel('The capacity of a BSC');
grid on

The result is shown in Fig. 3.

Fig. 3 The capacity of a BSC with crossover probability p, plotted as a function of p for 0 ≤ p ≤ 1.

Question 3
Based on the four-term expansion of I(X;Y) derived in the previous question, and given the channel transition probabilities

P(Y=0|X=1) = 0.1,   P(Y=1|X=0) = 0.2,

it follows that P(Y=1|X=1) = 0.9 and P(Y=0|X=0) = 0.8. Moreover, writing P(X=1) = p and P(X=0) = 1 - p, the output probabilities are

P(Y=1) = 0.2 + 0.7p,   P(Y=0) = 0.8 - 0.7p.

Inserting these values into the mutual information expansion gives

I(X;Y) = 0.8(1-p) \log_2 \frac{0.8}{0.8-0.7p} + 0.1p \log_2 \frac{0.1}{0.8-0.7p} + 0.2(1-p) \log_2 \frac{0.2}{0.2+0.7p} + 0.9p \log_2 \frac{0.9}{0.2+0.7p}

The MATLAB code for this question is

p = 0:0.05:1;
t1 = 0.8*(1-p).*log2(0.8./(0.8-0.7*p));
t2 = 0.1*p.*log2(0.1./(0.8-0.7*p));
t3 = 0.2*(1-p).*log2(0.2./(0.2+0.7*p));
t4 = 0.9*p.*log2(0.9./(0.2+0.7*p));
I = t1 + t2 + t3 + t4;
plot(p, I)
grid on

The corresponding result is shown in Fig. 4.

Fig. 4 Mutual information between the input and the output of the channel as a function of p, where p is the probability of transmitting a '1'.

Discussion
Question 2 can be regarded as a symmetric special case of the binary channel analysis in Question 3, since its channel is a binary symmetric channel. According to the second question, the capacity is lowest when the crossover probability p is 0.5, where C is zero; at the two ends of the p-axis (p = 0 and p = 1), C reaches its top value of 1. For the third question, the figure is similar to the previous one but inverted: when p is 0.5 the mutual information reaches its maximum value of about 0.4, and when p is 0 or 1 the mutual information I is zero.

Conclusion
The assignment first stated the concepts of Shannon's information content, entropy, and mutual information, and then applied them to compute the entropy of a discrete source, the capacity of a binary symmetric channel, and the mutual information of a binary non-symmetric channel as a function of the input distribution.
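As a supplementary check on the figures quoted in the Discussion (this sketch is an addition and was not part of the submitted assignment code), the capacity of the Question 3 channel can be estimated by evaluating I(X;Y) on a finer grid of input probabilities and taking the maximum; the result is roughly 0.4 bits for an input probability close to 0.5, consistent with Fig. 4.

% Evaluate I(X;Y) of the Question 3 channel on a fine grid of p = P(X = 1)
% and locate the maximizing input probability, i.e. estimate the capacity.
p = 0:0.001:1;
I = 0.8*(1-p).*log2(0.8./(0.8-0.7*p)) + 0.1*p.*log2(0.1./(0.8-0.7*p)) ...
  + 0.2*(1-p).*log2(0.2./(0.2+0.7*p)) + 0.9*p.*log2(0.9./(0.2+0.7*p));
[Imax, k] = max(I);
fprintf('Maximum I(X;Y) is about %.3f bits at p = %.3f\n', Imax, p(k));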