Application of Fisher Linear Discriminant (FLD) in Data Classification

1. FLD algorithm overview

Fisher Linear Discriminant (FLD) is a classical algorithm for pattern recognition. It was introduced into the field of pattern recognition and artificial intelligence by Belhumeur in 1996. The basic idea of linear discriminant analysis is to project high-dimensional sample patterns onto an optimal discriminant vector space, so as to extract the classification information and compress the dimensionality of the feature space. After projection, the sample patterns are guaranteed to have the largest inter-class distance and the smallest intra-class distance in the new subspace; that is, the patterns have the best separability in that space. FLD is therefore an effective feature extraction method: the projection maximizes the inter-class scatter of the samples while at the same time minimizing the intra-class scatter.

2. Formula derivation and formalized description of the algorithm

Algorithm function: dimensionality reduction and classification.

Generally speaking, a good direction can always be found along which the projected samples are easy to separate. Finding this best projection vector W* is what the Fisher algorithm does. The specific method is as follows.

1) Calculate the mean vector m_i of each class, where N_i is the number of samples of class \omega_i:

    m_i = \frac{1}{N_i} \sum_{X \in \omega_i} X, \quad i = 1, 2    (1)

2) Calculate the intra-class scatter matrices S_i and the total intra-class scatter matrix S_w:

    S_i = \sum_{X \in \omega_i} (X - m_i)(X - m_i)^T, \quad i = 1, 2    (2)

    S_w = S_1 + S_2    (3)

3) Calculate the inter-class scatter matrix S_b:

    S_b = (m_1 - m_2)(m_1 - m_2)^T    (4)

4) Find the best projection vector W*. We hope that after the projection the samples are as well separated as possible in the one-dimensional space, i.e. the difference between the projected class means, |\tilde{m}_1 - \tilde{m}_2|, should be as large as possible, while the intra-class scatter after projection should be as small as possible. We therefore introduce the Fisher criterion:

    J_{fisher}(W) = \frac{W^T S_b W}{W^T S_w W}    (5)

The W* that maximizes J_{fisher}(W) is

    W^* = S_w^{-1}(m_1 - m_2)    (6)

The derivation is as follows. Since scaling W does not change the value of (5), the denominator can be fixed by the constraint

    W^T S_w W = 1    (7)

After introducing a Lagrange multiplier \lambda, the objective becomes

    C(W, \lambda) = W^T S_b W - \lambda \,(W^T S_w W - 1)    (8)

Setting the derivative with respect to W to zero gives

    \frac{dC}{dW} = 2 S_b W - 2 \lambda S_w W = 0    (9)

i.e. S_b W = \lambda S_w W. Multiplying both sides by S_w^{-1} yields the eigenvalue equation

    S_w^{-1} S_b W = \lambda W    (10)

By (4),

    S_b W = (m_1 - m_2)(m_1 - m_2)^T W = (m_1 - m_2)\,\lambda_w, \quad \lambda_w = (m_1 - m_2)^T W    (11)

where \lambda_w is a scalar. Substituting (11) into (10) gives

    W = \frac{\lambda_w}{\lambda}\, S_w^{-1}(m_1 - m_2)    (12)

Since multiplying W by any constant does not affect the result, the scalar factor can be dropped, and we obtain

    W^* = S_w^{-1}(m_1 - m_2)    (13)

5) Project all samples in the training set:

    y = W^{*T} X    (14)

6) Compute the segmentation threshold y_0 in the projection space, where \tilde{m}_i = W^{*T} m_i:

    y_0 = \frac{N_1 \tilde{m}_1 + N_2 \tilde{m}_2}{N_1 + N_2}    (15)

7) For a given sample X, compute its projection onto W*:

    y = W^{*T} X    (16)

8) Classify according to the decision rule:

    X \in \omega_1 \ \text{if}\ y > y_0, \qquad X \in \omega_2 \ \text{if}\ y < y_0    (17)
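The eight steps above can be collected into a short MATLAB function. The following is a minimal sketch written for this note, not the programs discussed in the MATLAB example that follows; the function name fld_train_classify and its interface are illustrative, and each class is assumed to be stored as a d-by-N matrix with one sample per column, matching equations (1)-(3) and (13)-(17).

function [W, y0, labels] = fld_train_classify(X1, X2, Xtest)
% Two-class Fisher linear discriminant (illustrative sketch).
% X1, X2 : d-by-N1 and d-by-N2 training matrices, one sample per column
% Xtest  : d-by-M matrix of samples to classify
% W      : best projection vector W* = Sw^(-1)(m1 - m2), eq. (13)
% y0     : segmentation threshold on the projected line, eq. (15)
% labels : 1 or 2 for each column of Xtest, eq. (17)

N1 = size(X1, 2);
N2 = size(X2, 2);

% 1) class mean vectors, eq. (1)
m1 = mean(X1, 2);
m2 = mean(X2, 2);

% 2) intra-class scatter matrices and total intra-class scatter, eqs. (2)-(3)
% (implicit expansion of X - m requires MATLAB R2016b or later)
S1 = (X1 - m1) * (X1 - m1)';
S2 = (X2 - m2) * (X2 - m2)';
Sw = S1 + S2;

% 4) best projection vector, eq. (13); "\" solves Sw * W = (m1 - m2)
W = Sw \ (m1 - m2);

% 5)-6) project the class means and compute the threshold, eqs. (14)-(15)
mt1 = W' * m1;
mt2 = W' * m2;
y0  = (N1 * mt1 + N2 * mt2) / (N1 + N2);

% 7)-8) project the test samples and apply the decision rule, eqs. (16)-(17)
y = W' * Xtest;
labels = ones(1, size(Xtest, 2));
labels(y < y0) = 2;
end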
3. MATLAB instance operation

Open the main program 'zhucx.m'; 'lda_touyin.m' and 'local_roc_calculate.m' are its subroutines. Running the program produces the following results:

Fig. 1  Scatter plot of the data features to be classified. The data is a two-class, two-dimensional feature set; each class is a 2x100 matrix.

Fig. 2  The projection vector obtained by the Fisher algorithm projects the two-dimensional features onto a straight line, turning them into a one-dimensional feature in preparation for the subsequent classification.

Fig. 3  The one-dimensional feature obtained in the last step, shown as a scatter plot, from which the separation of the two classes can be seen clearly.

Fig. 4  Classification result.

Fig. 5  ROC curve, AUC = 0.9996. ROC (Receiver Operating Characteristic) curves and AUC (Area Under the Curve) are often used to evaluate a binary classifier; the closer the curve is to the upper left corner, the better the classification.

Fig. 6  Probability density distribution diagram.

Fig. 7  Probability density distribution curve.

The results show that the Fisher linear discriminant algorithm achieves good results in the classification of two-class data. The classification accuracy obtained in these experiments is over 95%.
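Since the original programs are not reproduced in this note, the following driver is only an illustrative sketch of how such an experiment could be set up: it generates two synthetic two-dimensional Gaussian classes of 100 samples each (the actual experimental data may differ), calls the fld_train_classify sketch given earlier, and reports the accuracy and a simple rank-based AUC rather than using 'local_roc_calculate.m'.

% Illustrative driver (not the original zhucx.m): two synthetic 2-D classes.
rng(1);                                    % fixed seed for reproducibility
X1 = [2; 2]   + 0.8 * randn(2, 100);       % class 1, 2-by-100
X2 = [-2; -2] + 0.8 * randn(2, 100);       % class 2, 2-by-100

% hold out half of each class for testing
Xtr1 = X1(:, 1:50);   Xte1 = X1(:, 51:100);
Xtr2 = X2(:, 1:50);   Xte2 = X2(:, 51:100);

[W, y0, labels] = fld_train_classify(Xtr1, Xtr2, [Xte1, Xte2]);
truth    = [ones(1, 50), 2 * ones(1, 50)];
accuracy = mean(labels == truth);

% rank-based (Mann-Whitney) AUC of the projected scores, class 1 as positive
y = W' * [Xte1, Xte2];
ranks = zeros(1, numel(y));
[~, order]   = sort(y);
ranks(order) = 1:numel(y);
npos = 50;  nneg = 50;
AUC = (sum(ranks(truth == 1)) - npos * (npos + 1) / 2) / (npos * nneg);

fprintf('accuracy = %.3f, AUC = %.3f\n', accuracy, AUC);

% plot the projected one-dimensional features and the threshold y0
figure; hold on;
plot(W' * Xte1, zeros(1, 50), 'bo');
plot(W' * Xte2, zeros(1, 50), 'rx');
plot([y0 y0], [-0.5 0.5], 'k--');
title('Projected one-dimensional features and threshold y_0');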