Temporal Sequence Processing using Recurrent SOM

Timo Koskela, Markus Varsta, Jukka Heikkonen, Kimmo Kaski
Helsinki University of Technology, Laboratory of Computational Engineering
P.O. Box 9400, FIN-02015 HUT, Finland

Keywords: Recurrent Self-Organizing Map, Temporal sequence processing

Abstract

The Recurrent Self-Organizing Map (RSOM) is studied in temporal sequence processing. RSOM includes a recurrent difference vector in each unit of the map, which allows storing temporal context from consecutive input vectors fed to the map. RSOM is a modification of the Temporal Kohonen Map (TKM). It is shown that RSOM learns a correct mapping from temporal sequences of simple synthetic data, while TKM fails to learn this mapping. In addition, two case studies are presented in which RSOM is applied to EEG-based epileptic activity detection and to time series prediction with local models. The results suggest that RSOM can be used efficiently in temporal sequence processing.

1 Introduction

Temporal sequence processing (TSP) is a research area with applications in diverse fields ranging from weather forecasting to time series prediction, speech recognition and remote sensing. In order to construct a model for a process, data are gathered by measuring the values of certain variables sequentially in time. Usually the data are incomplete and include noise. The goal of model building is to reveal the underlying process from the data. The model is estimated by using statistical methods to find the regularities and nonlinear dependencies that exist in the data. Usually the model that predicts the future of the process most accurately is considered to be the best model.

Several computational techniques have been proposed to gain more insight into processes and phenomena that include temporal information. Statistical methods based on linear (e.g., AR and ARMA) and nonlinear (e.g., NARMAX and MARS) models have been used effectively in many applications [3]. Recently neural networks have gained a lot of interest in TSP due to their ability to learn nonlinear dependencies effectively from large volumes of possibly noisy data with a learning algorithm. While many architectures, e.g., the multilayer perceptron (MLP) and the radial basis function (RBF) network, have been proven to be universal function approximators, this does not necessarily imply their usability in TSP.

The traditional way of using neural networks in TSP is to convert the temporal sequence into a concatenated vector via a tapped delay line and to feed the resulting vector as input to the network [11]. This time-delay neural network approach, however, has well-known drawbacks, one of the most serious being the difficulty of determining the proper length for the delay line. Therefore a number of dynamic neural network models have been designed for TSP to capture inherently the essential context of the temporal sequence without the need for an external time delay mechanism. In these models the learning equations are often described by differential or difference equations, and the interconnections between the network units may include a set of feedback connections, i.e., the networks are recurrent in nature (see [11, 14]).
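As a concrete illustration of the tapped delay line approach mentioned above, the following sketch converts a scalar time series into the concatenated input vectors that a time-delay network would receive. The function name delay_embed, the toy sine-wave data and the delay length are illustrative choices for this sketch, not details taken from the paper.

```python
import numpy as np

def delay_embed(series, delay_length):
    """Concatenate consecutive samples into the input vectors that a
    time-delay neural network would see at each time step."""
    series = np.asarray(series, dtype=float)
    n = len(series) - delay_length + 1
    return np.stack([series[i:i + delay_length] for i in range(n)])

# Toy example: a noisy sine wave embedded with a delay line of length 5.
t = np.linspace(0.0, 10.0, 200)
x = np.sin(t) + 0.05 * np.random.randn(len(t))
X = delay_embed(x, delay_length=5)
print(X.shape)  # (196, 5): one 5-sample input vector per time step
```

The sketch also makes the drawback noted above tangible: the delay length is a fixed design parameter that must be chosen before training, which is exactly what the dynamic, recurrent models discussed next avoid.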
Most recurrent neural networks are trained via supervised learning rules. Only a few unsupervised neural network models have been proposed for TSP, although it can be argued that in temporal sequence analysis unsupervised neural networks could reveal useful information from the sequences at hand, in analogy to the reported power of unsupervised neural networks in cluster analysis, dimensionality reduction and visualization of their "static" input spaces. Moreover, in many TSP applications unsupervised learning could utilize the available temporal data more effectively than supervised learning methods, because no preclassification or prelabeling of the input data is needed. In view of the above, there is a clear need for unsupervised learning methods in TSP.

The Temporal Kohonen Map (TKM) [1] is one interesting unsupervised approach for TSP, derived from Kohonen's Self-Organizing Map algorithm [6, 7]. In the TKM the involvement of the earlier input vectors in each unit is represented by a recursive difference equation which defines the current unit activity as a function of the previous activations and the current input vector. The Recurrent Self-Organizing Map (RSOM), proposed originally in [16], can be presented as an enhancement of the TKM algorithm. In brief, RSOM defines a difference vector for each unit of the map, which is used for selecting the best matching unit and also for adapting the weights of the map. The difference vector captures the magnitude and direction of the error in the weight vectors and allows learning temporal context. The weight update is similar to that of the SOM algorithm, except that the weight vectors are moved towards a recursive linear sum of the past difference vectors and the current input vector.

The rest of the paper is organized as follows. The RSOM algorithm is described in detail in Section 2. Classification of temporal sequences, clustering of EEG patterns and time series prediction are considered in Section 3. Finally some conclusions are drawn.

2 Recurrent Self-Organizing Map

We present the Recurrent Self-Organizing Map (RSOM) [17, 8] as an extension to the Self-Organizing Map that allows storing certain information from the past input vectors. The information is stored in the form of difference vectors in the map units. The mapping that is formed during training has the topology preservation characteristic of the SOM. The Self-Organizing Map (SOM) [6] is a vector quantization method with topology preservation when the dimension of the map matches the true dimension of the input space. In brief, topology preservation means that input patterns close in the input space are mapped to units close on the SOM lattice.
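The excerpt ends before the formal equations of Section 2, but the verbal description above can be made concrete with a minimal sketch of one RSOM training step. The leaky recursion for the difference vectors, the Gaussian neighborhood, and the parameter names (alpha for the leaking coefficient, lr for the learning rate, sigma for the neighborhood width) are assumptions based on the commonly cited RSOM formulation, not details quoted from this excerpt.

```python
import numpy as np

def rsom_step(weights, diffs, x, alpha, lr, sigma, grid):
    """One RSOM training step (sketch of an assumed, commonly cited formulation).

    weights : (n_units, dim) weight vectors of the map units
    diffs   : (n_units, dim) leaky difference vectors holding temporal context
    x       : (dim,) current input vector
    alpha   : assumed leaking coefficient in (0, 1]; alpha = 1 gives a plain SOM
    lr      : learning rate
    sigma   : neighborhood width on the map lattice
    grid    : (n_units, 2) lattice coordinates of the units
    """
    # Recursive difference vector: a leaky linear sum of past differences and
    # the current input (assumed form y <- (1 - alpha) * y + alpha * (x - w)).
    diffs = (1.0 - alpha) * diffs + alpha * (x - weights)
    # Best matching unit: the unit whose difference vector has the smallest norm.
    bmu = int(np.argmin(np.linalg.norm(diffs, axis=1)))
    # Gaussian neighborhood around the best matching unit on the lattice.
    d2 = np.sum((grid - grid[bmu]) ** 2, axis=1)
    h = np.exp(-d2 / (2.0 * sigma ** 2))
    # Move the weights towards the leaky sum of past differences and the input.
    weights = weights + lr * h[:, None] * diffs
    return weights, diffs, bmu

# Toy usage: a 5 x 5 map driven by a random 3-dimensional input sequence.
rng = np.random.default_rng(0)
grid = np.array([(i, j) for i in range(5) for j in range(5)], dtype=float)
W = rng.standard_normal((25, 3))
Y = np.zeros_like(W)
for x in rng.standard_normal((200, 3)):
    W, Y, bmu = rsom_step(W, Y, x, alpha=0.5, lr=0.1, sigma=1.0, grid=grid)
```

Setting alpha to 1 discards all past differences, so the update reduces to the ordinary SOM rule; smaller values of alpha let each unit accumulate context from several consecutive inputs, which is what allows the map to separate temporal patterns.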