An Extended Transformation Approach to Inductive Logic Programming

NADA LAVRAČ
Jožef Stefan Institute
and
PETER A. FLACH
University of Bristol

Inductive logic programming (ILP) is concerned with learning relational descriptions that typically have the form of logic programs. In a transformation approach, an ILP task is transformed into an equivalent learning task in a different representation formalism. Propositionalization is a particular transformation method, in which the ILP task is compiled to an attribute-value learning task. The main restriction of propositionalization methods such as LINUS is that they are unable to deal with nondeterminate local variables in the body of hypothesis clauses. In this paper we show how this limitation can be overcome, by systematic first-order feature construction using a particular individual-centered feature bias. The approach can be applied in any domain where there is a clear notion of individual. We also show how to improve upon exhaustive first-order feature construction by using a relevancy filter. The proposed approach is illustrated on the “trains” and “mutagenesis” ILP domains.

Categories and Subject Descriptors: I.2.3 [Artificial Intelligence]: Deduction and Theorem Proving—Logic programming; I.2.4 [Artificial Intelligence]: Knowledge Representation Formalisms and Methods—Predicate logic; I.2.6 [Artificial Intelligence]: Learning—Concept learning; Induction

General Terms: Algorithms, Experimentation

Additional Key Words and Phrases: Data mining, inductive logic programming, machine learning, relational databases

Authors' addresses: N. Lavrač, Jožef Stefan Institute, Jamova 39, 1000 Ljubljana, Slovenia; email: Nada.Lavrac@ijs.si; P. A. Flach, University of Bristol, Woodland Road, Bristol BS8 1UB, United Kingdom; email: Peter.Flach@bristol.ac.uk.

Permission to make digital/hard copy of all or part of this material without fee for personal or classroom use provided that the copies are not made or distributed for profit or commercial advantage, the ACM copyright/server notice, the title of the publication, and its date appear, and notice is given that copying is by permission of the ACM, Inc. To copy otherwise, to republish, to post on servers, or to redistribute to lists requires prior specific permission and/or a fee.
© 2001 ACM 1529-3758/01/1000-0458 $5.00
ACM Transactions on Computational Logic, Vol. 2, No. 4, October 2001, Pages 458–494.

1. INTRODUCTION

Inductive logic programming (ILP) [Muggleton 1992; Muggleton and De Raedt 1994; Lavrač and Džeroski 1994; Bergadano and Gunetti 1995; Nienhuys-Cheng and de Wolf 1997] is a research area that has its backgrounds in inductive machine learning and computational logic. ILP research aims at a formal framework as well as practical algorithms for inductive learning of relational descriptions that typically have the form of logic programs. From computational logic, ILP has inherited its sound theoretical basis, and from machine learning, an experimental approach and orientation toward practical applications. ILP research has been strongly influenced also by computational learning theory, and recently by knowledge discovery in databases [Fayyad et al. 1995], which led to the development of new techniques for relational data mining.

In general, an ILP learner is given an initial theory T (background knowledge) and some evidence E (examples), and its aim is to induce a theory H (hypothesis) that together with T explains some properties of E. In most cases the hypothesis H has to satisfy certain restrictions, which we shall refer to as the bias. Bias includes prior expectations and assumptions, and can therefore be considered as the logically unjustified part of the background knowledge. Bias is needed to reduce the number of candidate hypotheses. It consists of the language bias L, determining the hypothesis space, and the search bias which restricts the search of the space of possible hypotheses.
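To make the T, E, and H setting concrete, the following is a minimal Prolog sketch in the spirit of the “trains” (East-West challenge) domain used later in the paper. The predicate and constant names (has_car/2, car_length/2, roof/2, the trains t1 and t2) are illustrative assumptions, not necessarily the paper's exact representation.

    % Background knowledge T: relational facts about two trains (illustrative encoding).
    has_car(t1, c11).   has_car(t1, c12).
    has_car(t2, c21).
    car_length(c11, short).   roof(c11, closed).
    car_length(c12, long).    roof(c12, open).
    car_length(c21, long).    roof(c21, open).

    % Evidence E: t1 is a positive example of eastbound/1, t2 a negative one.
    % positive(eastbound(t1)).     negative(eastbound(t2)).

    % A hypothesis H of the kind an ILP learner might induce: together with T it
    % explains the evidence. Note the local variable C in the body, which does
    % not occur in the head.
    eastbound(T) :- has_car(T, C), car_length(C, short), roof(C, closed).

    % ?- eastbound(t1).   % succeeds: the hypothesis covers the positive example
    % ?- eastbound(t2).   % fails: it does not cover the negative example

In this sketch, the language bias would determine, for instance, which predicates and how many local variables such as C are allowed in clause bodies, while the search bias would restrict how the resulting space of candidate clauses is explored.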
The background knowledge used to construct hypotheses is a distinctive feature of ILP. It is well known that relevant background knowledge may substantially improve the results of learning in terms of accuracy, efficiency, and the explanatory potential of the induced knowledge. On the other hand, irrelevant background knowledge will have just the opposite effect. Consequently, much of the art of inductive logic programming lies in the appropriate selection and formulation of background knowledge to be used by the selected ILP learner.

This work shows that by devoting enough effort to the construction of features, to be used as background knowledge in learning, even complex relational learning tasks can be solved by simple propositional rule learning systems. In propositional learning, the idea of augmenting an existing set of attributes with new ones is known under the term constructive induction. A first-order counterpart of constructive induction is predicate invention. This work takes the middle ground: we perform a simple form of predicate invention through first-order feature construction, and use the constructed features for propositional learning. In this way we are able to show that the traditional limitations of transformation-based approaches, such as the limited hypothesis language of LINUS (i.e., no local variables in clause bodies) and its successor DINUS (only determinate local variables) [Lavrač and Džeroski 1994], can be alleviated by means of nondeterminate first-order feature construction.

Our approach to first-order feature construction can be applied in the so-called individual-centered domains, where there is a clear notion of individual. Such domains include classification problems in molecular biology, for example, where the individuals are molecules.
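As an illustration of this idea, here is a minimal Prolog sketch (not the LINUS/DINUS algorithm itself) of nondeterminate first-order feature construction followed by propositionalization in the trains domain: each feature is a clause whose head has a single variable ranging over the individuals, whose body may introduce nondeterminate local variables such as C, and whose truth value becomes one boolean attribute of the individual. The feature definitions and the propositional_row/2 helper are hypothetical, and the sketch assumes the illustrative has_car/2, car_length/2, and roof/2 facts from the sketch above.

    % Constructed first-order features over an individual T (a train). The local
    % variable C is nondeterminate: a train may have any number of cars.
    f1(T) :- has_car(T, C), car_length(C, short).
    f2(T) :- has_car(T, C), roof(C, closed).
    f3(T) :- has_car(T, C), car_length(C, short), roof(C, closed).

    feature(f1).  feature(f2).  feature(f3).

    % Propositionalization: evaluate every feature on an individual, yielding one
    % boolean attribute per feature (a hypothetical helper, not the paper's code).
    truth_value(F, T, true)  :- call(F, T), !.
    truth_value(_, _, false).

    propositional_row(T, Row) :-
        findall(F = V, (feature(F), truth_value(F, T, V)), Row).

    % ?- propositional_row(t1, Row).   % Row = [f1=true,  f2=true,  f3=true]
    % ?- propositional_row(t2, Row).   % Row = [f1=false, f2=false, f3=false]

A propositional rule learner can then be applied directly to such rows; the nondeterminate local variables are confined to the feature definitions and never reach the propositional level.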