After decades of development, intelligent optimization algorithms have become an effective tool for solving large-scale NP-hard, nonlinear, and heavily constrained problems. Classical methods such as genetic algorithms and particle swarm optimization, together with newer methods such as teaching-learning-based optimization, grey wolf optimization, and competitive swarm optimization, have found wide application across many engineering fields. In renewable-energy systems, mixed-integer and strongly nonlinear optimization problems such as large-scale unit commitment, economic and environmental dispatch, power-flow analysis, optimal configuration, heat-sink design, and parameter identification for solar cells and fuel cells remain long-standing obstacles to large-scale deployment. Starting from a survey of optimization methods for integrating electric vehicles into the grid, this report focuses on several new unit-commitment and economic-dispatch models for electric-vehicle integration and their heuristic-optimization-based solution methods, and briefly reviews the group's recent work on other complex optimization problems in renewable energy.

Outline
Genetic algorithms
◦ Parallel genetic algorithms
Genetic programming
Evolution strategies
Classifier systems
Evolutionary programming
Related topics
Conclusion

Genetic algorithms
Fitness = height; survival of the fittest.
Maintain a population of potential solutions.
New solutions are generated by selecting, combining and modifying existing solutions.
◦ Crossover
◦ Mutation
Objective function = fitness function.
◦ Better solutions are favored for parenthood.
◦ Worse solutions are favored for replacement.
Example problem (used in the code sketch below): maximize 2x^2 - y + 5, where x ∈ [0, 3] and y ∈ [0, 3].

Components of a genetic algorithm
Representation
Fitness function
Initialization strategy
Selection strategy
Crossover operators
Mutation operators
Replacement strategy

Selection strategies
Proportional selection (roulette wheel)
◦ Selection probability of an individual = individual's fitness / sum of all fitnesses
Rank-based selection
◦ Example: decreasing arithmetic or geometric series
◦ Better when the fitness range is very large or very small
Tournament selection
◦ Virtual tournament between randomly selected individuals, decided by fitness

Crossover operators
Point crossover (classical)
◦ Parent 1 = x1, x2, x3, x4, x5, x6
◦ Parent 2 = y1, y2, y3, y4, y5, y6
◦ Child = x1, x2, x3, x4, y5, y6
Uniform crossover
◦ Parent 1 = x1, x2, x3, x4, x5, x6
◦ Parent 2 = y1, y2, y3, y4, y5, y6
◦ Child = x1, x2, y3, x4, y5, y6
Arithmetic crossover
◦ Parent 1 = x1, x2, x3
◦ Parent 2 = y1, y2, y3
◦ Child = (x1+y1)/2, (x2+y2)/2, (x3+y3)/2

Mutation operators
Mutation changes one or more components. Let Child = x1, x2, P, x3, x4, ...
Gaussian mutation
◦ P ← P ± Δp
◦ Δp: a (small) random value drawn from a normal distribution
Uniform mutation
◦ P ← Pnew
◦ Pnew: a random value drawn uniformly from the allowed range
Boundary mutation
◦ P ← Pmin or Pmax
Binary mutation = bit flip

Strengths and weaknesses of GAs
Searches for global optima (less prone to getting trapped in local optima)
Can handle discrete, continuous and mixed variable spaces
Easy to use (short programs)
Robust (less sensitive to noise and ill-conditioning)
Relatively slower than other methods (not suitable for easy problems)
Theory lags behind applications

Parallel genetic algorithms (hierarchical hybrids)
Coarse-grained GA at the high level, fine-grained GA at the low level
Coarse-grained GA at the high level, global parallel GA at the low level
Coarse-grained GA at the high level, coarse-grained GA at the low level
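To make the components above concrete, the following is a minimal sketch (not taken from the slides) of a real-valued GA for the example problem, maximizing 2x^2 - y + 5 on [0, 3] x [0, 3], using roulette-wheel selection, arithmetic crossover and Gaussian mutation. The population size, mutation rate, mutation scale and generation count are illustrative assumptions.

```python
import random

LOW, HIGH = 0.0, 3.0                    # bounds for both x and y
POP_SIZE, GENERATIONS = 30, 100         # illustrative settings
MUTATION_RATE, MUTATION_STD = 0.2, 0.3  # illustrative settings

def fitness(ind):
    x, y = ind
    return 2 * x**2 - y + 5   # objective function = fitness function (maximized)

def roulette_select(pop, fits):
    # Proportional selection: probability = individual's fitness / sum of fitnesses
    total = sum(fits)
    pick = random.uniform(0, total)
    acc = 0.0
    for ind, f in zip(pop, fits):
        acc += f
        if acc >= pick:
            return ind
    return pop[-1]

def arithmetic_crossover(p1, p2):
    # Child = component-wise average of the two parents
    return [(a + b) / 2 for a, b in zip(p1, p2)]

def gaussian_mutation(ind):
    # Perturb some components with small normal noise, clipped to the bounds
    return [min(HIGH, max(LOW, g + random.gauss(0, MUTATION_STD)))
            if random.random() < MUTATION_RATE else g
            for g in ind]

def run_ga():
    pop = [[random.uniform(LOW, HIGH) for _ in range(2)] for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        fits = [fitness(ind) for ind in pop]
        children = []
        while len(children) < POP_SIZE:
            p1, p2 = roulette_select(pop, fits), roulette_select(pop, fits)
            children.append(gaussian_mutation(arithmetic_crossover(p1, p2)))
        pop = children   # generational replacement (no elitism, kept simple)
    return max(pop, key=fitness)

if __name__ == "__main__":
    best = run_ga()
    print("best individual:", best, "fitness:", fitness(best))
```

Roulette-wheel selection assumes strictly positive fitness values, which holds here because the minimum of the objective over the given box is 2. For harder problems one would typically add elitism or a replacement strategy that preserves the best individual.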
Genetic programming
Introduced (officially) by John Koza in his book Genetic Programming (1992).
Early attempts date back to the 1950s (evolving populations of binary object codes).
The idea is to evolve computer programs.
Declarative/functional programming languages (e.g. Lisp) are usually used.
Programs are represented as trees, so the population is a population of trees.
Programs are composed of elements from the FUNCTION SET and the TERMINAL SET.
◦ These sets are usually fixed sets of symbols.
◦ The function set forms the non-leaf nodes (e.g. +, -, *, sin, cos).
◦ The terminal set forms the leaf nodes (e.g. x, 3.7, random()).
Fitness is usually based on I/O traces.
Crossover is implemented by randomly swapping subtrees between individuals.
GP usually does not rely extensively on mutation (random nodes or subtrees).
GP is usually generational (sometimes with a generation gap).
GP usually uses huge populations (on the order of 1M individuals).

Strengths and weaknesses of GP
More flexible representation
Greater application spectrum
If tractable, evolving a way to make "things" is more useful than evolving the "things" themselves.
◦ Example: evolving a learning rule for neural networks (Amr Radi, GP'98) vs. evolving the weights of a particular NN.
Extremely slow
Very poor handling of numbers
Very large populations needed

Related GP variants
Genetic programming with linear genomes (Wolfgang Banzhaf)
◦ In a sense, a return to the evolution of binary program codes
Hybrids of GP and other methods that handle numbers better:
◦ Least-squares methods
◦ Gradient-based optimizers
◦ Genetic algorithms and other evolutionary computation methods
Evolving things other than programs
◦ Example: electric circuits represented as trees (Koza, AI in Design, 1996)

Evolution strategies
Were invented to solve numerical optimization problems.
Originated in Europe in the 1960s.
Initially a two-member or (1+1) ES (a minimal sketch is given at the end of the document):
◦ One PARENT generates one OFFSPRING per GENERATION
◦ by applying normally distributed (Gaussian) mutations
◦ until the offspring is better and replaces the parent.
◦ This simple structure allowed theoretical results to be obtained (speed of convergence, mutation size).
Later enhanced to a (μ+1) strategy, which incorporated crossover.
Schwefel introduced the multi-membered ESs, now denoted (μ+λ) and (μ,λ).
◦ (μ,λ) ES: the parent generation is disjoint from the child generation.
◦ (μ+λ) ES: some of the parents may be selected to propagate to the child generation.
Individuals are real-valued vectors consisting of two parts:
◦ Object variables: just like a real-valued GA individual
◦ Strategy variables: a set of standard deviations for the Gaussian mutation
This structure allows for "self-adaptation" of the mutation size.
◦ An excellent feature for dynamically changing fitness landscapes.

Classifier systems
In machine learning we seek a good hypothesis.
The hypothesis may be a rule, a neural network, a program, etc.
GAs and other EC methods can evolve rules, NNs, programs, etc.
Classifier systems (CFS) are the most explicit GA-based machine learning tool.
Rule and message system
◦ if condition then action
Apportionment of credit system
◦ Based on a set of training examples
◦ Credit (fitness) is given to rules that match the example
◦ Example: bucket brigade (auctions for examples, winner takes all, …)
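The (1+1) ES described in the evolution strategies section is simple enough to sketch in a few lines. The version below is not from the slides; it uses a fixed Gaussian mutation size on the same toy objective as the GA example, and the step size and iteration count are illustrative assumptions. The self-adaptation of the mutation strength mentioned above is deliberately omitted to keep the sketch minimal.

```python
import random

LOW, HIGH = 0.0, 3.0      # same bounds as the GA example
SIGMA = 0.3               # fixed mutation standard deviation (illustrative)
GENERATIONS = 500

def fitness(ind):
    x, y = ind
    return 2 * x**2 - y + 5   # maximize

def mutate(parent, sigma):
    # Gaussian mutation: add normally distributed noise, clipped to the bounds
    return [min(HIGH, max(LOW, g + random.gauss(0, sigma))) for g in parent]

def one_plus_one_es():
    parent = [random.uniform(LOW, HIGH) for _ in range(2)]
    for _ in range(GENERATIONS):
        offspring = mutate(parent, SIGMA)
        # (1+1) selection: the offspring replaces the parent only if it is better
        if fitness(offspring) > fitness(parent):
            parent = offspring
    return parent

if __name__ == "__main__":
    best = one_plus_one_es()
    print("best individual:", best, "fitness:", fitness(best))
```

In a full (μ+λ) or (μ,λ) ES, each individual would also carry its own standard deviations as strategy variables, so the mutation size itself evolves; this is the self-adaptation property highlighted in the slides.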