How to Reduce the Dimensionality of Data with Deep Neural Networks?
2019/12/13

Recent Advances of Deep Learning
◆ 2012: In the ImageNet recognition challenge, Hinton's team achieved a 16% error rate in classifying 1.2 million images, against a 26% error rate for its closest competitor.
◆ 2006: Hinton and Salakhutdinov proposed a learning method for deep neural networks [1]. Subsequently (June 26, 2012, news), researchers at Google and Stanford University created one of the largest neural networks to date, a nine-layer deep network that learned on its own from the internet. The system, built to simulate the human brain with one billion connections, was trained over three days on 10 million images using 16,000 cores; after three days of unsupervised learning, it discovered the concept of a cat. (October 2012, news) Microsoft demonstrated a fully automated interpretation system at the 21st Century Computing Conference; the key technology of this system is the DNN (deep neural network).
◆ 2013: Hinton joined Google; Yann LeCun joined Facebook; Baidu founded its Institute of Deep Learning (IDL).
◆ 2014: Andrew Ng joined Baidu.
◆ 2015: Journal Citation Reports for SCI.

Three Ideas of CNN
1. Local receptive fields (the area of local perception)
   Fig. 1 Connection pattern of conventional neural nets
   Fig. 2 Connection pattern of convolutional neural nets
2. Shared weights
   Fig. 3 Single convolution kernel
   Weight sharing reduces the complexity of the network and the amount of computation.
   Fig. 4 Multiple convolution kernels
   Fig. 5 Inside a convolutional neural network
3. Secondary extraction of the feature maps
(A minimal sketch of these three ideas is given after the reference list.)

Research points that can produce new ideas
※ Initial weight pretraining algorithms
※ The architecture of deep neural networks
※ Layer-by-layer learning algorithms
(A layer-by-layer pretraining sketch is given after the reference list.)

References
[1] G. E. Hinton, R. R. Salakhutdinov. Reducing the Dimensionality of Data with Neural Networks[J]. Science, 2006, 313(5786): 504-507.
[2] Yann LeCun, Yoshua Bengio, Geoffrey Hinton. Deep learning[J]. Nature, 2015, 521(7553): 436-444.
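As referenced under "Three Ideas of CNN" above, the sketch below illustrates the three ideas in plain NumPy: each output unit of conv2d sees only a small local patch (local receptive field), the same kernel is slid over every position (shared weights), and max pooling performs the secondary extraction of the feature map. This is a toy illustration, not the figures or code of the original slides; the image, kernel, and sizes are made-up assumptions for the example.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution: each output unit sees only a local patch
    (local receptive field), and every patch is filtered by the SAME
    kernel (shared weights)."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(feature_map, size=2):
    """2x2 max pooling: the 'secondary extraction' step that condenses
    the feature map and adds some translation invariance."""
    h, w = feature_map.shape
    h, w = h - h % size, w - w % size
    fm = feature_map[:h, :w].reshape(h // size, size, w // size, size)
    return fm.max(axis=(1, 3))

# Toy example: one 8x8 "image" and one 3x3 vertical-edge kernel.
rng = np.random.default_rng(0)
image = rng.standard_normal((8, 8))
kernel = np.array([[1., 0., -1.],
                   [1., 0., -1.],
                   [1., 0., -1.]])   # shared across all positions
feature_map = conv2d(image, kernel)  # 6x6 map from 3x3 receptive fields
pooled = max_pool(feature_map)       # 3x3 map after secondary extraction
print(feature_map.shape, pooled.shape)
```

With several kernels one would simply stack several such feature maps, which is what Fig. 4 (multiple convolution kernels) depicts.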
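For the research points on initial weight pretraining and layer-by-layer learning, the following sketch shows greedy layer-wise pretraining for dimensionality reduction. Hinton and Salakhutdinov [1] pretrain with stacked restricted Boltzmann machines before fine-tuning a deep autoencoder; the sketch substitutes simple one-hidden-layer autoencoders trained by plain gradient descent, and the data, layer sizes, and learning rate are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_autoencoder(data, hidden, epochs=200, lr=0.1):
    """Train one shallow autoencoder (data -> hidden -> data) with
    gradient descent on the squared reconstruction error; return the
    encoder weights and the hidden codes fed to the next layer."""
    n, d = data.shape
    W1 = 0.1 * rng.standard_normal((d, hidden))   # encoder weights
    W2 = 0.1 * rng.standard_normal((hidden, d))   # decoder weights
    for _ in range(epochs):
        h = sigmoid(data @ W1)        # encode
        recon = h @ W2                # decode (linear output)
        err = recon - data
        dW2 = h.T @ err / n
        dh = (err @ W2.T) * h * (1 - h)
        dW1 = data.T @ dh / n
        W1 -= lr * dW1
        W2 -= lr * dW2
    return W1, sigmoid(data @ W1)

# Greedy layer-by-layer pretraining: 64 -> 32 -> 16 -> 8 dimensions.
data = rng.random((500, 64))          # toy data in [0, 1)
codes, weights = data, []
for size in [32, 16, 8]:
    W, codes = train_autoencoder(codes, size)
    weights.append(W)                 # pretrained initial weights

print(codes.shape)                    # (500, 8): low-dimensional codes
```

In the original method, the collected weights would initialise the encoder half of a deep autoencoder, which is then fine-tuned end to end with backpropagation.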