CS 189 Spring 2019 Introduction to Machine Learning — Midterm

Please do not open the exam before you are instructed to do so.

- The exam is closed book, closed notes except your one-page cheat sheet.
- Electronic devices are forbidden on your person, including cell phones, iPods, headphones, and laptops. Turn your cell phone off and leave all electronics at the front of the room, or risk getting a zero on the exam.
- You have 1 hour and 20 minutes.
- Please write your initials at the top right of each page after this one (e.g., write "JS" if you are Jonathan Shewchuk). Finish this by the end of your 1 hour and 20 minutes.
- Mark your answers on the exam itself in the space provided. Do not attach any extra sheets.
- The total number of points is 100. There are 20 multiple choice questions worth 3 points each, and 4 written questions worth a total of 40 points.
- For multiple answer questions, fill in the bubbles for ALL correct choices: there may be more than one correct choice, but there is always at least one correct choice. NO partial credit on multiple answer questions: the set of all correct answers must be checked.

First name: ____  Last name: ____  SID: ____
First and last name of student to your left: ____
First and last name of student to your right: ____

Q1. [60 pts] Multiple Answer

Fill in the bubbles for ALL correct choices: there may be more than one correct choice, but there is always at least one correct choice. NO partial credit: the set of all correct answers must be checked.

(a) [3 pts] Let A be a real, symmetric n × n matrix. Which of the following are true about A's eigenvectors and eigenvalues?

- A can have no more than n distinct eigenvalues
- A can have no more than 2n distinct unit-length eigenvectors
- The vector 0 is an eigenvector, because A0 = 0
- We can find n mutually orthogonal eigenvectors of A

Solution: There can be infinitely many unit-length eigenvectors if the multiplicity of any eigenvalue is greater than 1 (so the eigenspace is a plane, and you can pick any vector on the unit circle in that plane). The vector 0 is not an eigenvector by definition.

(b) [3 pts] The matrix that has eigenvector [1, 2] with eigenvalue 2 and eigenvector [−2, 1] with eigenvalue 1 (note that these are not unit eigenvectors!) is (rows separated by semicolons)

- [9  −2;  −2  6]
- [9/5  −2/5;  −2/5  6/5]
- [6  2;  2  9]
- [6/5  2/5;  2/5  9/5]

(c) [3 pts] Consider a binary classification problem where we know both of the class conditional distributions exactly. To compute the risk,

- we need to know all the sample points
- we need to know the loss function
- we need to know the class prior probabilities
- we need to use gradient descent

(d) [3 pts] Assuming we can find algorithms to minimize them, which of the following cost functions will encourage sparse solutions (i.e., solutions where many components of w are zero)?

- ‖Xw − y‖₂² + λ‖w‖₁
- ‖Xw − y‖₂² + λ‖w‖₁²
- ‖Xw − y‖₂² + λ · (# of nonzero components of w)
- ‖Xw − y‖₂² + λ‖w‖₂²

Solution: The first is Lasso, which we know finds sparse solutions. The second is Lasso with the penalty squared; squaring leaves the same isocontours, so it keeps the same properties as Lasso. The third cost function directly penalizes solutions that are not sparse, so it naturally encourages sparse solutions. The last is ridge regression, which shrinks weights but does not set weights to zero.

(e) [3 pts] Which of the following statements about logistic regression are correct?

- The cost function of logistic regression is convex
- Logistic regression uses the squared error as the loss function
- The cost function of logistic regression is concave
- Logistic regression assumes that each class's points are generated from a Gaussian distribution

(f) [3 pts] Which of the following statements about stochastic gradient descent and Newton's method are correct?

- Newton's method often converges faster than stochastic gradient descent, especially when the dimension is small
- Newton's method converges in one iteration when the cost function is exactly quadratic with one unique minimum
- If the function is continuous with continuous derivatives, Newton's method always finds a local minimum
- Stochastic gradient descent reduces the cost function at every iteration

(g) [3 pts] Let X ∈ R^(n×d) be a design matrix containing n sample points with d features each. Let y ∈ R^n be the corresponding real-valued labels. What is always true about every solution w that locally minimizes the linear least squares objective function ‖Xw − y‖₂², no matter what the value of X is?

- w = X⁺y (where X⁺ is the pseudoinverse)
- w is in the null space of X
- w satisfies the normal equations
- All of the local minima are global minima

Solution:
- Top left: w = X⁺y is the least squares solution with least norm. If the null space of X is nontrivial, there are infinitely many other solutions that minimize ‖Xw − y‖₂² but that have larger norm.
- Bottom left: If w were in the null space of X, then Xw = 0. That is a solution to the linear least squares objective if and only if Xᵀy = 0, i.e., y is in the null space of Xᵀ.
- Top right: The normal equations XᵀXw = Xᵀy define all values of w that make the gradient of the least squares objective zero. Therefore any minimizer must satisfy them.
- Bottom right: The objective is convex, and therefore all local minimizers are also global minimizers.

(h) [3 pts] We are using linear discriminant analysis to classify points x ∈ R^d into three different classes. Let S be the set of points in R^d that our trained model classifies as belonging to the first class. Which of the following are true?

- The decision boundary of S is always a hyperplane
- The decision boundary of S is always a subset of a union of hyperplanes
- S can be the whole space R^d
- S is always connected (that is, every pair of points in S is connected by a path in S)

Solution:
- Top left: Given that we have three classes, S is defined by two linear inequalities, and therefore its boundary may not be a hyperplane.
- Bottom left: Given that S is defined as the points satisfying a set of inequalities, its boundary is a subset of the union of the hyperplanes defined by each of the linear inequalities.
- Top right: If the prior for the first class is high enough, the probab
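The matrix in (b) can be checked numerically: build A = V Λ V⁻¹ with the given eigenvectors as the columns of V, then confirm the eigenpair equations. This is a minimal sketch, not part of the exam; the variable names V, L, and A are ours.

```python
import numpy as np

# Columns of V are the given eigenvectors [1, 2] and [-2, 1].
V = np.array([[1.0, -2.0],
              [2.0,  1.0]])
L = np.diag([2.0, 1.0])          # matching eigenvalues 2 and 1

# Reconstruct the matrix from its eigendecomposition: A = V L V^{-1}.
A = V @ L @ np.linalg.inv(V)

# A matches the option [6/5 2/5; 2/5 9/5].
assert np.allclose(A, [[6/5, 2/5], [2/5, 9/5]])

# Verify both eigenpair equations A v = lambda v directly.
assert np.allclose(A @ [1, 2], 2 * np.array([1, 2]))    # eigenvalue 2
assert np.allclose(A @ [-2, 1], 1 * np.array([-2, 1]))  # eigenvalue 1
```

Note that because the two eigenvectors are orthogonal (though not unit length), A comes out symmetric, as it must for a real matrix with an orthogonal eigenbasis.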