CS 189 Spring 2020
Introduction to Machine Learning
Midterm A

Please do not open the exam before you are instructed to do so. The exam is closed book, closed notes except your cheat sheets.

Please write your name at the top of each page of the Answer Sheet. (You may do this before the exam.)

You have 80 minutes to complete the midterm exam (6:40–8:00 PM). (If you are in the DSP program and have an allowance of 150% or 200% time, that comes to 120 minutes or 160 minutes, respectively.) When the exam ends (8:00 PM), stop writing. You have 15 minutes to scan the exam and turn it in to Gradescope. You must remain visible on camera while you scan your exam and turn it in (unless the scanning device is your only self-monitoring device). Most of you will use your cell phone and a third-party scanning app. If you have a physical scanner in your workspace that you can make visible from your camera, you may use that. Late exams will be penalized at a rate of 10 points per minute after 8:15 PM. (The midterm has 100 points total.) Continuing to work on the exam after 8:00 PM (or not being visible prior to submission) may incur a score of zero.

Mark your answers on the Answer Sheet. If you absolutely must use overflow space for a written question, use the space for "Written Question #5" (but please try hard not to overflow). Do not attach any extra sheets.

The total number of points is 100. There are 10 multiple choice questions worth 4 points each, and three written questions worth 20 points each.

For multiple answer questions, fill in the bubbles for ALL correct choices: there may be more than one correct choice, but there is always at least one correct choice. NO partial credit on multiple answer questions: the set of all correct answers must be checked. For written questions, please write your full answer in the space provided and clearly label all subparts of each written question. Again, do not attach extra sheets.

First name: _______   Last name: _______   SID: _______

Q1. [40 pts] Multiple Answer

Fill in the bubbles for ALL correct choices: there may be more than one correct choice, but there is always at least one correct choice. NO partial credit: the set of all correct answers must be checked.

(a) [4 pts] Let X be an m × n matrix. Which of the following are always equal to rank(X)?

A: rank(Xᵀ)
B: rank(XᵀX)
C: m − dimension(nullspace(X))
D: dimension(rowspace(X))

Solution: Option C is not equal, because by the Rank–Nullity Theorem, n − dim(nullspace(X)) = rank(X); the option uses m where n belongs. Options A and D are equal, since dim(rowspace(X)) = dim(columnspace(Xᵀ)) = rank(Xᵀ) = rank(X). Option B is also equal, since XᵀX and X have the same null space.

(b) [4 pts] Which of the following types of square matrices can have negative eigenvalues?

A: a symmetric matrix
B: I − uuᵀ, where u is a unit vector
C: an orthonormal matrix (M such that MᵀM = I)
D: ∇²f(x), where f(x) is a Gaussian PDF

Solution: A: a symmetric matrix can have negative eigenvalues; they just have to be real. B: the eigenvalues of I − uuᵀ are 0 (with eigenvector u, since (I − uuᵀ)u = u − u(uᵀu) = 0) and 1 (for every vector orthogonal to u), so it can never have negative eigenvalues. C: an orthogonal matrix can have eigenvalues of 1 and −1 (e.g., a reflection). D: a Gaussian PDF is concave at its mode, thus the Hessian there must have negative eigenvalues.

(c) [4 pts] Choose the correct statement(s) about Support Vector Machines (SVMs).

A: if a finite set of training points from two classes is linearly separable, a hard-margin SVM will always find a decision boundary correctly classifying every training point
B: if a finite set of training points from two classes is linearly separable, a soft-margin SVM will always find a decision boundary correctly classifying every training point
C: every trained two-class hard-margin SVM model has at least one point of each class at a distance of exactly 1/‖w‖ (the margin width) from the decision boundary
D: every trained two-class soft-margin SVM model has at least one point of each class at a distance of exactly 1/‖w‖ (the margin width) from the decision boundary

Solution: Option A is correct: fundamental material about SVMs from lectures.

(d) [4 pts] Suppose we perform least-squares linear regression, but we don't assume that all weight vectors are equally reasonable; instead, we use the maximum a posteriori method to impose a normally-distributed prior probability on the weights. Then we are doing

A: L2 regularization
B: Lasso regression
C: logistic regression
D: ridge regression

Solution: As shown in Lecture 13, the Bayesian justification for ridge regression is derived by applying MAP to the posterior probability with a Gaussian prior on the weights.

(e) [4 pts] Which of the following statements regarding ROC curves are true?

A: the ROC curve is monotonically increasing
B: for a logistic regression classifier, the ROC curve's horizontal axis is the posterior probability used as a threshold for the decision rule
C: the ROC curve is concave
D: if the ROC curve passes through (0, 1), the classifier is always correct (on the test data used to make the ROC curve)

Solution: B is false: the axes of an ROC curve do not correspond to the "knob" (the decision threshold) we're turning when we plot the curve. A is true: as the threshold loosens, both rates only grow; always predicting positive gives us a 100% true positive rate and a 100% false positive rate. C is false: the curve does not have to be concave, it just needs to be increasing. D is true: since the curve is increasing, passing through (0, 1) means the curve is a horizontal line at y = 1, so there is a threshold with no false positives and no false negatives.

(f) [4 pts] One way to understand regularization is to ask which vectors minimize the regularization term. Consider the set of unit vectors in the plane: {x ∈ ℝ² : ‖x‖₂² = 1}. Which of the following regularization terms are minimized solely by the four unit vectors {(0, 1), (1, 0), (−1, 0), (0, −1)} and no other unit vector?

A: f(x) = ‖x‖₀ = the # of nonzero entries of x
B: f(x) = ‖x‖₁
C: f(x) = ‖x‖₂²
D: f(x) = ‖x‖∞ = max{|x₁|, |x₂|}

Solution: The first option is almost true by definition: these are the sparsest unit vectors. The second option follows from Cauchy–Schwarz; intuitively, however, we also know that the ℓ₁ norm promotes sparsity, so we should expect this to be true. Finally, notice that ‖x‖₂² always equals 1 on this set, and that max{|x₁|, |x₂|} is minimized when |x₁| = |x₂|, i.e., at the diagonal vectors (±1/√2, ±1/√2), so options C and D are not minimized solely by the four axis-aligned unit vectors.
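The rank identities in part (a) can be sanity-checked numerically. This is a minimal sketch, not part of the original exam; the matrix shape (5 × 3) and random seed are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 3
X = rng.standard_normal((m, n))  # a random 5x3 Gaussian matrix has rank 3 almost surely

r = np.linalg.matrix_rank(X)
nullity = n - r  # dim(nullspace(X)), by Rank-Nullity: n - nullity = rank

# A, B, and D always equal rank(X); rowspace(X) = columnspace(X^T).
assert np.linalg.matrix_rank(X.T) == r
assert np.linalg.matrix_rank(X.T @ X) == r
# C uses m instead of n, so it differs whenever m != n.
assert n - nullity == r
assert m - nullity != r
```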
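The eigenvalue claims in part (b) can likewise be verified on concrete examples. A sketch, with arbitrary dimension, seed, and an axis reflection standing in for "an orthogonal matrix":

```python
import numpy as np

rng = np.random.default_rng(1)

# B: I - uu^T has eigenvalues 0 (eigenvector u) and 1 -- never negative.
u = rng.standard_normal(4)
u /= np.linalg.norm(u)  # normalize to a unit vector
eig_b = np.linalg.eigvalsh(np.eye(4) - np.outer(u, u))
assert np.all(eig_b >= -1e-12)
assert np.isclose(eig_b.min(), 0) and np.isclose(eig_b.max(), 1)

# C: an orthogonal matrix can have eigenvalue -1, e.g. a reflection across the x-axis.
reflection = np.array([[1.0, 0.0], [0.0, -1.0]])
assert np.isclose(np.linalg.eigvalsh(reflection).min(), -1)

# D: the second derivative of a 1-D Gaussian PDF at its mode is negative
# (f''(mu) = -1/sqrt(2*pi)), estimated here by a central finite difference.
mu = 0.0
f = lambda x: np.exp(-(x - mu) ** 2 / 2) / np.sqrt(2 * np.pi)
h = 1e-4
second_deriv = (f(mu + h) - 2 * f(mu) + f(mu - h)) / h**2
assert second_deriv < 0
```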
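The MAP-to-ridge correspondence in part (d) has a concrete form: with a Gaussian prior on w and Gaussian noise, the MAP estimate minimizes ‖Xw − y‖² + λ‖w‖², with closed-form solution (XᵀX + λI)⁻¹Xᵀy. A sketch with arbitrary synthetic data and λ, checking optimality against random perturbations:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((20, 3))
y = rng.standard_normal(20)
lam = 0.5

# Closed-form ridge regression = MAP estimate under a Gaussian prior on w.
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)

# The penalized least-squares objective that MAP minimizes.
obj = lambda w: np.sum((X @ w - y) ** 2) + lam * np.sum(w**2)

# The objective is strictly convex, so w_ridge beats any perturbed weight vector.
for _ in range(100):
    w_pert = w_ridge + 0.1 * rng.standard_normal(3)
    assert obj(w_ridge) <= obj(w_pert)
```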
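For part (f), sampling the unit circle shows which norms are minimized only at the axis-aligned vectors. A sketch; the grid resolution is an arbitrary choice:

```python
import numpy as np

# Sample unit vectors x = (cos t, sin t) on the circle ||x||_2 = 1.
t = np.linspace(0, 2 * np.pi, 10001)
X = np.stack([np.cos(t), np.sin(t)], axis=1)

l1 = np.abs(X).sum(axis=1)    # ||x||_1
linf = np.abs(X).max(axis=1)  # ||x||_inf
l2sq = (X**2).sum(axis=1)     # ||x||_2^2, constant on the circle

# ||x||_1 attains its minimum value 1 only at (+-1, 0) and (0, +-1):
assert np.isclose(l1.min(), 1.0)
minimizers = X[np.isclose(l1, l1.min(), atol=1e-6)]
assert np.all(np.isclose(np.abs(minimizers).max(axis=1), 1.0, atol=1e-3))

# ...while ||x||_inf is minimized at the diagonals, |x1| = |x2| = 1/sqrt(2),
assert np.isclose(linf.min(), 1 / np.sqrt(2), atol=1e-6)
# and ||x||_2^2 is constant, so every unit vector minimizes it.
assert np.allclose(l2sq, 1.0)
```

So only the ℓ₀ and ℓ₁ terms single out the four sparse unit vectors, matching the solution above.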