CS 189 Spring 2014  Introduction to Machine Learning
Midterm

• You have 2 hours for the exam.
• The exam is closed book, closed notes except your one-page crib sheet.
• Please use non-programmable calculators only.
• Mark your answers ON THE EXAM ITSELF. If you are not sure of your answer you may wish to provide a brief explanation.
• For true/false questions, fill in the True/False bubble.
• For multiple-choice questions, fill in the bubbles for ALL CORRECT CHOICES (in some cases, there may be more than one). We have introduced a negative penalty for false positives on the multiple-choice questions such that the expected value of randomly guessing is 0. Don't worry: for this section, your score will be the maximum of your score and 0, so you cannot incur a negative score for this section.

First name
Last name
SID
First and last name of student to your left
First and last name of student to your right

For staff use only:
Q1. True or False                                            /10
Q2. Multiple Choice                                          /24
Q3. Decision Theory                                          /8
Q4. Kernels                                                  /14
Q5. L2-Regularized Linear Regression with Newton's Method    /8
Q6. Maximum Likelihood Estimation                            /8
Q7. Affine Transformations of Random Variables               /13
Q8. Generative Models                                        /15
Total                                                        /100

Q1. [10 pts] True or False

(a) [1 pt] The hyperparameters in the regularized logistic regression model are η (learning rate) and λ (regularization term).  ○ True  ○ False
(b) [1 pt] The objective function used in L2-regularized logistic regression is convex.  ○ True  ○ False
(c) [1 pt] In SVMs, the values of α_i for non-support vectors are 0.  ○ True  ○ False
(d) [1 pt] As the number of data points approaches infinity, the error rate of a 1-NN classifier approaches 0.  ○ True  ○ False
(e) [1 pt] Cross-validation will guarantee that our model does not overfit.  ○ True  ○ False
(f) [1 pt] As the number of dimensions increases, the percentage of the unit ball's volume that lies in the shell of thickness ε grows.  ○ True  ○ False
(g) [1 pt] In logistic regression, the Hessian of the (non-regularized) log likelihood is positive definite.  ○ True  ○ False
(h) [1 pt] Given a binary classification scenario with Gaussian class conditionals and equal prior probabilities, the optimal decision boundary will be linear.  ○ True  ○ False
(i) [1 pt] In the primal version of SVM, we are minimizing the Lagrangian with respect to w, and in the dual version, we are minimizing the Lagrangian with respect to α.  ○ True  ○ False
(j) [1 pt] For the dual version of soft-margin SVM, the α_i's for support vectors satisfy α_i ≤ C.  ○ True  ○ False

Q2. [24 pts] Multiple Choice

(a) [3 pts] Consider the binary classification problem where y ∈ {0, 1} is the label and we have prior probability P(y = 0) = π₀. If we model P(x | y = 1) to be the following distributions, which one(s) will cause the posterior P(y = 1 | x) to have a logistic function form?
○ Gaussian   ○ Poisson   ○ Uniform   ○ None of the above

(b) [3 pts] Given the following data samples (square and triangle belong to two different classes), which one(s) of the following algorithms can produce zero training error?
[Figure: scatter plot of the two classes]
○ 1-nearest neighbor   ○ Support vector machine   ○ Logistic regression   ○ Linear discriminant analysis

(c) [3 pts] The following diagrams show the iso-probability contours for two different 2D Gaussian distributions. On the left side, the data follow N(0, I), where I is the identity matrix. The right side has the same set of contour levels as the left side. What is the mean and covariance matrix for the right side's multivariate Gaussian distribution?
[Figure: two contour plots, x and y axes each running from −5 to 5]
○ μ = [0, 0]ᵀ,  Σ = [[1, 0], [0, 1]]
○ μ = [0, 1]ᵀ,  Σ = [[1, 0], [0, 1]]
○ μ = [0, 1]ᵀ,  Σ = [[4, 0], [0, 0.25]]
○ μ = [0, 1]ᵀ,  Σ = [[2, 0], [0, 0.5]]

(d) [3 pts] Given the following data samples (square and triangle mean two classes), which one(s) of the following kernels can we use in SVM to separate the two classes?
[Figure: scatter plot of the two classes]
○ Linear kernel   ○ Polynomial kernel   ○ Gaussian RBF (radial basis function) kernel   ○ None of the above

(e) [3 pts] Consider the following plots of the contours of the unregularized error function along with the constraint region. What regularization term is used in this case?
[Figure: error contours with the constraint region]
○ L2   ○ L1   ○ L∞   ○ None of the above

(f) [3 pts] Suppose we have a covariance matrix

    Σ = [5  a]
        [a  4]

What is the set of values that a can take on such that Σ is a valid covariance matrix?
○ a ∈ ℝ
○ −√20 < a < √20
○ a ≥ 0
○ −√20 ≤ a ≤ √20
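As a quick numerical sanity check for (f), not part of the original exam: a symmetric matrix is a valid covariance matrix exactly when it is positive semidefinite. The sketch below assumes NumPy is available; the helper name is_valid_covariance and the test values of a are illustrative.

```python
import numpy as np

def is_valid_covariance(a, tol=1e-12):
    """Return True if [[5, a], [a, 4]] is a valid covariance matrix.

    A symmetric matrix is a valid covariance matrix exactly when it is
    positive semidefinite, i.e. all of its eigenvalues are >= 0.
    """
    sigma = np.array([[5.0, a], [a, 4.0]])
    eigvals = np.linalg.eigvalsh(sigma)   # eigenvalues of a symmetric matrix
    return bool(np.all(eigvals >= -tol))  # small tolerance for round-off

# For a 2x2 matrix with positive diagonal this reduces to the determinant
# condition 5*4 - a^2 >= 0, i.e. |a| <= sqrt(20).
for a in [0.0, 4.0, np.sqrt(20), 4.5, -5.0]:
    print(f"a = {a:6.3f} -> valid: {is_valid_covariance(a)}")
```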
(g) [3 pts] The soft-margin SVM formulation is as follows:

    minimize    (1/2) wᵀw + C Σ_{i=1}^{N} ξ_i
    subject to  y_i(wᵀx_i + b) ≥ 1 − ξ_i  ∀i,   ξ_i ≥ 0  ∀i

What is the behavior of the width of the margin (2/‖w‖) as C → 0?
○ Behaves like hard margin   ○ Goes to infinity   ○ Goes to zero   ○ None of the above

(h) [3 pts] In Homework 4, you fit a logistic regression model on spam and ham data for a Kaggle competition. Assume you had a very good score on the public test set, but when the GSIs ran your model on a private test set, your score dropped a lot. This is likely because you overfitted by submitting multiple times and changing the following between submissions:
○ λ, your penalty term   ○ η, your step size   ○ ε, your convergence criterion   ○ Fixing a random bug

(i) [0 pts] BONUS QUESTION (Answer this only if you have time and are confident of your other answers, because this is not extra points.) We have constructed the multiple-choice problems such that every false positive will incur some negative penalty. For one of these multiple-choice problems, given that there are p points, r correct answers, and k choices, what is the formula for the penalty such that the expected value of random guessing is equal to 0? (You may assume k ≥ r.)

    p / (k − r)

Q3. [8 pts] Decision Theory

Consider the following generative model for a 2-class classification problem, in which the class conditionals are Bernoulli distributions:

    p(ω₁) = π,   p(ω₂) = 1 − π

    x | ω₁ = 1 with probability 0.5,  0 with probability 0.5
    x | ω₂ = 1 with probability 0.5,  0 with probability 0.5

Assume the loss matrix

                           true class = ω₁   true class = ω₂
    predicted class = ω₁         0                λ₁₂
    predicted class = ω₂        λ₂₁                0

(a) [8 pts] Give a condition in terms of λ₁₂, λ₂₁, and π that determines when class 1 should always be chosen as the minimum-risk class.

Based on Bayes' rule, the posterior probability P(ω_i | x) is

    P(ω₁ | x) = P(x | ω₁) P(ω₁) / P(x) = π / (2 P(x))
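The risk comparison that Q3(a) asks for can also be written out directly. The sketch below is illustrative rather than part of the exam solution; the function name conditional_risks and the sample numbers are made up. Because both class conditionals are the same Bernoulli(0.5) distribution, x carries no information, the posterior P(ω₁ | x) equals the prior π, and the conditional risks do not depend on x.

```python
def conditional_risks(pi, lam12, lam21):
    """Conditional risks under the exam's loss matrix.

    With identical Bernoulli(0.5) class conditionals the posterior is
    P(w1 | x) = pi for every x, so:
        R(predict w1 | x) = lam12 * P(w2 | x) = lam12 * (1 - pi)
        R(predict w2 | x) = lam21 * P(w1 | x) = lam21 * pi
    """
    risk_pred1 = lam12 * (1.0 - pi)  # loss is incurred only when the true class is w2
    risk_pred2 = lam21 * pi          # loss is incurred only when the true class is w1
    return risk_pred1, risk_pred2

# Example values (made up): class 1 is the minimum-risk choice whenever
# lam12 * (1 - pi) <= lam21 * pi.
pi, lam12, lam21 = 0.6, 1.0, 2.0
r1, r2 = conditional_risks(pi, lam12, lam21)
print(f"R(predict w1) = {r1:.3f}, R(predict w2) = {r2:.3f}")
print("choose class 1" if r1 <= r2 else "choose class 2")
```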