Practical aspects of deep learning

If you have 10,000,000 examples, how would you split the train/dev/test set?
口 33% train, 33% dev, 33% test
口 60% train, 20% dev, 20% test
口 98% train, 1% dev, 1% test
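The 98/1/1 split can be sketched in a few lines; below is a minimal Python/NumPy illustration in which an index array stands in for the (hypothetical) 10,000,000 (x, y) examples.

```python
import numpy as np

# Hypothetical dataset: indices stand in for the 10,000,000 (x, y) examples.
m = 10_000_000
rng = np.random.default_rng(0)
indices = rng.permutation(m)

# With this much data, 1% is already 100,000 examples -- plenty for dev and
# test -- so a 98/1/1 split leaves almost everything for training.
n_dev = n_test = m // 100
train_idx = indices[: m - n_dev - n_test]
dev_idx = indices[m - n_dev - n_test : m - n_test]
test_idx = indices[m - n_test :]
```

The point of the question: dev and test sets only need to be big enough to give a reliable error estimate, so their *fraction* shrinks as the dataset grows.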
The dev and test set should:
口 Come from the same distribution
口 Come from different distributions
口 Be identical to each other (same (x, y) pairs)
口 Have the same number of examples
If your Neural Network model seems to have high bias, which of the following would be promising things to try? (Check all that apply.)
口 Add regularization
口 Get more test data
口 Increase the number of units in each hidden layer
口 Make the Neural Network deeper
口 Get more training data
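The two correct remedies above (more units per hidden layer, or more layers) both grow model capacity, which is what an underfitting (high-bias) model lacks. A quick way to see that is to count parameters; the layer sizes below are made-up examples, not from the quiz.

```python
# Sketch: a fully connected net gains capacity both by widening and by
# deepening. Layer sizes here are illustrative only.
def n_params(layer_dims):
    """Weights + biases of a fully connected net with the given layer sizes."""
    return sum(layer_dims[i] * layer_dims[i + 1] + layer_dims[i + 1]
               for i in range(len(layer_dims) - 1))

base   = [64, 16, 1]       # input, one hidden layer of 16 units, output
wider  = [64, 64, 1]       # more units in the hidden layer
deeper = [64, 16, 16, 1]   # one extra hidden layer
print(n_params(base), n_params(wider), n_params(deeper))  # 1057 4225 1329
```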
You are working on an automated check-out kiosk for a supermarket, and are building a classifier for apples, […]%, and a dev set error of 7%. Which of the following are promising things to try to improve your classifier? (Check all that apply.)
口 Increase the regularization parameter lambda
口 Decrease the regularization parameter lambda
口 Get more training data
口 Use a bigger neural network
What is weight decay?
口 A regularization technique (such as L2 regularization) that results in gradient descent shrinking the weights on every iteration.
口 The process of gradually decreasing the learning rate during training.
口 Gradual corruption of the weights in the neural network if it is trained on noisy data.
口 A technique to avoid vanishing […]
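The first option is the correct one, and the "shrinking" is easy to see in the update rule: with L2 regularization, gradient descent multiplies the weights by a factor slightly below 1 on every step. A minimal sketch with made-up values:

```python
import numpy as np

def l2_update(W, dW, alpha, lam, m):
    """One gradient descent step with L2 regularization:
    W <- (1 - alpha*lam/m) * W - alpha * dW.
    The (1 - alpha*lam/m) factor is the "decay" of the weights."""
    return (1 - alpha * lam / m) * W - alpha * dW

W = np.array([1.0, -2.0])
dW = np.zeros_like(W)          # zero gradient isolates the decay effect
W_new = l2_update(W, dW, alpha=0.1, lam=0.5, m=1)
print(W_new)                   # weights shrink by the factor 1 - 0.1*0.5 = 0.95
```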
Andrew Ng, Deep Learning Course 2: post-class quiz (docx version).