Pattern Classification VI
杜俊 (Jun Du), jundu@ustc.edu.cn
Outline
- Bayesian Decision Theory: how to make the optimal decision? The maximum a posteriori (MAP) decision rule.
- Generative Models: joint distribution of observation and label sequences; model estimation: MLE, Bayesian learning, discriminative training.
- Discriminative Models: model the posterior probability directly (discriminant function); logistic regression, support vector machine, neural network.
Statistical Models: Roadmap
- Continuous data: Gaussian (1-d) → multivariate Gaussian → GMM → CDHMM.
- Discrete data: multinomial → mixture of multinomials → DDHMM.
- Markov chain; graphical models.
- Estimation: ML, Bayesian, DT (discriminative training).
Model Parameter Estimation
- Maximum Likelihood (ML) Estimation: the most popular model estimation method; the EM (Expectation-Maximization) algorithm.
- Examples: univariate Gaussian distribution, multivariate Gaussian distribution, multinomial distribution, Gaussian mixture model, Markov chain model (n-gram for language modeling), hidden Markov model (HMM).
- Discriminative Training: minimum classification error (MCE), maximum mutual information (MMI).
- Bayesian Model Estimation: Bayesian theory.
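As a concrete instance of ML estimation, the univariate Gaussian has a closed-form maximizer of the log-likelihood: the sample mean and the (biased) sample variance. A minimal sketch (function names are illustrative, not from the slides):

```python
import math

def gaussian_mle(data):
    """ML estimates for a univariate Gaussian: the sample mean and the
    (biased) sample variance maximize the log-likelihood."""
    n = len(data)
    mu = sum(data) / n
    var = sum((x - mu) ** 2 for x in data) / n
    return mu, var

def gaussian_log_pdf(x, mu, var):
    """Log density of N(mu, var) evaluated at x."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)

mu, var = gaussian_mle([1.0, 2.0, 3.0, 4.0])
print(mu, var)  # 2.5 1.25
```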
Minimum Classification Error Estimation (I)
In an N-class pattern classification problem, given a set of training data D = {(X_1, c_1), ..., (X_T, c_T)}, estimate model parameters for all classes to minimize the total number of classification errors in D.
- MCE: minimize empirical classification errors.
- Objective function: total classification errors in D.
- For each training sample X from class i, define the misclassification measure
  d_i(X) = -g_i(X; Λ) + max_{j≠i} g_j(X; Λ),
  or, with log-likelihood discriminant functions,
  d_i(X) = -ln p(X|λ_i) + max_{j≠i} ln p(X|λ_j).
- If d_i(X) > 0: incorrect classification (1 error). If d_i(X) < 0: correct classification (0 errors).
Minimum Classification Error Estimation (II)
Soft-max: approximate the max in d_i(X) by a differentiable function:
d_i(X) = -g_i(X; Λ) + ln [ (1/(N-1)) Σ_{j≠i} exp(η g_j(X; Λ)) ]^{1/η},
or, with g_j(X; Λ) = ln p(X|λ_j),
d_i(X) = -ln p(X|λ_i) + ln [ (1/(N-1)) Σ_{j≠i} exp(η ln p(X|λ_j)) ]^{1/η},
where η > 1; the approximation approaches the true max as η → ∞.
Minimum Classification Error Estimation (III)
The error count for one sample is a step function H(d_i(X)). The total number of errors in the training set is
Q(Λ) = Σ_{t=1}^{T} H(d_{c_t}(X_t)).
The step function is not differentiable, so it is approximated by a sigmoid function l(d) = 1 / (1 + exp(-a·d)), giving the smoothed total error count
Q'(Λ) = Σ_{t=1}^{T} l(d_{c_t}(X_t)),
where a > 0 is a parameter to control its shape.
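The smoothed error count can be sketched directly from these definitions (a minimal illustration; the function names are assumptions):

```python
import math

def sigmoid(d, a=1.0):
    """Smooth 0/1 error: l(d) = 1 / (1 + exp(-a*d)); a > 0 controls
    how sharply l(d) approximates the step function H(d)."""
    return 1.0 / (1.0 + math.exp(-a * d))

def smoothed_error_count(d_values, a=1.0):
    """Q'(Lambda): sum of sigmoid losses over the misclassification
    measures of all training samples."""
    return sum(sigmoid(d, a) for d in d_values)

# Samples with d < 0 (correct) contribute ~0; d > 0 (errors) contribute ~1:
print(round(smoothed_error_count([-5.0, -4.0, 6.0], a=2.0), 3))  # ~1.0
```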
Minimum Classification Error Estimation (IV)
MCE estimation of model parameters for all classes:
Λ_MCE = arg min_Λ Q'(Λ)
Optimization: no closed-form solution is available; use an iterative gradient descent method (stochastic GD, batch mode, or mini-batch mode).
Minimum Classification Error Estimation (V)
1. Find initial model parameters (e.g., the ML estimates).
2. Derive the gradient of the objective function.
3. Evaluate the gradient at the current parameter values.
4. Update the model parameters: Λ^(n+1) = Λ^(n) − ε ∇Q'(Λ^(n)).
5. Iterate until convergence.
How to Calculate the Gradient?
By the chain rule,
∂Q'(Λ)/∂Λ = Σ_{t=1}^{T} [∂l(d_t)/∂d_t] · [∂d_t/∂Λ], with ∂l(d)/∂d = a · l(d) · (1 − l(d)).
The key practical issue in MCE training is to set a proper step size experimentally.
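The descent loop from the previous slide can be sketched generically. This is a simplified illustration that uses a central-difference numerical gradient in place of the analytic chain-rule gradient, and a toy quadratic objective standing in for Q'(Λ); all names are assumptions:

```python
def numerical_gradient(f, params, eps=1e-5):
    """Central-difference approximation of the gradient of f at params."""
    grad = []
    for k in range(len(params)):
        p_hi = params[:]; p_hi[k] += eps
        p_lo = params[:]; p_lo[k] -= eps
        grad.append((f(p_hi) - f(p_lo)) / (2 * eps))
    return grad

def gradient_descent(f, params, step=0.1, iters=200):
    """Plain batch gradient descent: params <- params - step * grad.
    The step size is the quantity that must be tuned experimentally."""
    for _ in range(iters):
        g = numerical_gradient(f, params)
        params = [p - step * gk for p, gk in zip(params, g)]
    return params

# Minimize a toy smooth objective (a stand-in for Q'(Lambda)):
best = gradient_descent(lambda p: (p[0] - 3.0) ** 2 + (p[1] + 1.0) ** 2,
                        [0.0, 0.0])
print(best)  # close to [3.0, -1.0]
```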
Overtraining (Overfitting)
A low classification error rate on the training set does not always lead to a low error rate on a new test set, due to overtraining.
Measuring Performance of MCE
[Figure: objective function and classification error (%) plotted over MCE training iterations.]
When to stop: monitor three quantities during MCE training — the objective function, the error rate on the training set, and the error rate on a test set.
Maximum Mutual Information Estimation (I)
The model is viewed as a noisy data-generation channel: class id ω → (noisy channel) → observation feature X. Maximize the mutual information between ω and X:
I(ω; X) = E[ log ( p(ω, X) / (p(ω) p(X)) ) ] = E[ log ( p(X|ω) / p(X) ) ]
Λ_MMI = arg max_Λ I(ω; X)
Maximum Mutual Information Estimation (II)
- Difficulty: the joint distribution p(ω, X) is unknown.
- Solution: collect a representative training set {(X_1, ω_1), ..., (X_T, ω_T)} and approximate the expectation empirically:
Λ_MMI = arg max_Λ Σ_{t=1}^{T} log [ p(X_t|λ_{ω_t}) P(ω_t) / Σ_j p(X_t|λ_j) P(ω_j) ]
- Optimization: iterative gradient-ascent method, or the growth-transformation method.
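The empirical MMI criterion is the sum of log posteriors of the correct classes, which can be sketched as follows (a toy illustration with hypothetical function names; the class models are assumed unit-variance Gaussians):

```python
import math

def mmi_objective(data, labels, log_like, priors):
    """Empirical MMI criterion: for each training pair, add the log posterior
    of the correct class, log[p(x|c)P(c) / sum_j p(x|j)P(j)]."""
    total = 0.0
    for x, c in zip(data, labels):
        joint = [log_like(x, j) + math.log(priors[j])
                 for j in range(len(priors))]
        m = max(joint)  # log-sum-exp for the denominator log p(x)
        log_px = m + math.log(sum(math.exp(v - m) for v in joint))
        total += joint[c] - log_px
    return total

# Toy example: two unit-variance Gaussian classes with means 0 and 4.
means = [0.0, 4.0]
ll = lambda x, j: -0.5 * (x - means[j]) ** 2
score = mmi_objective([0.1, 3.9], [0, 1], ll, [0.5, 0.5])
print(score)  # near 0: both points are confidently classified
```

The criterion is always ≤ 0 and approaches 0 as the models separate the classes perfectly, which is why maximizing it sharpens class discrimination.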
Bayesian Model Estimation
Bayesian methods view model parameters as random variables having some known prior distribution.
- Prior specification: specify the prior distribution of the model parameters θ as p(θ).
- Bayesian learning: the training data D allow us to convert the prior distribution into a posterior distribution.
Bayesian Learning
Posterior ∝ Likelihood × Prior: p(θ|D) ∝ p(D|θ) p(θ)
[Figure: likelihood p(D|θ) and posterior p(θ|D) as functions of θ, with the modes θ_ML and θ_MAP marked.]
MAP Estimation
Make a point estimate of θ based on the posterior distribution:
θ_MAP = arg max_θ p(θ|D) = arg max_θ p(D|θ) p(θ)
Then θ_MAP is treated as the estimate of the model parameters, just like the ML estimate. Sometimes the EM algorithm is needed to derive it. MAP estimation optimally combines prior knowledge with the new information provided by the data.
MAP estimation is used in speech recognition to adapt speech models to a particular speaker, to cope with various accents:
- Start from a generic speaker-independent speech model (the prior).
- Collect a small set of data from a particular speaker.
- The MAP estimate gives a speaker-adaptive model which suits this particular speaker better.
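The arg max above can be illustrated with a simple grid search over candidate parameters (a deliberately naive sketch; real MAP estimates typically use a closed form or EM, and all names here are assumptions):

```python
def map_point_estimate(log_likelihood, log_prior, grid):
    """Generic MAP point estimate: maximize log p(D|theta) + log p(theta)
    over a grid of candidate parameter values."""
    return max(grid, key=lambda th: log_likelihood(th) + log_prior(th))

# Toy case: unit-variance Gaussian data, Gaussian prior N(0, 1) on the mean.
data = [1.0, 2.0, 3.0]
loglik = lambda th: -0.5 * sum((x - th) ** 2 for x in data)
logprior = lambda th: -0.5 * th ** 2
grid = [i * 0.01 for i in range(-100, 401)]
print(map_point_estimate(loglik, logprior, grid))
# near 1.5: the ML estimate 2.0 is pulled toward the prior mean 0
```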
How to Specify Priors
- Noninformative priors: without enough prior knowledge, just use a flat prior.
- Conjugate priors (for computational convenience): after Bayesian learning, the posterior has exactly the same functional form as the prior, except that all parameters are updated. Not every model has a conjugate prior.
Conjugate Prior
For a univariate Gaussian model with only the mean unknown,
p(x|μ) = (1/(√(2π) σ)) exp[ −(x−μ)² / (2σ²) ],
the conjugate prior of the Gaussian mean is itself Gaussian:
p(μ) = (1/(√(2π) σ₀)) exp[ −(μ−μ₀)² / (2σ₀²) ].
After observing a new data point x, the posterior is still Gaussian, p(μ|x) = N(μ; μ₁, σ₁²), with
μ₁ = (σ₀² x + σ² μ₀) / (σ₀² + σ²),  σ₁² = σ₀² σ² / (σ₀² + σ²).
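The posterior update above is a two-line computation; a minimal sketch with an illustrative function name:

```python
def gaussian_posterior(mu0, sigma0_2, x, sigma2):
    """Posterior over the Gaussian mean after one observation x ~ N(mu, sigma2),
    given the prior mu ~ N(mu0, sigma0_2). Conjugacy keeps it Gaussian:
    mu1 = (sigma0^2 x + sigma^2 mu0) / (sigma0^2 + sigma^2),
    var1 = sigma0^2 sigma^2 / (sigma0^2 + sigma^2)."""
    mu1 = (sigma0_2 * x + sigma2 * mu0) / (sigma0_2 + sigma2)
    var1 = (sigma0_2 * sigma2) / (sigma0_2 + sigma2)
    return mu1, var1

mu1, var1 = gaussian_posterior(mu0=0.0, sigma0_2=1.0, x=2.0, sigma2=1.0)
print(mu1, var1)  # 1.0 0.5 — the mean moves halfway, the variance shrinks
```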
The Sequential MAP Estimate of Gaussian
For a univariate Gaussian with unknown mean, the MAP estimate of the mean after observing x₁ is
μ̂₁ = (σ₀² x₁ + σ² μ₀) / (σ₀² + σ²), with posterior variance σ₁² = σ₀² σ² / (σ₀² + σ²).
After observing the next data point x₂, treat the previous posterior as the new prior:
μ̂₂ = (σ₁² x₂ + σ² μ̂₁) / (σ₁² + σ²),  σ₂² = σ₁² σ² / (σ₁² + σ²),
and so on for each new observation.
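The recursion can be sketched as a loop that repeatedly feeds the posterior back in as the prior (function name is an assumption):

```python
def sequential_map(mu0, sigma0_2, xs, sigma2):
    """Sequential MAP estimate of a Gaussian mean with known variance sigma2:
    after each observation, the posterior becomes the prior for the next."""
    mu, var = mu0, sigma0_2
    for x in xs:
        mu = (var * x + sigma2 * mu) / (var + sigma2)
        var = (var * sigma2) / (var + sigma2)
    return mu, var

# Processing x1, x2 one at a time gives the same answer as the batch posterior:
mu, var = sequential_map(0.0, 1.0, [2.0, 4.0], 1.0)
print(mu, var)  # 2.0 and 1/3
```

Note that the posterior variance shrinks with every observation, so later data points move the estimate less and less — the estimate converges as more data arrive.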
Project: Building a 2-Class Classifier
Given some data from two classes:
- Build a classifier with multivariate Gaussian models (ML estimation).
- Test with the plug-in MAP decision rule.
- Improve it with GMM models: initialize the GMM with K-means clustering; estimate the GMM with the EM algorithm; investigate GMMs with mixture numbers 4 and 8.
- Improve the Gaussian classifier with discriminative training (minimum classification error estimation).
- Preferably program in C/C++.
- Report all of your experiments and your best classifier.
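The first two project steps (ML-trained Gaussian class models plus the plug-in MAP decision rule) can be sketched as below. This is a Python illustration simplified to 1-d features for brevity — the project asks for multivariate Gaussians and prefers C/C++, but the structure carries over directly; all names and the toy data are assumptions:

```python
import math

def fit_gaussian(data):
    """ML fit of a univariate Gaussian per class (sample mean / variance)."""
    n = len(data)
    mu = sum(data) / n
    var = sum((x - mu) ** 2 for x in data) / n
    return mu, var

def classify(x, models, priors):
    """Plug-in MAP rule: choose the class maximizing
    log p(x|class) + log P(class)."""
    scores = []
    for (mu, var), pr in zip(models, priors):
        scores.append(-0.5 * (math.log(2 * math.pi * var)
                              + (x - mu) ** 2 / var) + math.log(pr))
    return scores.index(max(scores))

# Toy training data for the two classes:
class0 = [0.9, 1.1, 1.0, 0.8]
class1 = [4.1, 3.9, 4.2, 3.8]
models = [fit_gaussian(class0), fit_gaussian(class1)]
print(classify(1.2, models, [0.5, 0.5]))  # 0
print(classify(3.5, models, [0.5, 0.5]))  # 1
```

From here, the GMM steps replace `fit_gaussian` with K-means initialization plus EM re-estimation, and the MCE step replaces the ML parameters with ones tuned by gradient descent on the smoothed error count.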