MULTI-REGULARIZATION PARAMETERS ESTIMATION FOR GAUSSIAN MIXTURE CLASSIFIER BASED ON MDL PRINCIPLE

Xiuling Zou (1,2), Ping Guo (1) and C. L. Philip Chen (3)
(1) Laboratory of Image Processing and Pattern Recognition, Beijing Normal University, Beijing, 100875, China
(2) Artificial Intelligence Institute, Beijing City University, Beijing, China
(3) Faculty of Science & Technology, University of Macau, Macau SAR, China
zxlmouse@bcu.edu.cn, {pguo, philipchen}@ieee.org

Keywords: Gaussian classifier, Covariance matrix estimation, Multi-regularization parameters selection, Minimum description length

Abstract: Regularization is a solution to the problem of unstable estimation of the covariance matrix with a small sample set in a Gaussian classifier. Estimating multiple regularization parameters is more difficult than estimating a single parameter. In this paper, the KLIM_L covariance matrix estimator is derived theoretically, based on the MDL (minimum description length) principle, for the small sample problem with high dimension. KLIM_L is a generalization of KLIM (Kullback-Leibler information measure) which considers the local difference in each dimension. Under the framework of the MDL principle, the multi-regularization parameters are selected by the criterion of minimizing the KL divergence and estimated simply and directly by point estimation, which is approximated by a second-order Taylor expansion. It costs less computation time to estimate the multi-regularization parameters in KLIM_L than in RDA (regularized discriminant analysis) and in LOOC (leave-one-out covariance matrix estimate), where the cross-validation technique is adopted. A higher classification accuracy is also achieved by the proposed KLIM_L estimator in experiments.

1 INTRODUCTION

The Gaussian mixture model (GMM) has been widely used in real pattern recognition problems for clustering and classification, where the maximum likelihood criterion is adopted to estimate the model parameters from the training samples (Bishop, 2007); (Everitt, 1981). However, it often suffers from the small sample size problem with high dimensional data. In this case, for d-dimensional data, if fewer than d+1 training samples are available from each class, the sample covariance matrix estimate in the Gaussian classifier is singular, and this can lead to lower classification accuracy.

Regularization is a solution to this kind of problem. Shrinkage and regularized covariance estimators are examples of such techniques. Shrinkage estimators are a widely used class of estimators which regularize the covariance matrix by shrinking it towards some positive definite target structure, such as the identity matrix or the diagonal of the sample covariance (Friedman, 1989); (Hoffbeck, 1996); (Schafer, 2005); (Srivastava, 2007). More recently, a number of methods have been proposed for regularizing the covariance estimate by constraining the estimate of the covariance or of its inverse to be sparse (Bickel, 2008); (Friedman, 2008); (Cao, 2011).

The above regularization methods mainly consider various mixtures of the sample covariance matrix, the common covariance matrix and the identity matrix, or constrain the estimate of the covariance or of its inverse to be sparse. In these methods, the regularization parameters are required to be determined by the cross-validation technique. Although the regularization methods have been successfully applied to classifying small-sample data with some heuristic approximations (Friedman, 1989); (Hoffbeck, 1996), the selection of regularization parameters by cross validation is very computation-expensive. Moreover, cross-validation does not always perform well for the selection of linear models in some cases (Rivals, 1999). Recently, a covariance matrix estimator called the Kullback-Leibler information measure (KLIM) estimator was developed based on the minimum description length (MDL) principle for small sample sets with high dimensional data (Guo, 2008).

The KLIM estimator is derived theoretically from the KL divergence, and a formula for fast estimation of the regularization parameter is derived. However, since multi-parameter optimization is more difficult than single-parameter optimization, only the special case in which the regularization parameters take the same value in all dimensions is considered in KLIM. Though the estimation of the regularization parameter becomes simple, the accuracy of the covariance matrix estimate is decreased by ignoring the local difference in each dimension, and this finally decreases the classification accuracy of the Gaussian classifier.

In this paper, KLIM is generalized to KLIM_L, which considers the local difference in each dimension. Based on the MDL principle, the KLIM_L covariance matrix estimator is derived for the small sample problem with high dimension. The multi-regularization parameters, one per dimension, are selected by the criterion of minimizing the KL divergence and are estimated efficiently by a second-order Taylor expansion. The feasibility and efficiency of KLIM_L are shown by the experiments.

2 THEORETICAL BACKGROUND

2.1 Gaussian Mixture Classifier

Given a data set D = {x_j}, j = 1, ..., n, which is to be classified, assume that the data points in D are sampled from a Gaussian mixture model which has k components:

    p(x, \Theta) = \sum_{i=1}^{k} \alpha_i G(x, m_i, \Sigma_i),  with  \sum_{i=1}^{k} \alpha_i = 1,    (1)

where

    G(x, m_i, \Sigma_i) = (2\pi)^{-d/2} |\Sigma_i|^{-1/2} \exp\{ -\tfrac{1}{2} (x - m_i)^T \Sigma_i^{-1} (x - m_i) \}    (2)

is a general multivariate Gaussian density function, x is a random vector, the dimension of x is d, and \Theta = \{\alpha_i, m_i, \Sigma_i\}_{i=1}^{k} is the parameter vector set of the Gaussian mixture model.

In the case of the Gaussian mixture model, the Bayesian decision rule i* = arg max_i p(i | x, Θ) is adopted to classify the vector x into the class i with the largest posterior probability p(i | x, Θ). After the model parameters are estimated by the maximum likelihood (ML) method with the expectation-maximization (EM) algorithm (Redner, 1984), the posterior probability can be written in the form:

    p(i \mid x, \hat{\Theta}) = \frac{\hat{\alpha}_i G(x, \hat{m}_i, \hat{\Sigma}_i)}{p(x, \hat{\Theta})},  i = 1, 2, \ldots, k,    (3)

and the classification rule becomes:

    i^* = \arg\min_i \delta_i(x),  i = 1, 2, \ldots, k,    (4)

where

    \delta_i(x) = (x - \hat{m}_i)^T \hat{\Sigma}_i^{-1} (x - \hat{m}_i) + \ln|\hat{\Sigma}_i| - 2 \ln \hat{\alpha}_i.    (5)

This equation is often called the discriminant function for the i-th class (Aeberhard, 1994). Since clustering is more general than classification in the mixture model analysis case, we consider the general case in the following.
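To make the decision rule of Eqs. (4)-(5) concrete, the following is a minimal NumPy sketch of the quadratic discriminant and the resulting classifier; the function and variable names are illustrative and are not taken from the paper.

    import numpy as np

    def discriminant(x, mean, cov, prior):
        # Quadratic discriminant of Eq. (5); a smaller value means a more probable class.
        diff = x - mean
        _, logdet = np.linalg.slogdet(cov)
        maha = diff @ np.linalg.solve(cov, diff)   # (x - m)^T Sigma^{-1} (x - m)
        return maha + logdet - 2.0 * np.log(prior)

    def classify(x, means, covs, priors):
        # Bayesian decision rule of Eq. (4): choose the class with the smallest discriminant.
        scores = [discriminant(x, m, S, a) for m, S, a in zip(means, covs, priors)]
        return int(np.argmin(scores))

With a singular ML covariance the linear solve above fails, which is exactly the situation the regularized estimators discussed next are meant to avoid.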

2.2 Covariance Matrix Estimation

The central idea of the MDL principle is to represent an entire class of probability distributions (models) by a single universal representative model, such that it is able to imitate the behaviour of any model in the class. The best model class for a set of observed data is the one whose representative permits the shortest coding of the data. The MDL estimates of both the parameters and their total number are consistent; i.e., the estimates converge and the limit specifies the data-generating model (Rissanen, 1978); (Barron, 1998). The code-length criterion of MDL (for a probability distribution or a model) involves the KL divergence (Kullback, 1959).

Now consider a given sample data set D = {x_j} generated from an unknown density p(x); it can be modelled by a finite Gaussian mixture density p(x, Θ), where Θ is the parameter set. In the absence of knowledge of p(x), the density may also be estimated by an empirical kernel density estimate p_Γ(x) obtained from the data set. Because these two probability densities describe the same unknown density p(x), they should be best matched with proper mixture parameters and smoothing parameters. According to the MDL principle, the model parameters should be estimated by minimizing the KL divergence KL(Θ, Γ) based on the given data drawn from the unknown density p(x) (Kullback, 1959):

    KL(\Theta, \Gamma) = \int p_\Gamma(x) \ln \frac{p_\Gamma(x)}{p(x, \Theta)} dx,    (6)

with

    p_\Gamma(x) = \frac{1}{n} \sum_{j=1}^{n} G(x, x_j, W) = \frac{1}{n} \sum_{j=1}^{n} (2\pi)^{-d/2} |W|^{-1/2} \exp\{ -\tfrac{1}{2} (x - x_j)^T W^{-1} (x - x_j) \}.    (7)

Here W is a d-dimensional diagonal matrix with the general form

    W = \mathrm{diag}(\gamma_1, \gamma_2, \ldots, \gamma_d),    (8)

where γ_1, γ_2, ..., γ_d are smoothing parameters (or regularization parameters) of the nonparametric kernel density. In the following this set is denoted as Γ = {γ_l}.

Eq. (6) equals zero if and only if p_Γ(x) = p(x, Θ). In the limit Γ → 0, the kernel density function p_Γ(x) becomes a δ function, and Eq. (6) reduces to the negative log-likelihood function. So the ordinary EM algorithm can be re-derived based on the minimization of this KL divergence function in the limit Γ → 0. The ML-estimated parameters are as follows:

    \hat{m}_i = \frac{1}{n \hat{\alpha}_i} \sum_{j=1}^{n} p(i \mid x_j, \hat{\Theta})\, x_j,
    \hat{\Sigma}_i = \frac{1}{n \hat{\alpha}_i} \sum_{j=1}^{n} p(i \mid x_j, \hat{\Theta})\,(x_j - \hat{m}_i)(x_j - \hat{m}_i)^T,    (9)

with \hat{\alpha}_i = \frac{1}{n} \sum_{j=1}^{n} p(i \mid x_j, \hat{\Theta}).

The covariance matrix estimate for Γ ≠ 0 is obtained as follows. By minimizing Eq. (6) with respect to Σ_i, i.e., setting ∂KL(Θ, Γ)/∂Σ_i = 0, the following covariance matrix estimation formula can be obtained:

    \hat{\Sigma}_i(\Gamma) = \frac{\int p_\Gamma(x)\, p(i \mid x, \hat{\Theta})\,(x - \hat{m}_i)(x - \hat{m}_i)^T dx}{\int p_\Gamma(x)\, p(i \mid x, \hat{\Theta})\, dx}.    (10)

In this case, the Taylor expansion of p(i | x, Θ̂) at x = x_j with respect to x is used, expanded to a first-order approximation:

    p(i \mid x, \hat{\Theta}) \approx p(i \mid x_j, \hat{\Theta}) + (x - x_j)^T \nabla_x p(i \mid x, \hat{\Theta})\big|_{x = x_j}.    (11)

On substituting the above equation into Eq. (10) and using the properties of the probability density function, the following approximation is finally derived:

    \hat{\Sigma}_i(\Gamma) \approx W + \hat{\Sigma}_i.    (12)

The estimator in Eq. (12) is called KLIM_L in this paper, where Σ̂_i is the ML estimate obtained when Γ → 0, taking the form of Eq. (9).
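As a small illustration of Eqs. (9) and (12), the sketch below forms the posterior-weighted ML estimates for one component and then adds the diagonal bandwidth matrix W = diag(γ_1, ..., γ_d). It assumes the responsibilities p(i | x_j, Θ) are already available from an EM pass, and the helper names are illustrative rather than the authors' own.

    import numpy as np

    def ml_mean_cov(X, resp):
        # Posterior-weighted ML estimates of Eq. (9) for a single component;
        # resp[j] holds p(i | x_j, Theta) for that component.
        w = resp / resp.sum()
        mean = w @ X
        diff = X - mean
        cov = (diff * w[:, None]).T @ diff
        return mean, cov

    def klim_l_covariance(ml_cov, gamma):
        # KLIM_L estimate of Eq. (12): the ML covariance plus W = diag(gamma),
        # which keeps the estimate non-singular even when fewer than d+1 samples are available.
        return ml_cov + np.diag(gamma)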

3 MULTI-REGULARIZATION PARAMETERS ESTIMATION BASED ON MDL

3.1 Regularization Parameters Selection

The regularization parameter set Γ of the Gaussian kernel density plays an important role in estimating the mixture model parameters. Different Γ will generate different models, so selecting the regularization parameters is a model selection problem. In this paper a method similar to that of (Guo, 2008) is adopted, based on the MDL principle, to select the regularization parameters in KLIM_L. According to the MDL principle, the model with the shortest code-length should be selected. When Γ ≠ 0, the regularization parameters can be estimated by minimizing the KL divergence with respect to the ML-estimated parameters Θ̂:

    \Gamma^* = \arg\min_\Gamma J(\Gamma),  J(\Gamma) = KL(\hat{\Theta}, \Gamma).    (13)

A second-order approximation is adopted here for estimating the regularization parameters. Rewrite J(Γ) as:

    J(\Gamma) = J_1(\Gamma) + J_e(\Gamma),    (14)

where J_1(Γ) = -∫ p_Γ(x) ln p(x, Θ̂) dx and J_e(Γ) = ∫ p_Γ(x) ln p_Γ(x) dx.

Replacing ln p(x, Θ̂) in the integral of J_1(Γ) by its second-order Taylor expansion results in the following approximation of J_1(Γ):

    J_1(\Gamma) \approx -\frac{1}{n} \sum_{j=1}^{n} \Big[ \ln p(x_j, \hat{\Theta}) + \tfrac{1}{2}\, \mathrm{trace}\big( W\, \nabla_x \nabla_x \ln p(x, \hat{\Theta})\big|_{x = x_j} \big) \Big].    (15)

For a very sparse data distribution, the following approximation can be used:

    p_\Gamma(x) \ln p_\Gamma(x) \approx \frac{1}{n} \sum_{j=1}^{n} G(x, x_j, W) \Big[ -\ln n - \tfrac{d}{2}\ln 2\pi - \tfrac{1}{2}\ln|W| - \tfrac{1}{2}(x - x_j)^T W^{-1}(x - x_j) \Big].    (16)

Substituting Eq. (16) into J_e(Γ), it can be obtained that:

    J_e(\Gamma) = \int p_\Gamma(x) \ln p_\Gamma(x)\, dx \approx -\ln n - \tfrac{d}{2}\ln 2\pi - \tfrac{1}{2}\ln|W| - \tfrac{d}{2}.    (17)

So far, the approximation formula of J(Γ) is obtained:

    J(\Gamma) \approx -\frac{1}{2n} \sum_{j=1}^{n} \mathrm{trace}\big( W\, \nabla_x \nabla_x \ln p(x, \hat{\Theta})\big|_{x = x_j} \big) - \tfrac{1}{2}\ln|W| + C,    (18)

where C is a constant irrelevant to Γ. Let H = -\frac{1}{n}\sum_{j=1}^{n} \nabla_x \nabla_x \ln p(x, \hat{\Theta})\big|_{x = x_j}. Taking the partial derivative of J(Γ) with respect to W and setting it equal to zero, a rough approximation formula of Γ is obtained as follows:

    W \approx [\mathrm{diag}(H)]^{-1}.    (19)

3.2 Approximation for Regularization Parameters

Eq. (19) can be rewritten as follows:

    W \approx [\mathrm{diag}(H)]^{-1} = \Big[ \mathrm{diag}\Big( -\frac{1}{n}\sum_{j=1}^{n} \nabla_x \nabla_x \ln p(x, \hat{\Theta})\big|_{x = x_j} \Big) \Big]^{-1},

with

    \nabla_x \nabla_x \ln p(x, \Theta) = \sum_{i=1}^{k} p(i \mid x, \Theta)\big[ \Sigma_i^{-1}(x - m_i)(x - m_i)^T \Sigma_i^{-1} - \Sigma_i^{-1} \big]
        - \Big[ \sum_{i=1}^{k} p(i \mid x, \Theta)\, \Sigma_i^{-1}(x - m_i) \Big] \Big[ \sum_{i=1}^{k} p(i \mid x, \Theta)\, \Sigma_i^{-1}(x - m_i) \Big]^T.

Considering the hard-cut case (p(i | x_j, Θ) ≈ 0 or 1) and using the approximations \sum_j p(i \mid x_j, \Theta)(x_j - m_i)(x_j - m_i)^T \approx n_i \hat{\Sigma}_i, \sum_j p(i \mid x_j, \Theta) \approx n_i and \Sigma_i^{-1}\hat{\Sigma}_i \approx I, so that H ≈ \sum_i (n_i/n)\, \Sigma_i^{-1} ≈ \Sigma^{-1} for a common covariance matrix Σ, it can be obtained that:

    W \approx [\mathrm{diag}(\Sigma^{-1})]^{-1}.    (20)

Suppose the eigenvalues and eigenvectors of the common covariance matrix Σ are λ_l and φ_l, l = 1, 2, ..., d, where \Sigma = \sum_{l=1}^{d} \lambda_l \phi_l \phi_l^T; then there exists:

    \Sigma^{-1} = \sum_{l=1}^{d} \frac{1}{\lambda_l} \phi_l \phi_l^T.    (21)

Substituting Eq. (21) into Eq. (20) and using the average eigenvalue \bar{\lambda} = (\sum_l \lambda_l)/d = \mathrm{trace}(\Sigma)/d to substitute each eigenvalue of the matrix Σ (only in the denominator), it can be obtained that:

    W \approx \bar{\lambda}^2\, [\mathrm{diag}(\Sigma)]^{-1},  with  \mathrm{diag}(\Sigma) = \mathrm{diag}(\Sigma_{11}, \Sigma_{22}, \ldots, \Sigma_{dd}).

Finally, the regularization parameters can be approximated by the following equation:

    W \approx \Big( \frac{\mathrm{trace}(\Sigma)}{d} \Big)^2 \mathrm{diag}\Big( \frac{1}{\Sigma_{11}}, \frac{1}{\Sigma_{22}}, \ldots, \frac{1}{\Sigma_{dd}} \Big).    (22)

3.3 Comparison of KLIM_L with Regularization Methods

The four methods KLIM_L, KLIM, RDA (regularized discriminant analysis, Friedman, 1989) and LOOC (leave-one-out covariance matrix estimate, Hoffbeck, 1996) are all regularization methods for estimating the covariance matrix in the small sample size problem, and they all combine the ML-estimated covariance matrix with additional matrices. KLIM_L is derived in a similar way to KLIM under the framework of the MDL principle, and the estimation of its regularization parameters is likewise based on the MDL principle. KLIM_L is a generalization of KLIM: multiple regularization parameters are included and estimated in KLIM_L, while only one regularization parameter is estimated in KLIM. For every γ_l (l = 1, 2, ..., d), if it is taken as trace(Σ)/d, then W = (trace(Σ)/d) I; this reduces to the case W = λI of KLIM, where λ = trace(Σ)/d. KLIM_L is derived based on the MDL principle, while RDA and LOOC are heuristically proposed; they differ in the mixtures of covariance matrices considered and in the criterion used to select the regularization parameters.
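The following sketch computes the diagonal bandwidth matrix from a common (pooled) covariance under the reconstruction of Eq. (22) given above, γ_l = (trace(Σ)/d)^2 / Σ_ll, together with the single KLIM parameter λ = trace(Σ)/d of Section 3.3. It is an assumption-laden illustration, not the authors' implementation.

    import numpy as np

    def klim_l_gamma(common_cov):
        # Per-dimension regularization parameters following the Eq. (22) reconstruction:
        # gamma_l = (trace(Sigma)/d)^2 / Sigma_ll.
        d = common_cov.shape[0]
        mean_eig = np.trace(common_cov) / d        # average eigenvalue of Eq. (21)
        return (mean_eig ** 2) / np.diag(common_cov)

    def klim_lambda(common_cov):
        # Single KLIM parameter of Section 3.3: lambda = trace(Sigma)/d, i.e. W = lambda * I.
        return np.trace(common_cov) / common_cov.shape[0]

Both quantities are point estimates computed once from the pooled covariance, which is why no cross-validation loop is needed, in contrast to RDA and LOOC.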

The four regularization discriminant methods require different amounts of computation time. Sorted in increasing order of time cost, they are: KLIM, KLIM_L, LOOC and RDA. This is validated by the following experiments.

4 EXPERIMENTAL RESULTS

In this section, the classification accuracy and time cost of KLIM_L are compared with LDA (linear discriminant analysis, Aeberhard, 1994), RDA, LOOC and KLIM on the COIL-20 object data (Nene, 1996). COIL-20 is a database of gray-scale images of 20 objects. The objects were placed on a motorized turntable against a black background, and the turntable was rotated through 360 degrees to vary the object pose with respect to a fixed camera. Images of the objects were taken at pose intervals of 5 degrees, which corresponds to 72 images per object. The total number of images is 1440 and the size of each image is 128 x 128.

In the experiments, the regularization parameter of KLIM is estimated by λ = trace(Σ)/d, and the parameter matrix W of KLIM_L is estimated by Eq. (22). In RDA, the values of the two mixing parameters are sampled on a coarse grid (0, 0.25, 0.5, 0.75, 1), resulting in 25 grid points. In LOOC, the four parameters are taken according to the table in (Hoffbeck, 1996).

Six images are randomly selected as training samples from each class to estimate the mean and covariance matrix, and the remaining images are employed as testing samples to verify the classification accuracy. Since the dimension of the image data, 128 x 128, is very high, PCA is adopted here to reduce the data dimension. Experiments are performed with five different numbers of dimensions. Each experiment is run 5 times, and the mean and standard deviation of the classification accuracy are reported as results. The results of the experiments are shown in Table 1 and Table 2.

In the experiments, the classification accuracy of KLIM_L is the best among the five compared methods, while the classification accuracy of KLIM is the second best. The classification accuracy of LOOC is the worst among the compared methods except in dimension 8, where the accuracy of LOOC is higher than that of LDA. Considering the time cost of estimating the regularization parameters, KLIM_L needs a little more time than KLIM, while RDA and LOOC need much more time than KLIM_L. The experimental results are consistent with the theoretical analysis.

Table 1: Mean classification accuracy (standard deviation in parentheses) on the COIL-20 object database.

    dim    LDA       RDA       LOOC      KLIM      KLIM_L
    8      86(6)     87()      8(8)      877(8)    9()
    7      836()     864(8)    8(8)      87(7)     879(8)
    6      85()      868()     88(3)     877(4)    88(6)
    5      85(5)     863(4)    83(9)     875()     878(3)
    4      869(7)    869(7)    86(3)     876(3)    876(6)

Table 2: Time cost (in seconds) of estimating the regularization parameters on the COIL-20 object database.

    dim    RDA       LOOC      KLIM      KLIM_L
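As a rough illustration of the experimental pipeline described above (PCA reduction followed by a regularized Gaussian classifier), the sketch below wires together the helpers from the previous sections (klim_l_gamma and classify). The data loading, the random split into 6 training images per class, and the COIL-20 specifics are omitted, and all names are illustrative rather than the authors' code.

    import numpy as np

    def pca_project(X, n_dims):
        # Reduce dimensionality with PCA: keep the top n_dims principal components.
        mean = X.mean(axis=0)
        _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
        components = Vt[:n_dims]
        return (X - mean) @ components.T, mean, components

    def train_regularized_gaussian(X, y):
        # Per-class means and KLIM_L-regularized covariances (Eqs. (9), (12), (22)).
        classes = np.unique(y)
        gamma = klim_l_gamma(np.cov(X.T))          # pooled covariance -> Eq. (22)
        priors, means, covs = [], [], []
        for c in classes:
            Xc = X[y == c]
            priors.append(len(Xc) / len(X))
            means.append(Xc.mean(axis=0))
            covs.append(np.cov(Xc.T, bias=True) + np.diag(gamma))
        return classes, priors, means, covs

    # Prediction for one reduced test vector x_test reuses the earlier classifier:
    # label = classes[classify(x_test, means, covs, priors)]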

5 CONCLUSIONS

In this paper, the KLIM_L covariance matrix estimator is derived based on the MDL principle for the small sample problem with high dimension. Under the framework of the MDL principle, the multi-regularization parameters are estimated simply and directly by point estimation, which is approximated by a second-order Taylor expansion. KLIM_L is a generalization of KLIM. With the KL information measure, all samples can be used to estimate the regularization parameters in KLIM_L, making it less computation-expensive than the leave-one-out cross-validation method used in RDA and LOOC. The KLIM_L estimator achieves higher classification accuracy than the LDA, RDA, LOOC and KLIM estimators on the COIL-20 data set. In future work, the kernel method combined with these regularization discriminant methods will be studied for the small sample problem with high dimension, and the selection of the kernel parameters will be investigated under some criterion.

ACKNOWLEDGEMENTS

The research work described in this paper was fully supported by grants from the National Natural Science Foundation of China (Project Nos. 90820010, 60911130513). Prof. Guo is the author to whom correspondence should be addressed; his e-mail address is pguo@ieee.org.

REFERENCES

Bishop, C. M., 2007. Pattern Recognition and Machine Learning, Springer-Verlag New York, Inc., Secaucus, NJ, USA.
Everitt, B. S., Hand, D. J., 1981. Finite Mixture Distributions, Chapman and Hall, London.
Friedman, J. H., 1989. Regularized discriminant analysis, Journal of the American Statistical Association, vol. 84, no. 405, 165-175.
Hoffbeck, J. P. and Landgrebe, D. A., 1996. Covariance matrix estimation and classification with limited training data, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 18, no. 7, 763-767.
Schafer, J. and Strimmer, K., 2005. A shrinkage approach to large-scale covariance matrix estimation and implications for functional genomics, Statistical Applications in Genetics and Molecular Biology, vol. 4, no. 1.
Srivastava, S., Gupta, M. R., Frigyik, B. A., 2007. Bayesian quadratic discriminant analysis, Journal of Machine Learning Research 8, 1277-1305.
Bickel, P. J. and Levina, E., 2008. Regularized estimation of large covariance matrices, Annals of Statistics, vol. 36, no. 1, 199-227.
Friedman, J., Hastie, T., and Tibshirani, R., 2008. Sparse inverse covariance estimation with the graphical lasso, Biostatistics, vol. 9, no. 3, 432-441.
Cao, G., Bachega, L. R., Bouman, C. A., 2011. The sparse matrix transform for covariance estimation and analysis of high dimensional signals, IEEE Transactions on Image Processing, vol. 20, no. 3, 625-640.
Rivals, I., Personnaz, L., 1999. On cross validation for model selection, Neural Computation, vol. 11, 863-870.
Guo, P., Jia, Y., and Lyu, M. R., 2008. A study of regularized Gaussian classifier in high-dimension small sample set case based on MDL principle with application to spectrum recognition, Pattern Recognition, vol. 41, 2842-2854.
Redner, R. A., Walker, H. F., 1984. Mixture densities, maximum likelihood and the EM algorithm, SIAM Review 26(2), 195-239.
Aeberhard, S., Coomans, D., de Vel, O., 1994. Comparative analysis of statistical pattern recognition methods in high dimensional settings, Pattern Recognition 27(8), 1065-1077.
Rissanen, J., 1978. Modeling by shortest data description, Automatica 14, 465-471.
Barron, A., Rissanen, J., Yu, B., 1998. The minimum description length principle in coding and modeling, IEEE Transactions on Information Theory 44(6), 2743-2760.
Kullback, S., 1959. Information Theory and Statistics, Wiley, New York.
Nene, S. A., Nayar, S. K. and Murase, H., 1996. Columbia Object Image Library (COIL-20), Technical Report CUCS-005-96.
