Out of sample extensions of PCA, kernel PCA, and MDS

Math 285 Project, Fall 2015
Wilson A. Florero-Salinas, Dan Li

TABLE OF CONTENTS

1. Introduction
2. Principal Component Analysis (PCA)
2.2 The out of sample extension of PCA
2.3 The out of sample extension of PCA DEMO
3. Kernel PCA
3.2 The out of sample extension of Kernel PCA
3.3 The out of sample extension of KPCA (DEMO)
4. Multidimensional Scaling (MDS)
4.2 The out of sample extension of MDS
4.3 The out of sample extension of MDS (DEMO)
4.4 The out of sample extension of MDS (DEMO)
5. Conclusion
6. Appendix
7. References

1. INTRODUCTION

Classification is the problem of categorizing a new observation based on a set of observations, called the training set, whose membership is already known. Classification problems usually involve training a model on the training set and later using it to make predictions or to classify new observations into one of the known categories. In recent years the collection of huge amounts of data has been eased by improvements in technology, and it is now common to have observations with thousands, if not millions, of features. Even though there has been a great jump in technology, many modern computers are still not able to efficiently handle observations with a very large number of features, which in some cases makes model training unfeasible. However, in many cases it is still possible to train a model with a subset of the features, or with a transformation of the feature space to a smaller space in which feature selection is possible. This leads us to the idea of dimensionality reduction.

Dimensionality reduction (DR) is the process of reducing the number of variables under consideration for the purpose of feature selection or feature extraction. To this end, dimensionality reduction allows the modeler to train models using fewer variables and, in some cases, to obtain a visualization of the data set in two or three dimensions. Three common DR techniques in the literature are Principal Component Analysis (PCA), Kernel PCA, and Classical Multidimensional Scaling (MDS). To perform the corresponding transformation, each of these methods uses the entire data set. The question is now: if new data become available, how can these new observations be incorporated into the new feature space? In some cases redoing the DR is enough, but that is not our present concern. In other cases the data set may be so large that retraining is no longer feasible. In this context, we need a way to incorporate these new observations into the new feature space without retraining and, if possible, recycling information already obtained from the first time we performed dimensionality reduction. This idea of bringing new observations into the new feature space is known in the literature as the out-of-sample extension, which will be the focus of this paper. In the following we briefly review PCA, kernel PCA, and MDS before considering their corresponding out-of-sample extensions.

2. PRINCIPAL COMPONENT ANALYSIS (PCA)

Principal Component Analysis (PCA) is a linear DR feature extraction tool. PCA attempts to find a linear subspace of lower dimension than the original feature space in which the new features have the largest variance [B2006]. One way to derive the principal components of a data set $\{x_i\}_{i=1}^n \subset \mathbb{R}^d$ is by maximizing the trace of the covariance matrix of the transformed points $\{y_i\}$, given by

$$S_Y = \frac{1}{n}\sum_{i=1}^n (y_i - \bar y)(y_i - \bar y)^T, \qquad \bar y = \frac{1}{n}\sum_{i=1}^n y_i,$$

where we assume that there is a $V$ such that $y_i = V^T x_i$. An equivalent way to derive the principal components is to find the best $k$-dimensional subspace (a line when $k = 1$) that minimizes the orthogonal distances from the data to the subspace. We take this approach here. Concretely, given $\{x_i\}_{i=1}^n \subset \mathbb{R}^d$, the task is to solve [S2003]

$$\min_{S} \sum_{i=1}^n \| x_i - P_S(x_i) \|^2,$$

where $P_S(\cdot)$ is the projection onto the subspace $S$. Let $m \in \mathbb{R}^d$ represent a fixed point and $B$ an orthonormal basis of $S$. If $x = m + B\beta$ is a parametric equation for the plane, then $P_S(x_i) = m + BB^T(x_i - m)$. It can be shown that the above minimization problem is equivalent to solving

$$\min_{B} \| \tilde X - \tilde X B B^T \|_F^2,$$

where $\|\cdot\|_F$ is the Frobenius norm, $m = \bar x = \frac{1}{n}\sum_i x_i$, and $\tilde X$ is the centered data matrix whose $i$-th row is $(x_i - \bar x)^T$. The reader is referred to the references for additional details.
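The equivalence claimed above follows from a short computation; here is a sketch of that step, using the choice $m = \bar x$ (the centroid) and the fact that $P_S(x_i) = m + BB^T(x_i - m)$:

$$\sum_{i=1}^n \| x_i - P_S(x_i) \|^2 = \sum_{i=1}^n \big\| (I - BB^T)(x_i - \bar x) \big\|^2 = \| \tilde X - \tilde X B B^T \|_F^2,$$

since the $i$-th row of $\tilde X - \tilde X BB^T$ is exactly $\big[(I - BB^T)(x_i - \bar x)\big]^T$. Expanding the middle expression as $\|x_i - \bar x\|^2 - \|B^T(x_i - \bar x)\|^2$ also shows that minimizing the orthogonal distances is the same as maximizing the variance captured by the projection.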

This minimum is achieved when $B = V_k$, where the columns of $V_k$ are the first $k$ right singular vectors of $\tilde X$ (footnote 1), and the best approximation is $\tilde X B B^T = \tilde X V_k V_k^T$. The above can be summarized in the following theorem.

Theorem: The projection of $\tilde X$ onto the best-fit $k$-plane is $\tilde X V_k V_k^T = U_k \Sigma_k V_k^T$, i.e., the best rank-$k$ approximation to $\tilde X$ under the SVD $\tilde X = U \Sigma V^T$. In other words, the new coordinates with respect to the basis $V_k$ are $\tilde X V_k = U_k \Sigma_k$; the rows of $U_k \Sigma_k$ are called the principal components.

This theorem allows us to easily find the first $k$ principal components of $X$ using the following algorithm.

Algorithm 1a: Principal Component Analysis (PCA)
Input: data set $X = [x_1, x_2, \dots, x_n]^T$, number of components $k$
Output: top $k$ principal components
1. Center the data: $\tilde x_i = x_i - \bar x$ for all $i$
2. Perform SVD on $\tilde X$: $\tilde X = U \Sigma V^T$
3. Return the rows of $\tilde X V_k = U_k \Sigma_k$

2.2 THE OUT OF SAMPLE EXTENSION OF PCA

In the previous section we obtained the matrix $V_k$, which mapped points $x_i \in \mathbb{R}^d$ to points $y_i \in \mathbb{R}^k$ via a linear map. If the points are centered to begin with, then $P_S(x_i) = V_k V_k^T x_i$, or in matrix form $P_S(X) = X V_k V_k^T$. To obtain the points in the subspace $S$ of dimension $k$, we simply consider $X V_k$, where the subspace $S$ was constructed using the data set $X = [x_1, x_2, \dots, x_n]^T$. If a new data set becomes available, how can PCA be extended to this new data set? We illustrate this extension with the figures in Section 2.3. Assume we have a data set. According to Algorithm 1a, we center the data; then, using this centered data, we construct the line $S$ and map the points to the line. If a new data set $Z$ becomes available, we can map it to the line using the matrix $V_k$. However, because the original data set has been centered, the data points and the line (subspace) live in a new set of axes. To bring the new data set $Z$ to the current set of axes, it must be centered exactly the same way $X$ was centered, that is, with the mean of the original data. Finally, we may project the centered data set $\tilde Z$ via the matrix $V_k$ [G1966]. This is summarized in Algorithm 1b, with a visualization in Section 2.3.

Footnote 1: In this paper, given a matrix $A$, we write $A_k$ for the submatrix consisting of only its first $k$ columns; we also use subscripts in this way to indicate dimensions.

Algorithm 1b: Out of sample extension (PCA)
Input: new data set $Z = [z_1, z_2, \dots, z_m]^T$, projection matrix $V_k$, training mean $\bar x$
Output: top $k$ principal components of the new data
1. Center the data: $\tilde z_i = z_i - \bar x$ for all $i$
2. Return the rows of $\tilde Z V_k$

Both Algorithms 1a and 1b, written as a function in MATLAB, can be found in Appendix 6.1.

2.3 THE OUT OF SAMPLE EXTENSION OF PCA DEMO

Figure 1: (L) original data set; (M) centered data along with the best-fit line; (R) current projection space and new, uncentered data (red points).

Figure 2: (L) new data points brought to the current axes by centering; (R) new points projected onto the current space. It is worth comparing the out-of-sample extension with a retrained model that uses the entire data set; notice the difference.
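As a quick illustration of Algorithms 1a and 1b in the spirit of Figures 1 and 2, the following is a minimal MATLAB sketch that calls the PCA function of Appendix 6.1 on synthetic two-dimensional data; the data here is made up for illustration and is not the data set used to produce the figures.

% Synthetic training data: noisy points along a line in R^2 (hypothetical data).
rng(1);
n   = 200;
t   = randn(n,1);
Xtr = [t, 0.5*t] + 0.1*randn(n,2);

% New (out-of-sample) observations drawn from the same model.
m    = 20;
s    = randn(m,1);
Xtst = [s, 0.5*s] + 0.1*randn(m,2);

% Algorithm 1a on the training data and Algorithm 1b on the new data,
% using the function of Appendix 6.1 (k = 1 principal component).
k = 1;
[XtrPCA, V, XtstPCA] = PCA(Xtr, Xtst, k);

% Note: inside PCA the new data is centered with the TRAINING mean,
% exactly as required in Section 2.2, before projecting with V.
plot(XtrPCA, zeros(n,1), 'b.', XtstPCA, zeros(m,1), 'ro');
legend('training projections', 'out-of-sample projections');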

3. KERNEL PCA

PCA is a linear method that cannot properly handle nonlinear data. If the data is nonlinear, the main idea of Kernel PCA is to use a map $\phi(\cdot)$ that takes each data vector $x_i$ to a vector $\phi(x_i)$ in a higher dimensional space (called the feature space) where PCA can be applied [W2012]. Concretely, let data points $x_i \in \mathbb{R}^d$ be given, and suppose $\phi : \mathbb{R}^d \to \mathbb{R}^D$, where $D \gg d$. Assume further that $\sum_{i=1}^n \phi(x_i) = 0$, meaning that the feature vectors have zero mean. Define $\Phi := [\phi(x_1), \phi(x_2), \dots, \phi(x_n)]^T \in \mathbb{R}^{n \times D}$, and consider the SVD $\Phi = U \Sigma V^T$. Then, applying PCA to $\Phi$ via Algorithm 1a, the new coordinates with respect to the basis $V_k$ are given by the rows of $\Phi V_k = U_k \Sigma_k$. Usually the $\phi(x_i)$ are unknown and it is not possible to work out the decomposition explicitly. To remediate this, define $\kappa(x_i, x_j) := \phi(x_i)^T \phi(x_j)$ and consider the matrix

$$K := \Phi \Phi^T, \qquad K_{ij} = \phi(x_i)^T \phi(x_j) = \kappa(x_i, x_j).$$

The matrix $K$ is called the kernel matrix, which under a proper mapping $\phi$ is positive semi-definite. If the data is not centered in the feature space, it can be shown that by considering $\tilde\phi(x_i) = \phi(x_i) - \frac{1}{n}\sum_{j=1}^n \phi(x_j)$ we can obtain similar equations by replacing $K$ with (footnote 3)

$$\tilde K = K - \mathbf{1}_n K - K \mathbf{1}_n + \mathbf{1}_n K \mathbf{1}_n,$$

for which the previous formulas, or the formulas that follow, still hold (footnote 4) [S1998]. Proceeding with our analysis, we have

$$K = \Phi\Phi^T = U \Sigma V^T V \Sigma U^T = U \Sigma^2 U^T, \quad\text{so that}\quad K U = U \Sigma^2,$$

which is an eigenvalue problem. In other words, by solving for the eigenvalues and eigenvectors of $K$ we are able to obtain the matrices needed in the SVD of $\Phi$. Note that if we consider $\Phi^T \Phi V = V \Sigma^2$ we obtain $V = \Phi^T U \Sigma^{-1}$. For the purpose of principal component extraction we want the feature-space directions $v_i$ to satisfy $v_i^T v_i = 1$; writing $v_i = \Phi^T \alpha_i$, this leads to $\alpha_i^T \alpha_i = 1/\lambda_i(K)$ for $i = 1, \dots, k$, i.e., the unit eigenvectors $u_i$ of $K$ must be scaled by $1/\sqrt{\lambda_i}$. The projections of the training points are then $K\alpha_i = \sqrt{\lambda_i}\, u_i$, which is equivalent to returning the rows of $U_k \Lambda_k^{1/2}$. KPCA is summarized in the following Algorithm 2a.

Algorithm 2a: Kernel PCA
Input: data set $X = [x_1, x_2, \dots, x_n]^T$, number of components $k$
Output: top $k$ principal components
1. Construct the kernel matrix $K_{ij} = \kappa(x_i, x_j)$
2. Center $K$ via $\tilde K = K - \mathbf{1}_n K - K \mathbf{1}_n + \mathbf{1}_n K \mathbf{1}_n$
3. Solve the eigenvalue problem $\tilde K U = U \Lambda$
4. Return the rows of $U_k \Lambda_k^{1/2}$, where each eigenvector $u_i$ has been scaled by $1/\sqrt{\lambda_i}$ because we want $v_i^T v_i = 1$

Footnote 3: Here $\mathbf{1}_n$ denotes the $n \times n$ matrix with every entry equal to $1/n$.
Footnote 4: Place a tilde on all variables, and the results are similar.
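The claim that the eigen-decomposition of $K$ recovers the quantities needed from the SVD of $\Phi$ is easy to check numerically in the one case where $\Phi$ is known, namely the linear kernel $\phi(x) = x$. The following MATLAB sketch (a check added here for illustration, not part of the original project code) verifies on random data that the rows of $U_k\Lambda_k^{1/2}$ computed from $K = \tilde X \tilde X^T$ match the ordinary PCA scores $\tilde X V_k$ up to sign.

% Sanity check: kernel PCA with the linear kernel reproduces ordinary PCA.
rng(0);
n = 50; d = 5; k = 2;
X  = randn(n,d);
Xc = X - repmat(mean(X), n, 1);              % centered data (tilde X)

% Ordinary PCA scores via the SVD (Algorithm 1a).
[U1, S1, V1] = svds(Xc, k);
scoresPCA = Xc*V1;                           % = U1*S1

% Kernel PCA with the linear kernel: K = Xc*Xc' is already centered.
K = Xc*Xc';
[U2, L2] = eig(K, 'vector');
[L2, idx] = sort(L2, 'descend');
U2 = U2(:, idx);
scoresKPCA = U2(:,1:k)*diag(sqrt(L2(1:k)));  % rows of U_k * Lambda_k^(1/2)

% Columns agree up to an arbitrary sign flip per component.
err = min(abs(scoresKPCA - scoresPCA), abs(scoresKPCA + scoresPCA));
disp(max(err(:)))                            % should be close to machine precision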

3.2 THE OUT OF SAMPLE EXTENSION OF KERNEL PCA

To obtain an out-of-sample extension we proceed as before: center the new data with respect to the training set, and then apply the projection matrix. Here the transformation $V$ is explicitly unknown, but we can use kernels to our advantage as in the previous section. Assume that $Z = [z_1, z_2, \dots, z_m]^T$ is the new data set and define $\Phi_Z := [\phi(z_1), \phi(z_2), \dots, \phi(z_m)]^T \in \mathbb{R}^{m \times D}$. This set may need centering, so let $\tilde\phi(z_i) = \phi(z_i) - \frac{1}{n}\sum_{j=1}^n \phi(x_j)$ and collect these vectors into the matrix $\tilde\Phi_Z := [\tilde\phi(z_1), \dots, \tilde\phi(z_m)]^T$. Applying $V$ gives

$$\tilde\Phi_Z V = \tilde\Phi_Z \Phi^T U \Sigma^{-1},$$

where we are assuming that $\Phi$ contains centered data. Note that $\tilde K_Z := \tilde\Phi_Z \Phi^T$ was constructed using centered data in the feature space, which is explicitly unknown; however, letting $K_Z$ be the matrix with entries $(K_Z)_{ij} = \phi(z_i)^T \phi(x_j) = \kappa(z_i, x_j)$, it is possible to write this kernel in terms of the original kernels as (footnote 5)

$$\tilde K_Z = K_Z - \mathbf{1}_m K - K_Z \mathbf{1}_n + \mathbf{1}_m K \mathbf{1}_n.$$

The out of sample KPCA is summarized in Algorithm 2b.

Algorithm 2b: out of sample Kernel PCA
Input: new data set $Z = [z_1, z_2, \dots, z_m]^T$, training kernel matrix $K$, eigenvectors $U$ and eigenvalues $\Lambda$ of $\tilde K$
Output: top $k$ principal components for the new data
1. Construct the kernel matrix $(K_Z)_{ij} = \kappa(z_i, x_j)$
2. Center $K_Z$ via $\tilde K_Z = K_Z - \mathbf{1}_m K - K_Z \mathbf{1}_n + \mathbf{1}_m K \mathbf{1}_n$
3. Return the rows of $\tilde K_Z U_k \Lambda_k^{-1/2}$

Both Algorithms 2a and 2b, written as a function in MATLAB, can be found in Appendix 6.2.

Footnote 5: Here $\mathbf{1}_m$ is the $m \times n$ matrix with every entry equal to $1/n$ [S1998].
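To make the centering and projection steps of Algorithm 2b concrete, here is a minimal MATLAB usage sketch of the KPCA function from Appendix 6.2 on synthetic data consisting of two noisy rings; the ring data, the kernel width, and the variable names outside the function are illustrative assumptions, not the data behind Figures 3 and 4.

% Two concentric noisy rings in R^2 (hypothetical data), plus held-out points.
rng(2);
n1 = 100; n2 = 100;
a1 = 2*pi*rand(n1,1);  a2 = 2*pi*rand(n2,1);
Xtr  = [[cos(a1), sin(a1)] + 0.05*randn(n1,2);        % inner ring, radius 1
        3*[cos(a2), sin(a2)] + 0.05*randn(n2,2)];     % outer ring, radius 3
b    = 2*pi*rand(20,1);
Xtst = [[cos(b(1:10)), sin(b(1:10))]; 3*[cos(b(11:20)), sin(b(11:20))]];

% Gaussian-kernel KPCA (Algorithm 2a) and its out-of-sample extension
% (Algorithm 2b), both performed inside the Appendix 6.2 function.
k   = 2;          % number of components
var = 1;          % sigma^2 for the Gaussian kernel
[XtrKPCA, XtstKPCA] = KPCA(Xtr, Xtst, k, var);

% The out-of-sample points should fall near the cluster of their own ring.
plot(XtrKPCA(:,1), XtrKPCA(:,2), 'b.', XtstKPCA(:,1), XtstKPCA(:,2), 'go');
legend('training', 'out of sample');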

3.3 THE OUT OF SAMPLE EXTENSION OF KPCA (DEMO)

Figure 3. Left: data set in its original dimensions; green points correspond to out-of-sample observations. Middle & Right: out-of-sample extensions; the green observations are mapped correctly to their corresponding clusters.

Figure 4: A 2D perspective of the above plot.

4. MULTIDIMENSIONAL SCALING (MDS)

Multidimensional scaling (MDS) visualizes a set of high dimensional points in lower dimensions (usually two or three) based on their pairwise distances [Y1985]. The problem MDS solves is to map the original data to lower dimensions while preserving the pairwise distances. In other words, two points that have a large distance remain far apart in the reduced dimension, and points that are close by shall be mapped close to each other. Mathematically: given a set of points and their pairwise distances $d_{ij}$, find points $\{y_i\}$ such that $\sum_{i,j} \big( \|y_i - y_j\| - d_{ij} \big)^2$ is minimized. To solve this problem consider the proximity matrix $D = [d_{ij}^2]$ of squared distances. The proximity matrix is invariant to changes in location and rotation, and we can obtain a unique solution provided that we assume $\sum_i y_i = 0$. If we consider the equality $\|y_i - y_j\| = d_{ij}$, it can be shown that $\tilde D = Y Y^T$, where $Y = [y_1, y_2, \dots, y_n]^T$ and $\tilde D$ is the centered proximity matrix (footnote 6). To explicitly find the $y_i$ we use the fact that $\tilde D$ is unitarily diagonalizable, $\tilde D = U \Lambda U^T$, and obtain $Y = U_k \Lambda_k^{1/2}$.

Footnote 6: Some properties of $\tilde D$ include: (1) $\tilde D = \tilde D^T$, (2) $\tilde D = [\, y_i^T y_j \,]$, and (3) $\tilde D \mathbf{1} = 0$ and $\tilde D \succeq 0$.

The above approach has been seen before: from data points $\{x_i\}$, create a neighborhood or similarity matrix, center this matrix if needed, and then solve an eigenvalue problem [B2003]. Concretely, if we let $S_i = \sum_j D_{ij}$ be the $i$-th row sum of $D$, then centering is done via

$$\tilde D_{ij} = -\frac{1}{2}\left( D_{ij} - \frac{1}{n} S_i - \frac{1}{n} S_j + \frac{1}{n^2} \sum_{l} S_l \right),$$

and the embedding corresponds to the rows of $Y = U_k \Lambda_k^{1/2}$ [B2003]. This is summarized in Algorithm 3a.

Algorithm 3a: MDS
Input: data set $X = [x_1, x_2, \dots, x_n]^T$ (or skip to step 2 if only distances are given)
Output: embedding $Y = [y_1, y_2, \dots, y_n]^T$
1. Construct the pairwise squared distance matrix $D = [d_{ij}^2]$
2. Center $D$ via $\tilde D_{ij} = -\frac{1}{2}\big( D_{ij} - \frac{1}{n} S_i - \frac{1}{n} S_j + \frac{1}{n^2}\sum_{l} S_l \big)$
3. Diagonalize $\tilde D = U \Lambda U^T$
4. Return the rows of $U_k \Lambda_k^{1/2}$

4.2 THE OUT OF SAMPLE EXTENSION OF MDS

One of the facts to consider in the out of sample extension of MDS is that simultaneously embedding $m$ new objects is not equivalent to individually embedding the same objects one at a time, as individual embeddings do not attempt to approximate the dissimilarities between pairs of new objects [P2008]. For simplicity let $m = 1$; the extension for $m > 1$ is straightforward. Suppose that $d$ denotes the vector of squared dissimilarities of the new object $x$ from the $n$ original objects; in other words, $d = [d_1^2, d_2^2, \dots, d_n^2]^T$, where $d_i = \|x - x_i\|$. Construct the new matrix

$$A = \begin{bmatrix} D & d \\ d^T & 0 \end{bmatrix}.$$

Before we formalize the out of sample extension, let us introduce notation and a few definitions.

Definition: Given $w \in \mathbb{R}^{n+1}$, we say that $y_1, \dots, y_{n+1}$ is $w$-centered if and only if $\sum_{j=1}^{n+1} w_j y_j = 0$.

Definition: For $w \in \mathbb{R}^{n+1}$ such that $w^T e \neq 0$, where $e$ is the vector of all ones, define

$$\tau_w(C) = -\frac{1}{2}\left( I - \frac{e\, w^T}{w^T e} \right) C \left( I - \frac{w\, e^T}{w^T e} \right).$$

Notice that $w = e$ gives the same centering that was used on $D$ in Algorithm 3a, i.e., $\tau_e(D) = \tilde D$. Moreover, under appropriate choices of $w$, the matrix $\tau_w(C)$ has special properties that allow it to be factored for the purpose of MDS [P2008], exactly as we did with the matrix $\tilde D$. The usefulness of $\tau_w(C)$ will be shown in the discussion that follows.

Let us now revisit the question of why applying MDS directly to $A$ does not solve the out-of-sample extension problem. The reason is that applying MDS to $A$ entails approximating the inner products $\tau_e(A)$, which are computed with respect to the centroid of all $n + 1$ points. On the other hand, applying MDS to $D$ entails approximating the inner products $\tau_e(D)$, which are computed with respect to the centroid of the $n$ original points [P2008]. Thus, to solve the out-of-sample problem we must preserve the original centering. To do this, let $w = [e^T\ 0]^T$ and construct

$$\tau_w(A) =: \bar B = \begin{bmatrix} B & b \\ b^T & \beta \end{bmatrix}, \qquad B = \tau_e(D).$$

The goal is to find a $y_x \in \mathbb{R}^k$ that corresponds to the out of sample extension. Let $Y = [y_1, y_2, \dots, y_n]^T$ represent the embedding of the original points and construct the new matrix $Y_* = [Y^T\ y_x]^T$. Then the problem reduces to approximating $\bar B$ by $Y_* Y_*^T$:

$$\min_{y_x} \left\| \bar B - Y_* Y_*^T \right\|_F^2 = \left\| B - Y Y^T \right\|_F^2 + 2\,\| b - Y y_x \|^2 + (\beta - y_x^T y_x)^2.$$

If the term $(\beta - y_x^T y_x)^2$ is dropped, the objective becomes convex with solution $y_x = (Y^T Y)^{-1} Y^T b$. There is a strong relationship between PCA and CMDS [G1966]: when the dissimilarities are Euclidean it can be shown that $b_i = (x_i - \bar x)^T(x - \bar x)$, i.e., $b = \tilde X (x - \bar x)$, so that

$$y_x = \Lambda_k^{-1/2} U_k^T b = V_k^T (x - \bar x)$$

represents an approximation (footnote 7) of an out-of-sample extension corresponding to a new and unknown data point $x$; moreover, if $Y$ has full rank, then $y_x = (Y^T Y)^{-1} Y^T b$ is exactly this solution [T2010]. In other words, to obtain an out-of-sample extension for CMDS we would apparently first have to compute PCA (footnote 8) on the original data set $X$, which is unknown by assumption, and thus this is not directly useful. To overcome this difficulty, we can explicitly compute $b$ and write it in terms of known values. First, by definition,

$$\tau_w(A) = -\frac{1}{2}\left( I - \frac{e\, w^T}{n} \right) A \left( I - \frac{w\, e^T}{n} \right), \qquad w = [e^T\ 0]^T.$$

Carrying out the block multiplication and keeping only the last column (or row) of $\bar B$, since that is the only part we are interested in, we have

$$b = -\frac{1}{2}\left( d - \frac{1}{n} D e - \bar d\, e + \frac{1}{n^2} (e^T D e)\, e \right).$$

Footnote 7: To obtain the optimal solution, the full nonlinear optimization problem must be solved numerically.
Footnote 8: MDS and PCA both attempt to find the most accurate data representation in a lower dimensional space. MDS preserves the similarities of the original data set, while PCA reduces the dimension by preserving most of the covariance of the data.

The out-of-sample extension for the case $m = 1$ is summarized in Algorithm 3b, and its corresponding implementation for the approximate case is found in Appendix 6.3.

Algorithm 3b: out of sample MDS (m = 1)
Input: $U_k$, $\Lambda_k^{1/2}$, and $D$ from MDS, and $d$, the squared dissimilarities between the new object and the $n$ original objects
Output: out of sample embedding $y_x$
1. Construct the vector $b = -\frac{1}{2}\big( d - \bar d\, e - \frac{1}{n} D e + \frac{1}{n^2} (e^T D e)\, e \big)$ (footnote 9)
2. For the approximate out-of-sample extension, return the (column) vector $y_x = \Lambda_k^{-1/2} U_k^T b$
3. For the optimal out-of-sample extension, construct $Y_* = [Y^T\ y_x]^T$ and return the $y_x$ that minimizes $\big\| \bar B - Y_* Y_*^T \big\|_F^2$, where $\bar B = \tau_w(A)$

Notice that the objective function for the optimal solution is a fourth degree polynomial in $y_x$, which can be minimized using gradient-based numerical methods [P2008].

Footnote 9: Here $\bar d = \frac{1}{n}\sum_j d_j$.

4.3 THE OUT OF SAMPLE EXTENSION OF MDS (DEMO)

Figure 5: Map of Chinese cities based on their pairwise distances. Top left: MDS applied to the entire data set except Lhasa. Bottom left: out-of-sample extension for Lhasa. Top right and bottom right: the same analysis and comparison done for Hohhot.
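Below is a minimal MATLAB sketch of Algorithm 3b, covering both the approximate step 2 and the optimal step 3. It assumes D is the n-by-n squared-dissimilarity matrix of the original objects, d is the column vector of squared dissimilarities to the new object, and U, L come from diagonalizing the centered matrix in Algorithm 3a; fminsearch is used here as a convenient derivative-free stand-in for the gradient methods mentioned above, and the variable names differ slightly from the Appendix 6.3 function.

% Inputs (assumed available): D (n x n squared dissimilarities),
% d (n x 1 squared dissimilarities to the new object),
% U, L (eigenvectors / eigenvalue matrix of the centered matrix), k (target dim).
n    = size(D,1);
Y    = U(:,1:k)*sqrt(L(1:k,1:k));                % original embedding (Algorithm 3a)

% Step 1: the vector b written in terms of known quantities.
b    = -0.5*( d - mean(d) - D*ones(n,1)/n + sum(D(:))/n^2 );

% Step 2: approximate out-of-sample solution y_x = L_k^(-1/2) * U_k' * b.
y0   = sqrt(L(1:k,1:k)) \ (U(:,1:k)'*b);

% Step 3: optimal solution, minimizing 2*||b - Y*y||^2 + (beta - y'*y)^2,
% where beta is the last diagonal entry of tau_w(A).
beta = mean(d) - 0.5*sum(D(:))/n^2;
f    = @(y) 2*norm(b - Y*y)^2 + (beta - y'*y)^2;
yopt = fminsearch(f, y0);                        % warm-started at the approximate solution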

Figure 6: MDS on the seeds data set (footnote 10). Left: MDS applied to the entire data set; red crosses correspond to points that will be traced in the out-of-sample extension. Right: MDS on a subset of the data set; green points correspond to the out-of-sample extensions.

4.4 THE OUT OF SAMPLE EXTENSION OF MDS (DEMO)

In Section 4.2 we provided the out-of-sample algorithm for MDS for the case $m = 1$. For the case $m > 1$, analogously define $A \in \mathbb{R}^{(n+m)\times(n+m)}$ to be the matrix of squared dissimilarities between all $n + m$ objects. As before, let $Y = [y_1, y_2, \dots, y_n]^T$ represent the MDS embedding of the $n$ original points, and let $Z = [z_1, z_2, \dots, z_m]^T$ denote the (unknown) $k$-dimensional embedding of the $m$ new objects. To find the out-of-sample extension for these $m$ new objects, let $w = [e^T\ 0 \cdots 0]^T$ and construct the matrix

$$\tau_w(A) =: \begin{bmatrix} B & B_{YZ} \\ B_{YZ}^T & B_{ZZ} \end{bmatrix} \qquad \text{[P2008]},$$

where $B = \tau_e(D)$. The optimal $k$-dimensional embedding of the $m$ new objects is then obtained by solving the optimization problem

$$\min_{Z \in \mathbb{R}^{m \times k}} \left\| \begin{bmatrix} B & B_{YZ} \\ B_{YZ}^T & B_{ZZ} \end{bmatrix} - \begin{bmatrix} Y \\ Z \end{bmatrix} \begin{bmatrix} Y \\ Z \end{bmatrix}^T \right\|_F^2.$$

5. CONCLUSION

In this paper we discussed the out-of-sample extensions for PCA, Kernel PCA, and MDS. All three extensions share a centering step, which can be done directly in the PCA case, and which must be applied to the kernel matrix or to the dissimilarity matrix for Kernel PCA and MDS, respectively. To find the out-of-sample extension for PCA, centering and mapping to the best-fit line is enough. As for Kernel PCA, an additional kernel matrix between the training and the new data needs to be created, and centering is again important before making transformations using the previously built matrices. For the third method, MDS, the construction of matrices involving the training and new data sets, together with centering, is also required. Unlike the previous two methods, an exact analytic solution cannot be found; rather, an approximate solution (based on PCA) or an optimal solution (by solving an optimization problem) can be computed. We have also provided some examples in which we show that out-of-sample extensions are not equivalent to an embedding or transformation of the entire data set, especially in methods that are sensitive to outliers.

Footnote 10: The seeds data set can be downloaded at:

6. APPENDIX

Appendix 6.1

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%% Principal Component Analysis (PCA)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Xtr  = training data set. Each row is an observation.
% Xtst = new observations (out of sample)
% k    = # of components to keep.
function [XtrPCA, V, XtstPCA] = PCA(Xtr, Xtst, k)

meanXtr = mean(Xtr);

% center the data
X_tilde = Xtr - repmat(meanXtr, size(Xtr,1), 1);

[U, S, V] = svds(X_tilde, k);
XtrPCA = U*S;
V = V(:,1:k);                % projection matrix

% out of sample extension: center with the training mean, then project
XtstPCA = (Xtst - repmat(meanXtr, size(Xtst,1), 1))*V;

Appendix 6.2

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%% Kernel PCA
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Xtr  = data in which each row is an observation.
% Xtst = new observations (out of sample)
% k    = reduced dimension; var = sigma^2
% Here we use the Gaussian kernel. Code can easily be modified for
% other kernels.
function [XtrKPCA, XtstKPCA] = KPCA(Xtr, Xtst, k, var)

% Calculate pairwise distance matrices
Dtr = pdist2(Xtr, Xtr);

% Constructing kernel matrix using the Gaussian kernel
K = Kn(Dtr, var);

% centering
n = size(Xtr,1);
Kc = K - K*ones(n,n)/n - ones(n,n)*K/n + ones(n,n)*K*ones(n,n)/(n^2);

% Obtain the eigenvectors of K_tilde that correspond to the k largest
% eigenvalues. Those eigenvectors, once scaled, are the data points
% already projected onto the respective principal components.
[U, S] = eig(Kc, 'vector');
[~, idx] = sort(S, 'descend');
S = S(idx);
S = diag(S);
U = U(:, idx);
S = abs(S(1:k, 1:k));
XtrKPCA = U(:, 1:k)*sqrt(S);

% out of sample extension of KPCA
Dtst = pdist2(Xtst, Xtr);
Kz = Kn(Dtst, var);

% centering
m = size(Xtst,1);
Kzc = Kz - ones(m,n)*K/n - Kz*ones(n,n)/n + ones(m,n)*K*ones(n,n)/(n^2);

% XtstKPCA = Kzc*U(:,1:k)*inv(sqrt(S));
XtstKPCA = bsxfun(@rdivide, Kzc*XtrKPCA, diag(S)');
end

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Gram matrix function
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function K = Kn(D, var)
% k(xi,xj) = exp(-0.5*||xi - xj||^2/var)
K = exp(bsxfun(@rdivide, -0.5*D.^2, var));
end

Appendix 6.3

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%% Math 285 Project Function: MDS
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% INPUT: X = distance matrix, assumed to be symmetric.
% k = target dimension; Y = new (lower) coordinates in dimension k
% d = distances of the new point to the original points, as a row vector.
function [Y, y_x, stress] = mds(X, d, k)

n = size(X,1);
D = X.^2;
meanD = mean(D,1);
mmeanD = mean(meanD);

% D_tilde = 0.5*(-D + ones(n,n)*D/n + D*ones(n,n)/n - ones(n,n)*D*ones(n,n)/n^2)
D_tilde = 0.5*(repmat(meanD',1,n) + repmat(meanD,n,1) - D - mmeanD);

% Constructing matrix Y
[U, S] = eig(D_tilde, 'vector');
[~, idx] = sort(S, 'descend');
S = S(idx);
S = abs(diag(S));
U = U(:, idx);
Y = U(:,1:k)*sqrt(S(1:k,1:k));

% Computing stress:
% stress = sqrt(sum(sum(X,2))/(n^2*l_dotdot));
stress = sqrt(2*sum((squareform(X) - pdist(Y)).^2)/mmeanD)/n;

% Out-of-sample extension
d = d'.^2;
% b = d - ones(n,n)*d/n - D*ones(n,1)/n + ones(n,n)*D*ones(n,1)/n^2;
b = -0.5*(d - mean(d) - mean(D,2) + mmeanD);
y_x = b'*Y/S(1:k,1:k);     % = (sqrt(S(1:k,1:k))\(U(:,1:k)'*b))'; returned as a row vector

7. REFERENCES

[A2003] Anderson, M. J. and Robinson, J. Generalized discriminant analysis based on distances. Australian & New Zealand Journal of Statistics, 45:301-318, 2003.

[B1997] Borg, I. and Groenen, P. (1997). Modern Multidimensional Scaling: Theory and Applications. New York: Springer.

[B2003] Y. Bengio, J.-F. Paiement, and P. Vincent. Out-of-sample extensions for LLE, Isomap, MDS, eigenmaps, and spectral clustering. Technical Report 1238, Département d'Informatique et Recherche Opérationnelle, Université de Montréal, Montréal, Québec, Canada, July 2003.

[B2006] Bishop, Christopher M. Pattern Recognition and Machine Learning. Springer, Cambridge, U.K., 2006.

[G1966] J. C. Gower. Some distance properties of latent root and vector methods used in multivariate analysis. Biometrika, 53:325-338, 1966.

[S1998] Schölkopf, B., Smola, A., and Müller, K.-R. Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation, 1998.

[S1999] Schölkopf, B., Smola, A., and Müller, K.-R. (1999). Kernel principal component analysis. In B. Schölkopf, C. J. C. Burges, and A. J. Smola, editors, Advances in Kernel Methods: Support Vector Learning. MIT Press, Cambridge, MA.

[S2003] Shlens, J. A Tutorial on Principal Component Analysis: Derivation, Discussion, and Singular Value Decomposition.

[P2008] M. W. Trosset and C. E. Priebe. The out-of-sample problem for classical multidimensional scaling. Computational Statistics and Data Analysis, 52, June 2008.

[T2010] M. W. Trosset and M. Tang. The out-of-sample problem for classical multidimensional scaling: Addendum, November 2010.

[W2012] Q. Wang. Kernel Principal Component Analysis and its Applications in Face Recognition and Active Shape Models. CoRR, 2012.

[Y1985] Forrest W. Young, University of North Carolina. In Kotz-Johnson (Ed.), Encyclopedia of Statistical Sciences, Volume 5. John Wiley & Sons, Inc., 1985.
