Supplementary material for the paper Optimal Transport for Domain Adaptation
Nicolas Courty, Member, IEEE, Rémi Flamary, Devis Tuia, Senior Member, IEEE, Alain Rakotomamonjy, Member, IEEE

I. OPTIMAL TRANSPORT FOR AFFINE TRANSFORMATIONS

Our main hypothesis is that the transformation $T(\cdot)$ preserves the conditional distribution, $P_s(y|x^s) = P_t(y|T(x^s))$. Hence, if we are able to estimate a transformation $\hat T(\cdot)$ such that for all $x$, $\hat T(x) = T(x)$, we can compute the exact conditional probability distribution of any example $x$ in the target domain through $P_t(y|x) = P_s(y|T^{-1}(x))$, supposing that $T$ is invertible. We can remark that the ability of our approach to recover the conditional probability of an example essentially depends on how well the condition $\hat T(x) = T(x)$, $\forall x$, is respected. In the following, we prove that, for empirical distributions and under some mild conditions (positive definite $A$), the estimated transportation map $\hat T$ obtained by minimizing the $\ell_2$ ground metric does recover exactly the true transformation $T$. We want to show that, if $\mu_t(x) = \mu_s(Ax+b)$ where $A \in S_+$ and $b \in \mathbb{R}^d$, then the estimated optimal transportation plan gives $\hat T(x) = Ax + b$. This boils down to proving the following theorem.

Theorem 1.1: Let $\mu_s$ and $\mu_t$ be two discrete distributions each composed of $n$ Diracs as defined in Equation (1) in the paper. If the following conditions hold:
1) The source samples in $\mu_s$ are $x_i^s \in \mathbb{R}^d$, $i \in 1,\dots,n$, such that $x_i^s \neq x_j^s$ if $i \neq j$.
2) All weights in the source and target distributions are equal to $1/n$.
3) The target samples are defined as $x_i^t = A x_i^s + b$, i.e. an affine transformation of the source samples.
4) $b \in \mathbb{R}^d$ and $A \in S_+$ is a strictly positive definite matrix.
5) The cost function is $c(x_i^s, x_j^t) = \|x_i^s - x_j^t\|^2$.
then the solution $\hat T$ of the optimal transport problem gives $\hat T(x_i^s) = A x_i^s + b = x_i^t$, $\forall i \in 1,\dots,n$.

First, note that the OT problem (and in particular the sum and positivity constraints) forces the solution $\gamma$ to be a doubly stochastic matrix weighted by $1/n$. Since the objective function is linear, the solution will be a vertex of the set of doubly stochastic matrices, which is a permutation matrix as stated by the Birkhoff-von Neumann Theorem [1].
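This vertex property can be checked numerically. The following sketch (hypothetical test values; it assumes NumPy and SciPy are available) solves the discrete OT linear program with uniform marginals for a random cost matrix and verifies that the returned coupling is a permutation matrix weighted by $1/n$:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n = 5
C = rng.random((n, n))                      # generic cost matrix

# Equality constraints: every row and column of gamma sums to 1/n.
A_eq = np.zeros((2 * n, n * n))
for i in range(n):
    A_eq[i, i * n:(i + 1) * n] = 1.0        # row-sum constraints
    A_eq[n + i, i::n] = 1.0                 # column-sum constraints
b_eq = np.full(2 * n, 1.0 / n)

res = linprog(C.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
gamma = res.x.reshape(n, n)

# Birkhoff-von Neumann: the optimum sits at a vertex, i.e. a permutation
# matrix weighted by 1/n -- every entry is (numerically) either 0 or 1/n.
is_vertex = np.all((gamma < 1e-8) | (np.abs(gamma - 1.0 / n) < 1e-8))
print(is_vertex)
```

For a generic random cost matrix the optimum is unique, so any LP solver recovers the same permutation vertex.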
To prove Theorem 1.1, we first show that, given the above conditions, if the solution of the discrete OT problem is the identity matrix $\gamma^* = \frac{1}{n} I$, then the corresponding transportation $\hat T(x)$ gives $\hat T(x_i^s) = A x_i^s + b = x_i^t$, $\forall i \in 1,\dots,n$. This property results from the interpolation formula in Equation (14), which states that an interpolation along the Wasserstein manifold from the source to the target distribution with $t \in [0, 1]$ is defined as:

$\hat\mu = \sum_{i,j} \gamma^*(i,j)\, \delta_{(1-t)x_i^s + t x_j^t}.$

When $\gamma^* = \frac{1}{n} I$, it leads to

$\hat\mu = \frac{1}{n} \sum_i \delta_{(1-t)x_i^s + t x_i^t}.$

This equation shows that the mass from each sample $x_i^s$ is transported without any splitting to its corresponding target sample $x_i^t$, hence $\hat T(x_i^s) = x_i^t$. Indeed, we have

$(1-t)x_i^s + t x_i^t = (1-t)x_i^s + t(A x_i^s + b) = ((1-t)I + tA)x_i^s + tb, \quad (1)$

which for $t = 1$ gives $\hat T(x_i^s) = x_i^t$. Hence, the OT solution $\gamma^* = \frac{1}{n} I$ yields an exact reconstruction of the transformation on the samples and $\hat T(x_i^s) = A x_i^s + b = x_i^t$, $\forall i \in 1,\dots,n$. Consequently, to prove Theorem 1.1 we just need to prove that the OT solution $\gamma^*$ is equal to $\frac{1}{n} I$ given the above conditions. This is done in the following for some particular cases of affine transformations and for the general case where $A \in S_+$ and $b \in \mathbb{R}^d$. For better readability we will denote a sample in the source domain $x_i^s$ as $x_i$, whereas a sample in the target domain is $x_i^t$.

A. Case with no transformation ($A = I_d$ and $b = 0$, Figure 1)

In this case, there is no displacement.

$\gamma^* = \arg\min_\gamma \langle \gamma, C \rangle_F \quad (2)$

The solution is obviously $\gamma^* = \frac{1}{n} I$, since it does not violate the constraints and is the only possible solution with a loss of $0$ (any mass not on the diagonal of $\gamma$ would imply an increase in the loss).
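The claim of Theorem 1.1 can also be checked numerically. The sketch below (hypothetical sample values; it assumes NumPy and SciPy are available) builds a strictly positive definite $A$, maps the source samples through $x_i^t = A x_i^s + b$, and verifies that the optimal assignment under the squared Euclidean cost is the identity permutation:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
n, d = 20, 3
Xs = rng.normal(size=(n, d))                  # distinct source samples

M = rng.normal(size=(d, d))
A = M @ M.T + np.eye(d)                       # strictly positive definite
b = rng.normal(size=d)
Xt = Xs @ A.T + b                             # target samples x_t = A x_s + b

# Squared Euclidean ground cost C(i, j) = ||x_i^s - x_j^t||^2.
C = ((Xs[:, None, :] - Xt[None, :, :]) ** 2).sum(axis=-1)

# With uniform weights 1/n the OT problem reduces to an assignment problem;
# the theorem predicts the optimal coupling is the identity permutation.
_, col = linear_sum_assignment(C)
print(np.array_equal(col, np.arange(n)))
```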
Fig. 1: Optimal transport in the affine transformation 1: case with no transformation.
Fig. 2: Optimal transport in the affine transformation 2: case with translation.
Fig. 3: Optimal transport in the affine transformation 3: general case.

B. Case with translation ($A = I_d$ and $b \in \mathbb{R}^d$, Figure 2)

In this case, the metric matrix becomes $C$ such that $C(i,j) = \|x_i - x_j - b\|^2 = \|x_i - x_j\|^2 + \|b\|^2 - 2 b^\top (x_i - x_j)$. In this case the solution is $\gamma^* = \frac{1}{n} I$. The corresponding permutation $\mathrm{perm}(i) = i$ leads to the loss $J(I) = \frac{1}{n} \sum_i \|x_i - x_i - b\|^2 = \|b\|^2$. In the general case, the loss for a permutation $\mathrm{perm}$ can be expressed as

$J(\gamma) = \frac{1}{n} \sum_i \|x_i - x_{\mathrm{perm}(i)} - b\|^2 = \frac{1}{n} \sum_i \|x_i - x_{\mathrm{perm}(i)}\|^2 + \|b\|^2 - \frac{2}{n} \sum_i (x_i - x_{\mathrm{perm}(i)})^\top b = \frac{1}{n} \sum_i \|x_i - x_{\mathrm{perm}(i)}\|^2 + \|b\|^2,$

where the cross term vanishes because $\sum_i x_{\mathrm{perm}(i)} = \sum_i x_i$. Since $x_i \neq x_j$ if $i \neq j$, for any permutation term $\mathrm{perm}(i) \neq i$ we have $\|x_i - x_{\mathrm{perm}(i)}\|^2 > 0$. This means that any permutation that is not the identity permutation will have a loss strictly larger than the loss of the identity permutation. Therefore, the solution of the optimization problem is the identity permutation. This shows that a translation of the data does not change the solution of the optimal transport.

C. General case ($A \in S_+$ and $b \in \mathbb{R}^d$, Figure 3)

In this case, we will set $b = 0$ since, as proven previously, a translation does not change the optimal transport solution. The loss can be expressed as

$J(\gamma) = \frac{1}{n} \sum_i \|x_i - A x_{\mathrm{perm}(i)}\|^2 = \frac{1}{n} \sum_i \left( \|x_i\|^2 + \|A x_{\mathrm{perm}(i)}\|^2 - 2\, x_i^\top A x_{\mathrm{perm}(i)} \right) = \frac{1}{n} \sum_i \left( \|x_i\|^2 + \|A x_i\|^2 \right) - \frac{2}{n} \sum_i x_i^\top A x_{\mathrm{perm}(i)}.$

The solution of the optimization will then be the permutation that maximizes the term $\sum_i x_i^\top A x_{\mathrm{perm}(i)}$. $A$ is a positive definite matrix, which means that the previous term can be seen as a sum of scalar products $\langle A^{1/2} x_i, A^{1/2} x_{\mathrm{perm}(i)} \rangle = \langle \tilde x_i, \tilde x_{\mathrm{perm}(i)} \rangle$, where $\tilde x_i = A^{1/2} x_i$. It is easy to show that

$\sum_i \|\tilde x_i - \tilde x_{\mathrm{perm}(i)}\|^2 = 2 \sum_i \|\tilde x_i\|^2 - 2 \sum_i \langle \tilde x_i, \tilde x_{\mathrm{perm}(i)} \rangle,$

which means that

$\sum_i \langle \tilde x_i, \tilde x_{\mathrm{perm}(i)} \rangle = \sum_i \|\tilde x_i\|^2 - \frac{1}{2} \sum_i \|\tilde x_i - \tilde x_{\mathrm{perm}(i)}\|^2.$

The identity permutation is then a maximum of the cross scalar product and a minimum of $J$. Since $x_i \neq x_j$ if $i \neq j$, the equality stands only for the identity permutation. This means that once again the solution of the optimization problem is the identity matrix. Note that if $A$ has negative eigenvalues, the transport fails in reconstructing the original transformation, as illustrated in Figure 4.
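The polarization identity relating the cross scalar product to the pairwise distances is easy to sanity-check numerically. A minimal sketch (hypothetical random values; it assumes NumPy is available), where $A^{1/2}$ is obtained from the eigendecomposition of the symmetric positive definite $A$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 10, 4
M = rng.normal(size=(d, d))
A = M @ M.T + np.eye(d)                      # strictly positive definite

# A^{1/2} via the eigendecomposition of the symmetric matrix A.
w, V = np.linalg.eigh(A)
A_half = V @ np.diag(np.sqrt(w)) @ V.T

X = rng.normal(size=(n, d))
Xt = X @ A_half.T                            # tilde x_i = A^{1/2} x_i
perm = rng.permutation(n)

lhs = np.sum(Xt * Xt[perm])                  # sum_i <x~_i, x~_perm(i)>
rhs = np.sum(Xt ** 2) - 0.5 * np.sum((Xt - Xt[perm]) ** 2)
print(np.isclose(lhs, rhs))
```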
II. INFLUENCE OF THE CHOICE OF THE COST FUNCTION

We examine here the role played by the type of cost function $C$ used in the adaptation. This cost function naturally allows taking into account the specificity of the data space.
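As an illustration of how the ground cost changes with the chosen metric, the sketch below (hypothetical histogram features standing in for real dataset features; it assumes NumPy and SciPy are available) builds cost matrices with $\ell_1$, $\ell_2$, and squared $\ell_2$ metrics:

```python
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(2)
# Hypothetical histogram features (rows sum to 1), standing in for the
# bag-of-features histograms of a dataset such as Office-Caltech.
Xs = rng.random((5, 8)); Xs /= Xs.sum(axis=1, keepdims=True)
Xt = rng.random((6, 8)); Xt /= Xt.sum(axis=1, keepdims=True)

C_l1 = cdist(Xs, Xt, metric="cityblock")     # l1 ground cost
C_l2 = cdist(Xs, Xt, metric="euclidean")     # l2 ground cost
C_sq = cdist(Xs, Xt, metric="sqeuclidean")   # squared l2 ground cost
print(C_l1.shape, np.allclose(C_sq, C_l2 ** 2))
```

Each choice yields a different cost matrix and hence a different transport plan, which is why the best metric can vary across adaptation tasks.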
Fig. 4: Optimal transport in the affine transformation 4: case with negative eigenvalues.

In the case of the Office-Caltech dataset, the features are histograms. We have tested our OT-IT adaptation strategy with alternative costs that operate directly on histograms ($\ell_2$ and $\ell_1$ norms), as it is a common practice to consider those metrics when comparing distributions. Also, as mostly done in the literature, we consider the pre-processed features (normalization + z-score) together with Euclidean and squared Euclidean norms. As can be seen in Table I, the best results are not always obtained with the same metric. In the paper, we chose to only consider the $\ell_2$ norm on preprocessed data, as it is the one that gave the best result on average, but it is relevant to question what is the best metric for an adaptation task. Several other metrics could be considered, like geodesic distances in the case where data live on a manifold. Also, a better tailored metric could be learnt from the data. Some recent works are considering this option [2], [3] for other tasks. This constitutes an interesting perspective for the following extensions of our methodology for the domain adaptation problem.

TABLE I: Results of changing the cost function matrix $C$ on the Office-Caltech dataset (columns: $\ell_2$, $\ell_2$ + preprocessing, $\ell_2^2$ + preprocessing, $\ell_1$; rows: the domain pairs C→A, C→W, C→D, A→C, A→W, A→D, W→C, W→A, W→D, D→C, D→A, D→W, and their mean; the numerical entries are not recoverable from this transcription).

III. REMARKS ON GENERALIZED CONDITIONAL GRADIENT

We first present the generalized conditional gradient algorithm as introduced by Bredies et al. [4] and, in the subsequent sections, we provide additional properties that are of interest when applying this algorithm to regularized optimal transport problems.

A. The generalized conditional gradient algorithm

We are interested in the problem of minimizing under constraints a composite function such as

$\min_{\gamma \in B} F(\gamma) = f(\gamma) + g(\gamma), \quad (3)$

where both $f(\cdot)$ and $g(\cdot)$ are convex and differentiable functions and $B$ is a compact set of $\mathbb{R}^n$. One might want to benefit from this composite structure during the optimization procedure.
For instance, if we have an efficient solver for optimizing

$\min_{\gamma \in B} \langle \nabla f, \gamma \rangle + g(\gamma), \quad (4)$

we propose to use this solver in the optimization scheme instead of linearizing the whole objective function (as one would do with a conditional gradient algorithm [5]). The resulting approach is defined in Algorithm 1. Conceptually, our algorithm lies in-between the original optimization problem and the conditional gradient. Indeed, if we do not consider any linearization, step 3 of the algorithm is equivalent to solving the original problem and one iterate will suffice for convergence. If we use a full linearization as in the conditional gradient approach, step 3 is equivalent to solving a rough approximation of the original problem. By linearizing only a part of the objective function, we thus optimize a better approximation of that function. This will lead to a provably better certificate of optimality than the one of the conditional gradient algorithm [6]. This makes us believe that if an efficient solver of the partially linearized problem is available, our algorithm is of strong interest. Note that this partial linearization idea has already been introduced by Bredies et al. [4] for solving problem (3). Their theoretical results related to the resulting algorithm apply when $f(\cdot)$ is differentiable, $g(\cdot)$ is convex, and both $f$ and $g$ satisfy some other mild conditions like coercivity. These results state that the generalized conditional gradient algorithm is a descent method and that any limit point of the algorithm is a stationary point of $f + g$. In what follows, we provide some results when $f$ and $g$ are differentiable. Some of these results provide novel insights on generalized gradient algorithms (relation between optimality and minimizer of the search direction, convergence rate, and optimality certificate) while some are redundant to those proposed by Bredies (convergence).

B. Convergence of the algorithm

1) Reformulating the search direction: Before discussing convergence of the algorithm, we first reformulate its step 3 so as to make its properties more accessible and its convergence analysis more amenable.
The reformulation we propose is

$s^k = \arg\min_{s \in B} \langle \nabla f(\gamma^k), s - \gamma^k \rangle + g(s) - g(\gamma^k) \quad (5)$

and it is easy to note that the problem in line 3 of Algorithm 1 is equivalent to this one and leads to the same solution.
Algorithm 1 Generalized Conditional Gradient (GCG)
1: Initialize $k = 0$ and $\gamma^0 \in P$
2: repeat
3: Solve problem $s^k = \arg\min_{\gamma \in B} \langle \nabla f(\gamma^k), \gamma \rangle + g(\gamma)$
4: Find the optimal step $\alpha^k$, with $\Delta\gamma = s^k - \gamma^k$: $\alpha^k = \arg\min_{0 \le \alpha \le 1} f(\gamma^k + \alpha \Delta\gamma) + g(\gamma^k + \alpha \Delta\gamma)$, or choose $\alpha^k$ so that it satisfies the Armijo rule.
5: $\gamma^{k+1} \leftarrow \gamma^k + \alpha^k \Delta\gamma$, set $k \leftarrow k + 1$
6: until Convergence

2) Relation between minimizers of Problems (3) and (5): The above formulation allows us to derive a property that highlights the relation between problems (3) and (5).

Proposition 3.1: $\gamma^*$ is a minimizer of problem (3) if and only if

$\gamma^* = \arg\min_{s \in B} \langle \nabla f(\gamma^*), s - \gamma^* \rangle + g(s) - g(\gamma^*). \quad (6)$

Proof: The proof relies on the optimality conditions of constrained convex optimization problems. Indeed, for convex and differentiable $f$ and $g$, $\gamma^*$ is a solution of problem (3) if and only if [7]

$-\nabla f(\gamma^*) - \nabla g(\gamma^*) \in N_B(\gamma^*), \quad (7)$

where $N_B(\gamma)$ is the normal cone of $B$ at $\gamma$. In the same way, a minimizer $s^*$ of problem (5) at $\gamma^k$ can also be characterized as

$-\nabla f(\gamma^k) - \nabla g(s^*) \in N_B(s^*). \quad (8)$

Now suppose that $\gamma^*$ is a minimizer of problem (3). It is easy to see that if we choose $\gamma^k = \gamma^*$, then because $\gamma^*$ satisfies Equation (7), Equation (8) also holds. Conversely, if $\gamma^*$ is a minimizer of problem (5) at $\gamma^*$, then $\gamma^*$ also satisfies Equation (7).

3) Intermediate results and gap certificate: We prove several lemmas and exhibit a gap certificate that provides a bound on the difference between the objective value along the iterations and the optimal objective value. As one may remark, our algorithm is very similar to a conditional gradient algorithm, i.e. the Frank-Wolfe algorithm. As such, our proof of convergence of the algorithm will follow similar lines as those used by Bertsekas. Our convergence result is based on the following proposition and definition given in [5].

Proposition 3.2 (from [5]): Let $\{\gamma^k\}$ be a sequence generated by the feasible direction method $\gamma^{k+1} = \gamma^k + \alpha^k \Delta\gamma^k$ with $\Delta\gamma^k = s^k - \gamma^k$. Assume that $\{\Delta\gamma^k\}$ is gradient-related and that $\alpha^k$ is chosen by the limited minimization or the Armijo rule; then every limit point of $\{\gamma^k\}$ is a stationary point.
Definition: A sequence $\Delta\gamma^k$ is said to be gradient-related to the sequence $\gamma^k$ if, for any subsequence of $\{\gamma^k\}_{k \in K}$ that converges to a non-stationary point, the corresponding subsequence $\{\Delta\gamma^k\}_{k \in K}$ is bounded and satisfies

$\limsup_{k \to \infty,\, k \in K} \nabla F(\gamma^k)^\top \Delta\gamma^k < 0.$

Basically, this property says that if a subsequence converges to a non-stationary point, then at the limit point the feasible direction defined by $\Delta\gamma$ is still a descent direction. Before proving that the sequence defined by $\{\Delta\gamma^k\}$ is gradient-related, we prove useful lemmas.

Lemma 3.3: For any $\gamma^k \in B$, each $\Delta\gamma^k = s^k - \gamma^k$ defines a feasible descent direction.

Proof: By definition, $s^k$ belongs to the convex set $B$. Hence, for any $\alpha^k \in [0, 1]$, $\gamma^{k+1}$ defines a feasible point. Hence $\Delta\gamma^k$ is a feasible direction. Now let us show that it also defines a descent direction. By definition of the minimizer $s^k$, we have for all $s \in B$

$\langle \nabla f(\gamma^k), s^k - \gamma^k \rangle + g(s^k) - g(\gamma^k) \le \langle \nabla f(\gamma^k), s - \gamma^k \rangle + g(s) - g(\gamma^k).$

Because the above inequality also holds for $s = \gamma^k$, we have

$\langle \nabla f(\gamma^k), s^k - \gamma^k \rangle + g(s^k) - g(\gamma^k) \le 0. \quad (9)$

By convexity of $g(\cdot)$ we have $g(s^k) - g(\gamma^k) \ge \langle \nabla g(\gamma^k), s^k - \gamma^k \rangle$, which, plugged into Equation (9), leads to

$\langle \nabla f(\gamma^k) + \nabla g(\gamma^k), s^k - \gamma^k \rangle \le 0$ and thus $\langle \nabla F(\gamma^k), s^k - \gamma^k \rangle \le 0,$

which proves that $\Delta\gamma^k$ is a descent direction.

The next lemma provides an interesting feature of our algorithm. Indeed, the lemma states that the difference between the optimal objective value and the current objective value can be easily monitored.

Lemma 3.4: For all $\gamma^k \in B$, the following property holds:

$\min_{s \in B} \left[ \langle \nabla f(\gamma^k), s - \gamma^k \rangle + g(s) - g(\gamma^k) \right] \le F(\gamma^*) - F(\gamma^k) \le 0,$

where $\gamma^*$ is a minimizer of $F$. In addition, if $\gamma^k$ does not belong to the set of minimizers of $F(\cdot)$, then the second inequality is strict.

Proof: By convexity of $f$, we have $f(\gamma^*) - f(\gamma^k) \ge \nabla f(\gamma^k)^\top (\gamma^* - \gamma^k)$. By adding $g(\gamma^*) - g(\gamma^k)$ to both sides of the inequality, we obtain

$F(\gamma^*) - F(\gamma^k) \ge \nabla f(\gamma^k)^\top (\gamma^* - \gamma^k) + g(\gamma^*) - g(\gamma^k)$

and because $\gamma^*$ is a minimizer of $F$, we also have $F(\gamma^*) - F(\gamma^k) \le 0$. Hence, the following holds:

$\langle \nabla f(\gamma^k), \gamma^* - \gamma^k \rangle + g(\gamma^*) - g(\gamma^k) \le F(\gamma^*) - F(\gamma^k) \le 0$
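The quantity bounded by Lemma 3.4 is a computable stopping certificate. A minimal sketch on a hypothetical toy instance (all values are illustrative, assuming NumPy and SciPy): $f(\gamma) = \frac{1}{2}\|\gamma - c\|^2$, $g(\gamma) = \lambda \sum_i \gamma_i \log \gamma_i$, and $B$ the probability simplex, for which the partially linearized problem $\min_{s \in B} \langle G, s \rangle + g(s)$ has the closed-form minimum $-\lambda \log \sum_i e^{-G_i/\lambda}$:

```python
import numpy as np
from scipy.special import logsumexp

lam = 0.5
c = np.array([0.2, 1.0, -0.3, 0.6])

def gap(gamma):
    """min_{s in simplex} <grad f(gamma), s - gamma> + g(s) - g(gamma).
    With g = lam * sum(s log s), the minimum of <G, s> + g(s) over the
    simplex equals -lam * logsumexp(-G / lam)."""
    G = gamma - c                              # gradient of 0.5 * ||gamma - c||^2
    lin_plus_g_min = -lam * logsumexp(-G / lam)
    return lin_plus_g_min - (G @ gamma + lam * np.sum(gamma * np.log(gamma)))

gamma0 = np.full(4, 0.25)                      # feasible starting point
print(gap(gamma0) <= 0.0)                      # Lemma 3.4: gap lower-bounds F* - F(gamma)
```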
and we also have

$\min_{s \in B} \langle \nabla f(\gamma^k), s - \gamma^k \rangle + g(s) - g(\gamma^k) \le F(\gamma^*) - F(\gamma^k) \le 0,$

which concludes the first part of the proof. Finally, if $\gamma^k$ is not a minimizer of $F$, then we naturally have $0 > F(\gamma^*) - F(\gamma^k)$.

4) Proof of convergence: Now that we have all the pieces of the proof, let us show the key ingredient.

Lemma 3.5: The sequence $\{\Delta\gamma^k\}$ of our algorithm is gradient-related.

Proof: For showing that our direction sequence is gradient-related, we have to show that, given a subsequence $\{\gamma^k\}_{k \in K}$ that converges to a non-stationary point $\bar\gamma$, the sequence $\{\Delta\gamma^k\}_{k \in K}$ is bounded and that $\limsup \nabla F(\gamma^k)^\top \Delta\gamma^k < 0$. Boundedness of the sequence naturally derives from the facts that $s^k \in B$, $\gamma^k \in B$ and the set $B$ is bounded. The second part of the proof starts by showing that

$\langle \nabla F(\gamma^k), s^k - \gamma^k \rangle = \langle \nabla f(\gamma^k) + \nabla g(\gamma^k), s^k - \gamma^k \rangle \le \langle \nabla f(\gamma^k), s^k - \gamma^k \rangle + g(s^k) - g(\gamma^k),$

where the last inequality is obtained owing to the convexity of $g$. Because that inequality holds for the minimizer $s^k$, it also holds for any vector $s \in B$:

$\langle \nabla F(\gamma^k), s^k - \gamma^k \rangle \le \langle \nabla f(\gamma^k), s - \gamma^k \rangle + g(s) - g(\gamma^k).$

Taking the limit yields

$\limsup \langle \nabla F(\gamma^k), s^k - \gamma^k \rangle \le \langle \nabla f(\bar\gamma), s - \bar\gamma \rangle + g(s) - g(\bar\gamma)$ for all $s \in B$.

As such, this inequality also holds for the minimizer:

$\limsup \langle \nabla F(\gamma^k), s^k - \gamma^k \rangle \le \min_{s \in B} \langle \nabla f(\bar\gamma), s - \bar\gamma \rangle + g(s) - g(\bar\gamma).$

Now, since $\bar\gamma$ is not stationary, it is not optimal and does not belong to the minimizers of $F$; hence, according to Lemma 3.4,

$\min_{s \in B} \langle \nabla f(\bar\gamma), s - \bar\gamma \rangle + g(s) - g(\bar\gamma) < 0,$

which concludes the proof. This latter lemma proves that our direction sequence is gradient-related, thus Proposition 3.2 applies.

5) Rate of convergence: We can show that the objective value $F(x^k)$ converges towards $F(x^*)$ at a rate $O(1/k)$ if we have some additional smoothness conditions on $F(\cdot)$. We can easily prove this statement by following the steps proposed by Jaggi et al. [6] for the conditional gradient algorithm. We make the hypothesis that there exists a constant $C_F$ so that for any $x, y \in B$ and any $\alpha \in [0, 1]$, the inequality

$F((1 - \alpha)x + \alpha y) \le F(x) + \alpha \nabla F(x)^\top (y - x) + \frac{C_F}{2} \alpha^2$

holds.
Based on this inequality, for a sequence $\{x^k\}$ obtained from the generalized conditional gradient algorithm we have

$F(x^{k+1}) - F(x^*) = F((1 - \alpha_k)x^k + \alpha_k s^k) - F(x^*) \le F(x^k) - F(x^*) + \alpha_k \nabla F(x^k)^\top (s^k - x^k) + \frac{C_F}{2} \alpha_k^2. \quad (10)$

Let us denote $h(x^k) = F(x^k) - F(x^*)$. Now, by adding $\alpha_k [g(s^k) - g(x^k)]$ to both sides of the inequality, we have

$h(x^{k+1}) + \alpha_k [g(s^k) - g(x^k)] \le h(x^k) + \alpha_k \left[ \nabla f(x^k)^\top (s^k - x^k) + g(s^k) - g(x^k) \right] + \alpha_k \nabla g(x^k)^\top (s^k - x^k) + \frac{C_F}{2} \alpha_k^2$
$\le h(x^k) + \alpha_k \left[ \nabla f(x^k)^\top (x^* - x^k) + g(x^*) - g(x^k) \right] + \alpha_k \nabla g(x^k)^\top (s^k - x^k) + \frac{C_F}{2} \alpha_k^2,$

where the second inequality comes from the definition of the search direction $s^k$. Because $f(\cdot)$ is convex, we have $f(x^*) - f(x^k) \ge \nabla f(x^k)^\top (x^* - x^k)$. Thus, we have

$h(x^{k+1}) + \alpha_k [g(s^k) - g(x^k)] \le h(x^k) + \alpha_k \left[ f(x^*) - f(x^k) + g(x^*) - g(x^k) \right] + \alpha_k \nabla g(x^k)^\top (s^k - x^k) + \frac{C_F}{2} \alpha_k^2$
$\le (1 - \alpha_k) h(x^k) + \alpha_k \nabla g(x^k)^\top (s^k - x^k) + \frac{C_F}{2} \alpha_k^2.$

Now, again owing to the convexity of $g(\cdot)$, we have $-g(s^k) + g(x^k) \le -\nabla g(x^k)^\top (s^k - x^k)$. Using this fact in the last inequality above leads to

$h(x^{k+1}) \le (1 - \alpha_k) h(x^k) + \frac{C_F}{2} \alpha_k^2. \quad (11)$

Based on this result, we can now state the following.

Theorem 3.6: For each $k \ge 1$, the iterates $x^k$ of Algorithm 1 satisfy

$F(x^k) - F(x^*) \le \frac{2 C_F}{k + 2}.$

Proof: The proof stands on Equation (11) and on the same induction as the one used by Jaggi et al. [6]. Note that this convergence property would also hold if we choose the step size as $\alpha_k = \frac{2}{k+2}$.
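A compact sketch of Algorithm 1 on a hypothetical toy instance (all values illustrative, assuming NumPy and SciPy): $f(\gamma) = \frac{1}{2}\|\gamma - c\|^2$, $g(\gamma) = \lambda \sum_i \gamma_i \log \gamma_i$, and $B$ the probability simplex, so that step 3 has the closed-form softmax solution; the step size follows the $\alpha_k = 2/(k+2)$ schedule mentioned after Theorem 3.6:

```python
import numpy as np
from scipy.special import softmax

lam, c = 0.5, np.array([0.2, 1.0, -0.3, 0.6])

def F(gamma):                                  # composite objective F = f + g
    return 0.5 * np.sum((gamma - c) ** 2) + lam * np.sum(gamma * np.log(gamma))

gamma = np.full(4, 0.25)                       # feasible initialization
F_start = F(gamma)
for k in range(200):
    s = softmax(-(gamma - c) / lam)            # step 3: argmin <grad f, s> + g(s) over the simplex
    alpha = 2.0 / (k + 2.0)                    # step 4: fixed step-size schedule
    gamma = gamma + alpha * (s - gamma)        # step 5: convex update stays feasible
print(F(gamma) <= F_start)
```

Because each update is a convex combination of feasible points, the iterates remain in the simplex throughout.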
TABLE II: Computational time (in seconds) for the best set of regularization parameters (methods: OT-exact, OT-IT, OT-GL, OT-Laplace; problems: U→M, M→U, the P1-P4 pairs, and the Office-Caltech domain pairs C→A through D→W, with their mean; the numerical entries are not recoverable from this transcription).

Fig. 5: Example of the evolution of the objective value along (top) the iterations and (bottom) the running times for the generalized conditional gradient (GCG) and the conditional gradient (CG + Mosek) algorithms.

6) Computational performance: Let us first show that the GCG algorithm is more efficient than a classical conditional gradient method such as the one used in [8]. We illustrate this in Figure 5, showing the convergence (top panel) and the corresponding computational times (bottom panel). We take as an example the case of computing the OT-GL transport plan of digits 1 and 2 in USPS to those in MNIST. For this example, we have allowed a maximum of 50 iterations. Regarding convergence of the cost function along the iterations, we can clearly see that, while GCG reaches a nearly-optimal objective value in around 10 iterations, the CG approach is still far from convergence after 50 iterations. In addition, the per-iteration cost is significantly higher for the conditional gradient algorithm. For this example, we save an order of magnitude of running time, and yet CG has not converged. We now study the computational performances of the different optimal transport strategies in the visual object recognition tasks considered above. We use Python implementations of the different OT methods. The tests were run on a simple MacBook Pro station, with a 2.4 GHz processor. The original OT-exact solution to the optimal transport problem is computed with the MOSEK [9] linear programming solver¹, whereas the other strategies follow our own implementation based on the Sinkhorn-Knopp method. For the regularized optimal transport OT-GL and OT-Laplace we used the conditional gradient splitting algorithm (the source code will be made available upon the acceptance of the article). We report in Table II the computational time needed by the OT methods.

¹Other publicly available solvers were considered, but it turned out this particular one was an order of magnitude faster than the others.
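The Sinkhorn-Knopp iterations mentioned above are short enough to sketch in full. This is a minimal illustrative implementation of entropy-regularized OT (not the authors' released code; it assumes NumPy and hypothetical test values), alternately rescaling rows and columns of the kernel $K = e^{-C/\varepsilon}$ until both marginals match:

```python
import numpy as np

def sinkhorn(a, b, C, eps, n_iter=500):
    """Sinkhorn-Knopp scaling for entropy-regularized optimal transport."""
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)                      # rescale to match column marginals
        u = a / (K @ v)                        # rescale to match row marginals
    return u[:, None] * K * v[None, :]         # transport plan diag(u) K diag(v)

rng = np.random.default_rng(3)
n = 6
a = b = np.full(n, 1.0 / n)                    # uniform marginals
C = rng.random((n, n))
G = sinkhorn(a, b, C, eps=0.1)
print(np.allclose(G.sum(axis=1), a, atol=1e-6))
```

Production implementations add log-domain stabilization for small $\varepsilon$ and a convergence check on the marginal error instead of a fixed iteration count.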
As expected, OT-IT is the least computationally intensive. The solution of the exact optimal transport (OT-exact) takes longer to compute by a factor 4. Also, as expected, the two regularized versions OT-GL and OT-Laplace are the most demanding methods. We recall here that the maximum number of inner loops of the GCG approach was set to 10, meaning that each of those methods made 10 calls to the Sinkhorn-Knopp solver used by OT-IT. However, the added computational cost is mostly due to the line search procedure (line 4 in Algorithm 1), which involves several computations of the cost function. We explain the difference between OT-GL and OT-Laplace by the difference in computation time needed by this procedure. Summing up, one can notice that, even for large problems (case P1→P4 for instance, involving a large number of variables), the computation time is not prohibitive and remains tractable.

REFERENCES
[1] J. von Neumann, "A certain zero-sum two-person game equivalent to the optimal assignment problem," Contributions to the Theory of Games, vol. 2.
[2] G. Zen, E. Ricci, and N. Sebe, "Simultaneous ground metric learning and matrix factorization with earth mover's distance," in Proc. International Conference on Pattern Recognition (ICPR), Aug. 2014.
[3] M. Cuturi and D. Avis, "Ground metric learning," Journal of Machine Learning Research, vol. 15, no. 1, Jan. 2014.
[4] K. Bredies, D. Lorenz, and P. Maass, "Equivalence of a generalized conditional gradient method and the method of surrogate functionals," Zentrum für Technomathematik, 2005.
[5] D. P. Bertsekas, Nonlinear Programming, Athena Scientific, Belmont.
[6] M. Jaggi, "Revisiting Frank-Wolfe: Projection-free sparse convex optimization," in Proceedings of the 30th International Conference on Machine Learning (ICML-13), 2013.
[7] D. Bertsekas, A. Nedic, and A. Ozdaglar, Convex Analysis and Optimization, Athena Scientific, 2003.
[8] S. Ferradans, N. Papadakis, G. Peyré, and J.-F. Aujol, "Regularized discrete optimal transport," SIAM Journal on Imaging Sciences, vol. 7, no. 3, 2014.
[9] E. Andersen and K. Andersen, "The MOSEK interior point optimizer for linear programming: an implementation of the homogeneous algorithm," in High Performance Optimization, vol. 33 of Applied Optimization, Springer US, 2000.
Fal verso The teral structure of atural umbers oe method for the defto of large prme umbers ad a factorzato test Emmaul Maousos APM Isttute for the Advacemet of Physcs ad Mathematcs 3 Poulou str. 53 Athes
More informationhp calculators HP 30S Statistics Averages and Standard Deviations Average and Standard Deviation Practice Finding Averages and Standard Deviations
HP 30S Statstcs Averages ad Stadard Devatos Average ad Stadard Devato Practce Fdg Averages ad Stadard Devatos HP 30S Statstcs Averages ad Stadard Devatos Average ad stadard devato The HP 30S provdes several
More information1 Mixed Quantum State. 2 Density Matrix. CS Density Matrices, von Neumann Entropy 3/7/07 Spring 2007 Lecture 13. ψ = α x x. ρ = p i ψ i ψ i.
CS 94- Desty Matrces, vo Neuma Etropy 3/7/07 Sprg 007 Lecture 3 I ths lecture, we wll dscuss the bascs of quatum formato theory I partcular, we wll dscuss mxed quatum states, desty matrces, vo Neuma etropy
More informationExercises for Square-Congruence Modulo n ver 11
Exercses for Square-Cogruece Modulo ver Let ad ab,.. Mark True or False. a. 3S 30 b. 3S 90 c. 3S 3 d. 3S 4 e. 4S f. 5S g. 0S 55 h. 8S 57. 9S 58 j. S 76 k. 6S 304 l. 47S 5347. Fd the equvalece classes duced
More informationSupport vector machines
CS 75 Mache Learg Lecture Support vector maches Mlos Hauskrecht mlos@cs.ptt.edu 539 Seott Square CS 75 Mache Learg Outle Outle: Algorthms for lear decso boudary Support vector maches Mamum marg hyperplae.
More information9 U-STATISTICS. Eh =(m!) 1 Eh(X (1),..., X (m ) ) i.i.d
9 U-STATISTICS Suppose,,..., are P P..d. wth CDF F. Our goal s to estmate the expectato t (P)=Eh(,,..., m ). Note that ths expectato requres more tha oe cotrast to E, E, or Eh( ). Oe example s E or P((,
More informationInvestigation of Partially Conditional RP Model with Response Error. Ed Stanek
Partally Codtoal Radom Permutato Model 7- vestgato of Partally Codtoal RP Model wth Respose Error TRODUCTO Ed Staek We explore the predctor that wll result a smple radom sample wth respose error whe a
More informationThe Mathematical Appendix
The Mathematcal Appedx Defto A: If ( Λ, Ω, where ( λ λ λ whch the probablty dstrbutos,,..., Defto A. uppose that ( Λ,,..., s a expermet type, the σ-algebra o λ λ λ are defed s deoted by ( (,,...,, σ Ω.
More information18.413: Error Correcting Codes Lab March 2, Lecture 8
18.413: Error Correctg Codes Lab March 2, 2004 Lecturer: Dael A. Spelma Lecture 8 8.1 Vector Spaces A set C {0, 1} s a vector space f for x all C ad y C, x + y C, where we take addto to be compoet wse
More informationComplete Convergence and Some Maximal Inequalities for Weighted Sums of Random Variables
Joural of Sceces, Islamc Republc of Ira 8(4): -6 (007) Uversty of Tehra, ISSN 06-04 http://sceces.ut.ac.r Complete Covergece ad Some Maxmal Iequaltes for Weghted Sums of Radom Varables M. Am,,* H.R. Nl
More informationd dt d d dt dt Also recall that by Taylor series, / 2 (enables use of sin instead of cos-see p.27 of A&F) dsin
Learzato of the Swg Equato We wll cover sectos.5.-.6 ad begg of Secto 3.3 these otes. 1. Sgle mache-fte bus case Cosder a sgle mache coected to a fte bus, as show Fg. 1 below. E y1 V=1./_ Fg. 1 The admttace
More informationFor combinatorial problems we might need to generate all permutations, combinations, or subsets of a set.
Addtoal Decrease ad Coquer Algorthms For combatoral problems we mght eed to geerate all permutatos, combatos, or subsets of a set. Geeratg Permutatos If we have a set f elemets: { a 1, a 2, a 3, a } the
More informationAssignment 5/MATH 247/Winter Due: Friday, February 19 in class (!) (answers will be posted right after class)
Assgmet 5/MATH 7/Wter 00 Due: Frday, February 9 class (!) (aswers wll be posted rght after class) As usual, there are peces of text, before the questos [], [], themselves. Recall: For the quadratc form
More informationStrong Convergence of Weighted Averaged Approximants of Asymptotically Nonexpansive Mappings in Banach Spaces without Uniform Convexity
BULLETIN of the MALAYSIAN MATHEMATICAL SCIENCES SOCIETY Bull. Malays. Math. Sc. Soc. () 7 (004), 5 35 Strog Covergece of Weghted Averaged Appromats of Asymptotcally Noepasve Mappgs Baach Spaces wthout
More informationFunctions of Random Variables
Fuctos of Radom Varables Chapter Fve Fuctos of Radom Varables 5. Itroducto A geeral egeerg aalyss model s show Fg. 5.. The model output (respose) cotas the performaces of a system or product, such as weght,
More informationX X X E[ ] E X E X. is the ()m n where the ( i,)th. j element is the mean of the ( i,)th., then
Secto 5 Vectors of Radom Varables Whe workg wth several radom varables,,..., to arrage them vector form x, t s ofte coveet We ca the make use of matrx algebra to help us orgaze ad mapulate large umbers
More informationSpecial Instructions / Useful Data
JAM 6 Set of all real umbers P A..d. B, p Posso Specal Istructos / Useful Data x,, :,,, x x Probablty of a evet A Idepedetly ad detcally dstrbuted Bomal dstrbuto wth parameters ad p Posso dstrbuto wth
More informationClass 13,14 June 17, 19, 2015
Class 3,4 Jue 7, 9, 05 Pla for Class3,4:. Samplg dstrbuto of sample mea. The Cetral Lmt Theorem (CLT). Cofdece terval for ukow mea.. Samplg Dstrbuto for Sample mea. Methods used are based o CLT ( Cetral
More informationUnsupervised Learning and Other Neural Networks
CSE 53 Soft Computg NOT PART OF THE FINAL Usupervsed Learg ad Other Neural Networs Itroducto Mture Destes ad Idetfablty ML Estmates Applcato to Normal Mtures Other Neural Networs Itroducto Prevously, all
More informationF. Inequalities. HKAL Pure Mathematics. 進佳數學團隊 Dr. Herbert Lam 林康榮博士. [Solution] Example Basic properties
進佳數學團隊 Dr. Herbert Lam 林康榮博士 HKAL Pure Mathematcs F. Ieualtes. Basc propertes Theorem Let a, b, c be real umbers. () If a b ad b c, the a c. () If a b ad c 0, the ac bc, but f a b ad c 0, the ac bc. Theorem
More informationAn Introduction to. Support Vector Machine
A Itroducto to Support Vector Mache Support Vector Mache (SVM) A classfer derved from statstcal learg theory by Vapk, et al. 99 SVM became famous whe, usg mages as put, t gave accuracy comparable to eural-etwork
More informationAnalysis of Lagrange Interpolation Formula
P IJISET - Iteratoal Joural of Iovatve Scece, Egeerg & Techology, Vol. Issue, December 4. www.jset.com ISS 348 7968 Aalyss of Lagrage Iterpolato Formula Vjay Dahya PDepartmet of MathematcsMaharaja Surajmal
More informationNumerical Analysis Formulae Booklet
Numercal Aalyss Formulae Booklet. Iteratve Scemes for Systems of Lear Algebrac Equatos:.... Taylor Seres... 3. Fte Dfferece Approxmatos... 3 4. Egevalues ad Egevectors of Matrces.... 3 5. Vector ad Matrx
More informationBeam Warming Second-Order Upwind Method
Beam Warmg Secod-Order Upwd Method Petr Valeta Jauary 6, 015 Ths documet s a part of the assessmet work for the subject 1DRP Dfferetal Equatos o Computer lectured o FNSPE CTU Prague. Abstract Ths documet
More informationGeneralized Linear Regression with Regularization
Geeralze Lear Regresso wth Regularzato Zoya Bylsk March 3, 05 BASIC REGRESSION PROBLEM Note: I the followg otes I wll make explct what s a vector a what s a scalar usg vec t or otato, to avo cofuso betwee
More informationCIS 800/002 The Algorithmic Foundations of Data Privacy October 13, Lecture 9. Database Update Algorithms: Multiplicative Weights
CIS 800/002 The Algorthmc Foudatos of Data Prvacy October 13, 2011 Lecturer: Aaro Roth Lecture 9 Scrbe: Aaro Roth Database Update Algorthms: Multplcatve Weghts We ll recall aga) some deftos from last tme:
More informationHomework 1: Solutions Sid Banerjee Problem 1: (Practice with Asymptotic Notation) ORIE 4520: Stochastics at Scale Fall 2015
Fall 05 Homework : Solutos Problem : (Practce wth Asymptotc Notato) A essetal requremet for uderstadg scalg behavor s comfort wth asymptotc (or bg-o ) otato. I ths problem, you wll prove some basc facts
More informationANALYSIS ON THE NATURE OF THE BASIC EQUATIONS IN SYNERGETIC INTER-REPRESENTATION NETWORK
Far East Joural of Appled Mathematcs Volume, Number, 2008, Pages Ths paper s avalable ole at http://www.pphm.com 2008 Pushpa Publshg House ANALYSIS ON THE NATURE OF THE ASI EQUATIONS IN SYNERGETI INTER-REPRESENTATION
More informationChapter 2 - Free Vibration of Multi-Degree-of-Freedom Systems - II
CEE49b Chapter - Free Vbrato of Mult-Degree-of-Freedom Systems - II We ca obta a approxmate soluto to the fudametal atural frequecy through a approxmate formula developed usg eergy prcples by Lord Raylegh
More informationChapter 14 Logistic Regression Models
Chapter 4 Logstc Regresso Models I the lear regresso model X β + ε, there are two types of varables explaatory varables X, X,, X k ad study varable y These varables ca be measured o a cotuous scale as
More informationRegression and the LMS Algorithm
CSE 556: Itroducto to Neural Netorks Regresso ad the LMS Algorthm CSE 556: Regresso 1 Problem statemet CSE 556: Regresso Lear regresso th oe varable Gve a set of N pars of data {, d }, appromate d b a
More informationSimulation Output Analysis
Smulato Output Aalyss Summary Examples Parameter Estmato Sample Mea ad Varace Pot ad Iterval Estmato ermatg ad o-ermatg Smulato Mea Square Errors Example: Sgle Server Queueg System x(t) S 4 S 4 S 3 S 5
More informationLecture 3. Sampling, sampling distributions, and parameter estimation
Lecture 3 Samplg, samplg dstrbutos, ad parameter estmato Samplg Defto Populato s defed as the collecto of all the possble observatos of terest. The collecto of observatos we take from the populato s called
More informationESS Line Fitting
ESS 5 014 17. Le Fttg A very commo problem data aalyss s lookg for relatoshpetwee dfferet parameters ad fttg les or surfaces to data. The smplest example s fttg a straght le ad we wll dscuss that here
More informationEstimation of Stress- Strength Reliability model using finite mixture of exponential distributions
Iteratoal Joural of Computatoal Egeerg Research Vol, 0 Issue, Estmato of Stress- Stregth Relablty model usg fte mxture of expoetal dstrbutos K.Sadhya, T.S.Umamaheswar Departmet of Mathematcs, Lal Bhadur
More informationParameter, Statistic and Random Samples
Parameter, Statstc ad Radom Samples A parameter s a umber that descrbes the populato. It s a fxed umber, but practce we do ot kow ts value. A statstc s a fucto of the sample data,.e., t s a quatty whose
More information( ) = ( ) ( ) Chapter 13 Asymptotic Theory and Stochastic Regressors. Stochastic regressors model
Chapter 3 Asmptotc Theor ad Stochastc Regressors The ature of eplaator varable s assumed to be o-stochastc or fed repeated samples a regresso aalss Such a assumpto s approprate for those epermets whch
More informationEVALUATION OF FUNCTIONAL INTEGRALS BY MEANS OF A SERIES AND THE METHOD OF BOREL TRANSFORM
EVALUATION OF FUNCTIONAL INTEGRALS BY MEANS OF A SERIES AND THE METHOD OF BOREL TRANSFORM Jose Javer Garca Moreta Ph. D research studet at the UPV/EHU (Uversty of Basque coutry) Departmet of Theoretcal
More informationDIFFERENTIAL GEOMETRIC APPROACH TO HAMILTONIAN MECHANICS
DIFFERENTIAL GEOMETRIC APPROACH TO HAMILTONIAN MECHANICS Course Project: Classcal Mechacs (PHY 40) Suja Dabholkar (Y430) Sul Yeshwath (Y444). Itroducto Hamltoa mechacs s geometry phase space. It deals
More informationBERNSTEIN COLLOCATION METHOD FOR SOLVING NONLINEAR DIFFERENTIAL EQUATIONS. Aysegul Akyuz Dascioglu and Nese Isler
Mathematcal ad Computatoal Applcatos, Vol. 8, No. 3, pp. 293-300, 203 BERNSTEIN COLLOCATION METHOD FOR SOLVING NONLINEAR DIFFERENTIAL EQUATIONS Aysegul Ayuz Dascoglu ad Nese Isler Departmet of Mathematcs,
More informationA conic cutting surface method for linear-quadraticsemidefinite
A coc cuttg surface method for lear-quadratcsemdefte programmg Mohammad R. Osoorouch Calfora State Uversty Sa Marcos Sa Marcos, CA Jot wor wth Joh E. Mtchell RPI July 3, 2008 Outle: Secod-order coe: defto
More informationKLT Tracker. Alignment. 1. Detect Harris corners in the first frame. 2. For each Harris corner compute motion between consecutive frames
KLT Tracker Tracker. Detect Harrs corers the frst frame 2. For each Harrs corer compute moto betwee cosecutve frames (Algmet). 3. Lk moto vectors successve frames to get a track 4. Itroduce ew Harrs pots
More informationLINEARLY CONSTRAINED MINIMIZATION BY USING NEWTON S METHOD
Jural Karya Asl Loreka Ahl Matematk Vol 8 o 205 Page 084-088 Jural Karya Asl Loreka Ahl Matematk LIEARLY COSTRAIED MIIMIZATIO BY USIG EWTO S METHOD Yosza B Dasrl, a Ismal B Moh 2 Faculty Electrocs a Computer
More informationKernel-based Methods and Support Vector Machines
Kerel-based Methods ad Support Vector Maches Larr Holder CptS 570 Mache Learg School of Electrcal Egeerg ad Computer Scece Washgto State Uverst Refereces Muller et al. A Itroducto to Kerel-Based Learg
More informationENGI 4421 Joint Probability Distributions Page Joint Probability Distributions [Navidi sections 2.5 and 2.6; Devore sections
ENGI 441 Jot Probablty Dstrbutos Page 7-01 Jot Probablty Dstrbutos [Navd sectos.5 ad.6; Devore sectos 5.1-5.] The jot probablty mass fucto of two dscrete radom quattes, s, P ad p x y x y The margal probablty
More informationL5 Polynomial / Spline Curves
L5 Polyomal / Sple Curves Cotets Coc sectos Polyomal Curves Hermte Curves Bezer Curves B-Sples No-Uform Ratoal B-Sples (NURBS) Mapulato ad Represetato of Curves Types of Curve Equatos Implct: Descrbe a
More information2006 Jamie Trahan, Autar Kaw, Kevin Martin University of South Florida United States of America
SOLUTION OF SYSTEMS OF SIMULTANEOUS LINEAR EQUATIONS Gauss-Sedel Method 006 Jame Traha, Autar Kaw, Kev Mart Uversty of South Florda Uted States of Amerca kaw@eg.usf.edu Itroducto Ths worksheet demostrates
More informationCS 1675 Introduction to Machine Learning Lecture 12 Support vector machines
CS 675 Itroducto to Mache Learg Lecture Support vector maches Mlos Hauskrecht mlos@cs.ptt.edu 539 Seott Square Mdterm eam October 9, 7 I-class eam Closed book Stud materal: Lecture otes Correspodg chapters
More informationChapter 4 Multiple Random Variables
Revew for the prevous lecture: Theorems ad Examples: How to obta the pmf (pdf) of U = g (, Y) ad V = g (, Y) Chapter 4 Multple Radom Varables Chapter 44 Herarchcal Models ad Mxture Dstrbutos Examples:
More informationPGE 310: Formulation and Solution in Geosystems Engineering. Dr. Balhoff. Interpolation
PGE 30: Formulato ad Soluto Geosystems Egeerg Dr. Balhoff Iterpolato Numercal Methods wth MATLAB, Recktewald, Chapter 0 ad Numercal Methods for Egeers, Chapra ad Caale, 5 th Ed., Part Fve, Chapter 8 ad
More informationApplication of Legendre Bernstein basis transformations to degree elevation and degree reduction
Computer Aded Geometrc Desg 9 79 78 www.elsever.com/locate/cagd Applcato of Legedre Berste bass trasformatos to degree elevato ad degree reducto Byug-Gook Lee a Yubeom Park b Jaechl Yoo c a Dvso of Iteret
More information