Adaptive Manifold Learning
Jing Wang, Zhenyue Zhang
Department of Mathematics, Zhejiang University, Yuquan Campus, Hangzhou 310027, P. R. China

Hongyuan Zha
Department of Computer Science, Pennsylvania State University, University Park, PA 16802

Abstract

Recently, there have been several advances in the machine learning and pattern recognition communities for developing manifold learning algorithms to construct nonlinear low-dimensional manifolds from sample data points embedded in high-dimensional spaces. In this paper, we develop algorithms that address two key issues in manifold learning: 1) the adaptive selection of the neighborhood sizes; and 2) better fitting the local geometric structure to account for the variations in the curvature of the manifold and its interplay with the sampling density of the data set. We also illustrate the effectiveness of our methods on some synthetic data sets.

1 Introduction

Recently, there have been advances in the machine learning community for developing effective and efficient algorithms for constructing nonlinear low-dimensional manifolds from sample data points embedded in high-dimensional spaces, emphasizing simple algorithmic implementation and avoiding optimization problems prone to local minima. The proposed algorithms include Isomap [6], locally linear embedding (LLE) [3] and its variations, manifold charting [1], Hessian LLE [2] and local tangent space alignment (LTSA) [7], and they have been successfully applied in several computer vision and pattern recognition problems. Several drawbacks and possible extensions of the algorithms have been pointed out in [4, 7], and the focus of this paper is to address two key issues in manifold learning: 1) how to adaptively select the neighborhood sizes in the k-nearest-neighbor computation used to construct the local connectivity; and 2) how to account for the variations in the curvature of the manifold and its interplay with the sampling density of the data set. We discuss these two issues in the context of local tangent space alignment (LTSA) [7], a variation of locally linear embedding (LLE) [3] (see also [5], [1]).
We believe the basic ideas we propose can be similarly applied to other manifold learning algorithms. We first outline the basic steps of LTSA and illustrate its failure modes using two simple examples. Given a data set X = [x_1, ..., x_N] with x_i in R^m, sampled (possibly with noise) from a d-dimensional manifold (d < m), LTSA proceeds in the following steps.

1) LOCAL NEIGHBORHOOD CONSTRUCTION. For each x_i, i = 1, ..., N, determine a set X_i = [x_{i_1}, ..., x_{i_{k_i}}] of its neighbors (k_i nearest neighbors, for example).
Figure 1: The data sets (first column) and the computed coordinates τ_i by LTSA vs. the centered arc-length coordinates, for several neighborhood sizes k. Top row: Example 1. Bottom row: Example 2.

2) LOCAL LINEAR FITTING. Compute an orthonormal basis Q_i for the d-dimensional tangent space of the manifold at x_i, and the orthogonal projection of each x_{i_j} onto that tangent space:

    θ_j^{(i)} = Q_i^T (x_{i_j} − x̄_i),

where x̄_i is the mean of the neighbors.

3) LOCAL COORDINATES ALIGNMENT. Align the N local projections Θ_i = [θ_1^{(i)}, ..., θ_{k_i}^{(i)}], i = 1, ..., N, to obtain the global coordinates τ_1, ..., τ_N. The alignment is achieved by minimizing the global reconstruction error

    Σ_i ‖E_i‖_2^2 ≡ Σ_i ‖T_i (I − (1/k_i) e e^T) − L_i Θ_i‖_2^2        (1.1)

over all possible L_i in R^{d×d} and row-orthonormal T = [τ_1, ..., τ_N] in R^{d×N}, where T_i = [τ_{i_1}, ..., τ_{i_{k_i}}] with the index set {i_1, ..., i_{k_i}} determined by the neighborhood of each x_i, and e is a vector of all ones.

Two strategies are commonly used for selecting the local neighborhood size k_i: one is k nearest neighbors (k-NN, with a constant k for all sample points) and the other is the ε-neighborhood [3, 6]. The effectiveness of manifold learning algorithms, including LTSA, depends on how the nearby neighborhoods overlap with each other, and on the variation of the curvature of the manifold and its interplay with the sampling density [4]. We illustrate these issues with two simple examples.

Example 1. We sample data points from a half unit circle, x_i = [cos(t_i), sin(t_i)]^T, i = 1, ..., N. It is easy to see that t_i represents the arc-length coordinate of the circle. We choose t_i in [0, π] according to t_{i+1} − t_i = 0.1(0.1 + |cos(t_i)|), starting at t_1 = 0, and set N = 52 so that t_N ≤ π and t_{N+1} > π. Clearly, the half circle has unit curvature everywhere. This is an example of highly varying sampling density.

Example 2. The data set is generated as x_i = [t_i, e^{−t_i^2}]^T, i = 1, ..., N, where the t_i in [−6, 6] are uniformly distributed. The curvature of this 1-D curve at parameter value t is given by

    c_g(t) = |2 − 4t^2| e^{−t^2} / (1 + 4t^2 e^{−2t^2})^{3/2},
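The three steps above can be sketched in a few lines of NumPy. This is a minimal reading of the constant-k-NN variant of LTSA, not the authors' implementation; the function name, the brute-force neighbor search, and the use of a dense eigensolver are our own choices for a small sketch.

```python
import numpy as np

def ltsa(X, d, k):
    """Minimal LTSA sketch following steps 1)-3) above.

    X : (m, N) data matrix, one sample per column.
    d : target dimension.  k : constant neighborhood size.
    Returns T : (d, N) global coordinates.
    """
    m, N = X.shape
    # Step 1: k nearest neighbors of each point (including the point itself).
    D2 = ((X[:, :, None] - X[:, None, :]) ** 2).sum(axis=0)
    nbrs = np.argsort(D2, axis=0)[:k, :]
    B = np.zeros((N, N))
    for i in range(N):
        Ii = nbrs[:, i]
        Xc = X[:, Ii] - X[:, Ii].mean(axis=1, keepdims=True)  # center neighborhood
        # Step 2: tangent-space basis via SVD of the centered neighborhood.
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        Vd = Vt[:d].T                       # (k, d) right singular vectors
        # Step 3: accumulate the alignment matrix.  W_i projects out the
        # constant vector e/sqrt(k) and the local coordinates Vd.
        G = np.hstack([np.ones((k, 1)) / np.sqrt(k), Vd])
        Wi = np.eye(k) - G @ G.T
        B[np.ix_(Ii, Ii)] += Wi
    # Global coordinates: eigenvectors 2 .. d+1 of B (skip the constant one).
    _, V = np.linalg.eigh(B)
    return V[:, 1:d + 1].T
```

On a gently curved, evenly sampled curve the recovered 1-D coordinate is an approximately affine function of the generating parameter, which is the behavior the failure examples below break.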
which ranges from min_t c_g(t) = 0 to max_t c_g(t) = 2 over t in [−6, 6]. We set N = 180. This is an example of highly varying curvature.

For the above two data sets, LTSA with the constant k-NN strategy fails for any reasonable k we have tested, and so does LTSA with constant ε-neighborhoods. In the first column of Figure 1 we plot the two data sets. The coordinates computed by LTSA with constant k-neighborhoods are plotted against the centered arc-length coordinates for a selected range of k (ideally, the plots should display points on a straight line of slope ±1).

2 Adaptive Neighborhood Selection

In this section, we propose a neighborhood contraction and expansion algorithm for adaptively selecting k_i at each sample point x_i. We assume that the data are generated from a parameterized manifold, x_i = f(τ_i), i = 1, ..., N, where f : Ω ⊂ R^d → R^m. If f is smooth enough, then using a first-order Taylor expansion at a fixed τ, for a neighboring τ̃ we have

    f(τ̃) = f(τ) + J_f(τ) (τ̃ − τ) + ε(τ̃, τ),        (2.2)

where J_f(τ) in R^{m×d} is the Jacobian matrix of f at τ and ε(τ̃, τ) represents the error term determined by the Hessian of f,

    ‖ε(τ̃, τ)‖ ≤ c_f(τ) ‖τ̃ − τ‖_2^2,

where c_f(τ) represents the curvature of the manifold at τ. Setting τ = τ_i and τ̃ = τ_j gives

    x_j = x_i + J_f(τ_i) (τ_j − τ_i) + ε(τ_i, τ_j).        (2.3)

A point x_j can be regarded as a neighbor of x_i with respect to the tangent space spanned by the columns of J_f(τ_i) if ‖τ_j − τ_i‖_2 is small and ‖ε(τ_i, τ_j)‖_2 ≪ ‖J_f(τ_i)(τ_j − τ_i)‖_2. These conditions, however, are difficult to verify in practice since we do not know J_f(τ_i). To get around this problem, consider an orthogonal basis matrix Q_i of the tangent space spanned by the columns of J_f(τ_i); it can be approximately computed from the SVD of X_i − x̄_i e^T, where x̄_i is the mean of the neighbors x_{i_j} = f(τ_{i_j}), j = 1, ..., k_i. Note that

    x̄_i = (1/k_i) Σ_{j=1}^{k_i} x_{i_j} = x_i + J_f(τ_i) (τ̄_i − τ_i) + ε̄_i,

where ε̄_i is the mean of ε(τ_i, τ_{i_1}), ..., ε(τ_i, τ_{i_{k_i}}). Eliminating x_i in (2.3) using the representation above yields

    x_{i_j} = x̄_i + J_f(τ_i) (τ_{i_j} − τ̄_i) + ε_j^{(i)}, with ε_j^{(i)} = ε(τ_i, τ_{i_j}) − ε̄_i.
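As a quick sanity check (our own, not part of the paper), the stated expression for c_g can be compared against the generic plane-curve curvature |y''| / (1 + y'^2)^{3/2} for the graph y = e^{−t^2}:

```python
import numpy as np

def c_g(t):
    """Curvature of the Example 2 curve x(t) = [t, exp(-t^2)], as stated."""
    t = np.asarray(t, dtype=float)
    return np.abs(4 * t**2 - 2) * np.exp(-t**2) / (1 + 4 * t**2 * np.exp(-2 * t**2)) ** 1.5

def c_graph(t):
    """Generic curvature |y''| / (1 + y'^2)^{3/2} of a graph, here y = exp(-t^2)."""
    t = np.asarray(t, dtype=float)
    y1 = -2 * t * np.exp(-t**2)            # y'(t)
    y2 = (4 * t**2 - 2) * np.exp(-t**2)    # y''(t)
    return np.abs(y2) / (1 + y1**2) ** 1.5
```

The two expressions agree term by term, and the extremes are visible directly: c_g(0) = 2 and c_g vanishes where 4t^2 = 2.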
Letting θ_j^{(i)} = Q_i^T (x_{i_j} − x̄_i), we have x_{i_j} ≈ x̄_i + Q_i θ_j^{(i)} + ε_j^{(i)}. Thus x_{i_j} can be selected as a neighbor of x_i if the orthogonal projection θ_j^{(i)} is small and

    ‖ε_j^{(i)}‖_2 = ‖x_{i_j} − x̄_i − Q_i θ_j^{(i)}‖_2 ≪ ‖Q_i θ_j^{(i)}‖_2 = ‖θ_j^{(i)}‖_2.        (2.4)

Assuming all the x_{i_j} satisfy the inequality above, we should approximately have

    ‖(I − Q_i Q_i^T)(X_i − x̄_i e^T)‖_F ≤ η ‖Q_i^T (X_i − x̄_i e^T)‖_F.        (2.5)

We will use (2.5) as a criterion for adaptive neighbor selection, starting with a K-NN at each sample point x_i for a large enough initial K and deleting points one by one until (2.5) holds. This process terminates either when (2.5) holds or when the neighborhood size reaches d + k_0 for some small constant k_0 while (2.5) is still not true. In the latter case, we reselect a k-NN that minimizes the ratio

    ‖(I − Q_i Q_i^T)(X_i − x̄_i e^T)‖_F / ‖Q_i^T (X_i − x̄_i e^T)‖_F

as the neighborhood set, as is detailed below.

NEIGHBORHOOD CONTRACTION.
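Criterion (2.5) needs only the singular values of the centered neighborhood: with Q_i the top-d left singular vectors, ‖Q_i^T(X_i − x̄_i e^T)‖_F and ‖(I − Q_i Q_i^T)(X_i − x̄_i e^T)‖_F are simply the norms of the leading and trailing singular values. A small self-contained check of (2.5) might look like the following sketch (the function name and the default tolerance η = 0.1 are our assumptions; the paper leaves η as a user parameter):

```python
import numpy as np

def tangent_fit_ok(Xi, d, eta=0.1):
    """Check criterion (2.5) for one neighborhood.

    Xi : (m, k) matrix of neighbors, one per column.
    d  : manifold dimension.  eta : tolerance (assumed default).
    The part of the centered neighborhood outside the d-dim tangent space
    must be at most eta times the part inside it.
    """
    Xc = Xi - Xi.mean(axis=1, keepdims=True)
    s = np.linalg.svd(Xc, compute_uv=False)
    inside = np.linalg.norm(s[:d])    # = ||Q^T (Xi - mean e^T)||_F
    outside = np.linalg.norm(s[d:])   # = ||(I - Q Q^T)(Xi - mean e^T)||_F
    return outside <= eta * inside
```

A tight neighborhood on a curved arc passes the test, while a wide neighborhood on the same arc fails it, which is exactly what drives the contraction step below.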
C0. Determine an initial K and the K-NN neighborhood X_i^{(K)} = [x_{i_1}, ..., x_{i_K}] of x_i, ordered in non-decreasing distance to x_i: ‖x_{i_1} − x_i‖ ≤ ‖x_{i_2} − x_i‖ ≤ ... ≤ ‖x_{i_K} − x_i‖. Set k = K.

C1. Let x̄_i^{(k)} be the column mean of X_i^{(k)}. Compute the orthogonal basis matrix Q_i^{(k)} of the d largest left singular vectors of X_i^{(k)} − x̄_i^{(k)} e^T, and set Θ_i^{(k)} = (Q_i^{(k)})^T (X_i^{(k)} − x̄_i^{(k)} e^T).

C2. If ‖X_i^{(k)} − x̄_i^{(k)} e^T − Q_i^{(k)} Θ_i^{(k)}‖_F < η ‖Θ_i^{(k)}‖_F, then set X_i = X_i^{(k)}, Θ_i = Θ_i^{(k)}, and terminate.

C3. If k > d + k_0, delete the last column of X_i^{(k)} to obtain X_i^{(k−1)}, set k := k − 1, and go to step C1; otherwise, go to step C4.

C4. Let

    k* = arg min_{d + k_0 ≤ j ≤ K} ‖X_i^{(j)} − x̄_i^{(j)} e^T − Q_i^{(j)} Θ_i^{(j)}‖_F / ‖Θ_i^{(j)}‖_F,

and set X_i = X_i^{(k*)}, Θ_i = Θ_i^{(k*)}.

Step C4 means that if no k-NN with k ≥ d + k_0 satisfies (2.5), then the contracted neighborhood X_i is chosen as the one that minimizes the ratio ‖X_i − x̄_i e^T − Q_i Θ_i‖_F / ‖Θ_i‖_F. Once the contraction step is done, we can still add back some of the unselected x_{i_j} to increase the overlap of nearby neighborhoods while keeping (2.5) intact. In fact, we can add x_{i_j} if ‖x_{i_j} − x̄_i − Q_i θ_j^{(i)}‖ ≤ η ‖θ_j^{(i)}‖, as is justified by the following result (we refer to [8] for the proof).

Theorem 2.1. Let X_i = [x_{i_1}, ..., x_{i_k}] satisfy (2.5). Furthermore, assume that

    ‖x_{i_j} − x̄_i − Q_i θ_j^{(i)}‖ ≤ η ‖θ_j^{(i)}‖, j = k+1, ..., k+p,        (2.6)

where θ_j^{(i)} = Q_i^T (x_{i_j} − x̄_i). Denote by x̃_i the column mean of the expanded matrix X̃_i = [X_i, x_{i_{k+1}}, ..., x_{i_{k+p}}]. Then, for the left singular vector matrix Q̃_i corresponding to the d largest singular values of X̃_i − x̃_i e^T,

    ‖(I − Q̃_i Q̃_i^T)(X̃_i − x̃_i e^T)‖_F ≤ η ( ‖Q̃_i^T (X̃_i − x̃_i e^T)‖_F + √(p/(k+p)) ‖Σ_{j=k+1}^{k+p} θ_j^{(i)}‖_2 ).

The result above shows that if the mean of the projections θ_j^{(i)} of the expanding neighbors is small and/or the number of the expanding points is relatively small, then approximately

    ‖(I − Q̃_i Q̃_i^T)(X̃_i − x̃_i e^T)‖_F ≤ η ‖Q̃_i^T (X̃_i − x̃_i e^T)‖_F.

NEIGHBORHOOD EXPANSION.

E0. Set k to be the number of columns of X_i obtained by the neighborhood contraction step. For j = k+1, ..., K, compute θ_j^{(i)} = Q_i^T (x_{i_j} − x̄_i).

E1. Denote by J the index subset of j's, k < j ≤ K, such that ‖(I − Q_i Q_i^T)(x_{i_j} − x̄_i)‖_2 ≤ η ‖θ_j^{(i)}‖_2. Expand X_i by adding the x_{i_j}, j in J.

Example 3.
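Steps C0-C4 and E0-E1 can be prototyped directly. The sketch below is our reading of the procedure, not the authors' code; the function name and the defaults K = 10, k_0 = 2, η = 0.1 are assumptions, and the in/out-of-tangent-space norms in criterion (2.5) are read off the singular values of the centered neighborhood.

```python
import numpy as np

def adaptive_neighborhood(X, i, d, K=10, k0=2, eta=0.1):
    """Neighborhood contraction (C0-C4) then expansion (E0-E1): a sketch.

    X : (m, N) data, one sample per column.  i : center index.
    Returns the selected neighbor indices of x_i.
    """
    x = X[:, [i]]
    order = np.argsort(((X - x) ** 2).sum(axis=0))[:K]   # C0: K-NN, nearest first

    def fit_norms(idx):
        # Frobenius norms of the centered neighborhood outside / inside
        # the d-dim tangent space, via singular values.
        Xc = X[:, idx] - X[:, idx].mean(axis=1, keepdims=True)
        s = np.linalg.svd(Xc, compute_uv=False)
        return np.linalg.norm(s[d:]), np.linalg.norm(s[:d])

    # Contraction: drop the farthest point until (2.5) holds (C1-C3);
    # if it never holds, keep the minimum-ratio neighborhood (C4).
    chosen, best, best_ratio = None, None, np.inf
    for k in range(K, d + k0 - 1, -1):
        out, inn = fit_norms(order[:k])
        if out < eta * inn:
            chosen = order[:k]
            break
        ratio = out / max(inn, 1e-15)
        if ratio < best_ratio:
            best_ratio, best = ratio, order[:k]
    if chosen is None:
        chosen = best
    sel = list(chosen)

    # Expansion (E0-E1): re-admit dropped points whose residual off the
    # tangent plane is at most eta times their in-plane projection.
    Xi = X[:, sel]
    xb = Xi.mean(axis=1, keepdims=True)
    Q = np.linalg.svd(Xi - xb, full_matrices=False)[0][:, :d]
    for j in order[len(sel):]:
        v = X[:, [j]] - xb
        theta = Q.T @ v
        if np.linalg.norm(v - Q @ theta) <= eta * np.linalg.norm(theta):
            sel.append(j)
    return np.array(sel)
```

On flat data the criterion holds immediately and the full K-neighborhood is kept; on a sparsely sampled circle the neighborhood contracts sharply, matching the behavior described for Examples 1-3.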
We construct the data points as x_i = [sin(t_i), cos(t_i), 0.2 t_i]^T, i = 1, ..., N, with t_i in [0, 4π] uniformly distributed; the curve is plotted in the top-left panel of Figure 2.
Figure 2: Plots of the data set (top left); the coordinates τ_i computed by LTSA vs. the centered arc-length coordinates (a-c); the coordinates computed by LTSA with neighborhood contraction vs. the centered arc-length coordinates (e-g); and the coordinates computed by LTSA with both neighborhood contraction and expansion vs. the centered arc-length coordinates (bottom left).

LTSA with constant k-NN fails for any k: a small k leads to a lack of the necessary overlap among the neighborhoods, while for a large k the computed tangent space cannot represent the local geometry well. In (a-c) of Figure 2 we plot the coordinates computed by LTSA vs. the arc length of the curve. Contracting the neighborhoods without expansion also gives poor results, because the resulting neighborhoods are too small; see (e-g) of Figure 2. Panel (d) of Figure 2 shows the excellent result computed by LTSA with both neighborhood contraction and expansion. We mention that our adaptive strategies also work well for noisy data sets; we refer the readers to [8] for some examples.

3 Alignment incorporating variations of manifold curvature

Let X_i = [x_{i_1}, ..., x_{i_{k_i}}] consist of the neighbors determined by the contraction and expansion steps of the previous section. It can be shown that the size of the error term ‖E_i‖_2 in (1.1) depends on the curvature of the manifold at the sample point x_i [8]. To make the minimization in (1.1) more uniform, we need to factor out the effect of the variations in curvature. To this end, we pose the following minimization problem:

    min_{T, {L_i}} Σ_i (1/k_i) ‖(T_i (I_{k_i} − (1/k_i) e e^T) − L_i Θ_i) D_i‖_2^2,        (3.7)

where D_i = diag(φ(θ_1^{(i)}), ..., φ(θ_{k_i}^{(i)})), and φ(θ_j^{(i)}) is proportional to the curvature of the manifold at the parameter value θ_j^{(i)}; its computation will be discussed below. For fixed T, the optimal L_i is given by L_i = T_i (I_{k_i} − (1/k_i) e e^T) Θ_i^+ = T_i Θ_i^+.
Substituting it into (3.7), we obtain the reduced minimization problem

    min_T Σ_i (1/k_i) ‖T_i (I_{k_i} − (1/k_i) e e^T − Θ_i^+ Θ_i) D_i‖_2^2.

Imposing the normalization condition T T^T = I_d, a solution to the minimization problem above is given by the d eigenvectors corresponding to the second to (d+1)-st smallest
eigenvalues of the following matrix:

    B ≡ (SW) diag(D_1^2/k_1, ..., D_N^2/k_N) (SW)^T,

where SW = [S_1 W_1, ..., S_N W_N], the S_i are the 0-1 selection matrices with T S_i = T_i, and W_i = (I_{k_i} − (1/k_i) e e^T)(I_{k_i} − Θ_i^+ Θ_i). A second-order analysis of the error term in (1.1) shows that we can set

    φ(θ_j^{(i)}) = γ + c_f(τ_i) ‖θ_j^{(i)}‖_2^2

with a small positive constant γ to ensure φ(θ_j^{(i)}) > 0, where c_f(τ_i) represents the mean of the curvatures c_f(τ_i, τ_{i_j}) over all neighbors of x_i. Let Q_i denote the orthonormal matrix of the d largest left singular vectors of X_i (I − (1/k_i) e e^T). We can approximately compute c_f(τ_i) as

    c_f(τ_i) ≈ (1/k_i) Σ_{l=2}^{k_i} arccos(σ_min(Q_i^T Q_{i_l})) / ‖θ_l^{(i)}‖_2,

where σ_min(·) denotes the smallest singular value of a matrix. The diagonal weights φ(θ_j^{(i)}) can then be computed as

    φ(θ_j^{(i)}) = γ + ‖θ_j^{(i)}‖_2^2 · (1/k_i) Σ_{l=2}^{k_i} arccos(σ_min(Q_i^T Q_{i_l})) / ‖θ_l^{(i)}‖_2.

With the above preparation, we are now ready to present the adaptive LTSA algorithm. Given a data set X = [x_1, ..., x_N], the approach consists of the following steps:

Step 1. Determine the neighborhood X_i = [x_{i_1}, ..., x_{i_{k_i}}] for each x_i, i = 1, ..., N, using the neighborhood contraction/expansion steps of Section 2.

Step 2. Compute the truncated SVD, say Q_i Σ_i V_i^T, of X_i (I − (1/k_i) e e^T) with d columns in both Q_i and V_i, and the projections θ_l^{(i)} = Q_i^T (x_{i_l} − x̄_i) with the mean x̄_i of the neighbors; denote Θ_i = [θ_1^{(i)}, ..., θ_{k_i}^{(i)}].

Step 3. Estimate the curvatures: for each i = 1, ..., N,

    c_i = (1/k_i) Σ_{l=2}^{k_i} arccos(σ_min(Q_i^T Q_{i_l})) / ‖θ_l^{(i)}‖_2.

Step 4. Construct the alignment matrix. For i = 1, ..., N, set

    W_i = I_{k_i} − [e/√k_i, V_i] [e/√k_i, V_i]^T,
    D_i = γ I + diag(c_i ‖θ_1^{(i)}‖_2^2, ..., c_i ‖θ_{k_i}^{(i)}‖_2^2),

where γ is a small constant (usually we set γ = 10^{−6}). Set the initial B = 0 and update B iteratively:

    B(I_i, I_i) := B(I_i, I_i) + W_i D_i D_i W_i^T / k_i, i = 1, ..., N.

Step 5. Align the global coordinates. Compute the d+1 smallest eigenvectors of B, pick the eigenvector matrix [u_2, ..., u_{d+1}] corresponding to the 2nd to (d+1)-st smallest eigenvalues, and set T = [u_2, ..., u_{d+1}]^T.

4 Experimental Results

In this section, we present several numerical examples to illustrate the performance of the adaptive LTSA algorithm. The test data sets include curves in 2D/3D Euclidean spaces.
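The curvature estimate of Step 3 can be exercised on data whose curvature is known. The sketch below is our own prototype (the function name and the neighbor-list format are assumptions): it builds the tangent bases Q_i by SVD and averages the principal angles arccos(σ_min(Q_i^T Q_{i_l})) scaled by 1/‖θ_l^{(i)}‖.

```python
import numpy as np

def curvature_estimates(X, nbrs, d):
    """Step 3 curvature estimates c_i (a prototype).

    X : (m, N) data.  nbrs[i] : neighbor indices of x_i, nbrs[i][0] == i.
    """
    N = X.shape[1]
    Q, means = [], []
    for i in range(N):
        Xi = X[:, nbrs[i]]
        xb = Xi.mean(axis=1, keepdims=True)
        means.append(xb)
        # tangent basis: d largest left singular vectors of the centered block
        Q.append(np.linalg.svd(Xi - xb, full_matrices=False)[0][:, :d])
    c = np.zeros(N)
    for i in range(N):
        acc = 0.0
        for l in nbrs[i][1:]:
            theta = Q[i].T @ (X[:, [l]] - means[i])
            # principal angle between the tangent spaces at x_i and x_l
            smin = np.linalg.svd(Q[i].T @ Q[l], compute_uv=False)[-1]
            acc += np.arccos(np.clip(smin, -1.0, 1.0)) / max(np.linalg.norm(theta), 1e-15)
        c[i] = acc / len(nbrs[i])
    return c
```

On a straight line the estimates vanish, and on a unit circle they are roughly constant, which is the qualitative behavior the weighting D_i relies on (the estimate is a relative weight, not the exact geometric curvature).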
Figure 3: The computed coordinates τ_i by adaptive LTSA, taking into account curvature and variable neighborhood sizes, for several starting values of k.

First we apply the adaptive LTSA to the data sets shown in Examples 1 and 2. Adaptive LTSA with different starting k's works very well; see Figure 3. It shows that for these two data sets, the adaptive LTSA is not sensitive to the choice of the starting k or to the variations in sampling density and manifold curvature.

Next, we consider the swiss-roll surface defined by f(s, t) = [s cos(s), t, s sin(s)]^T. It is easy to see that J_f(s, t)^T J_f(s, t) = diag(1 + s^2, 1). Denoting by s = s(r) the inverse of the transformation r = r(s) defined by

    r(s) = ∫_0^s √(1 + α^2) dα = (1/2)(s √(1 + s^2) + arcsinh(s)),

the swiss-roll surface can be parameterized as

    f̂(r, t) = [s(r) cos(s(r)), t, s(r) sin(s(r))]^T,

and f̂ is isometric with respect to (r, t). In the left figure of Figure 4, we show that there is a distortion between the coordinates computed by LTSA with the best-fit neighborhood size (bottom left) and the generating coordinates (r, t)^T (top right). In the right panel of the bottom row of the left figure of Figure 4, we plot the coordinates computed by the adaptive LTSA with initial neighborhood size k = 30. (In fact, the adaptive LTSA is insensitive to k, and we obtain similar results with a larger or smaller initial k.) We can see that the coordinates computed by the adaptive LTSA recover the generating coordinates well, without much distortion.

Finally, we applied both LTSA and the adaptive LTSA to a 2D manifold with 3 peaks embedded in a 100-dimensional space. The data points are generated as follows. First we generate N = 2000 3-D points, y_i = (t_i, s_i, h(t_i, s_i))^T, where t_i and s_i are randomly distributed in the interval [−1.5, 1.5] and h(t, s) is defined by

    h(t, s) = e^{−2t^2 − 2s^2} − e^{−t^2 − (s+1)^2} − e^{−(1+t)^2 − s^2}.

Then we embed the 3-D points into a 100-dimensional space by x_i^Q = Q y_i and x_i^H = H y_i, where Q in R^{100×3} is a random orthonormal matrix, resulting in an orthogonal transformation, and H in R^{100×3} is a matrix with its singular values uniformly distributed in (0, 1), resulting in an affine transformation.
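The closed-form arc length above is elementary to verify numerically. The helpers below are our own sketch, comparing the closed form with a trapezoid-rule integral of √(1 + α^2):

```python
import numpy as np

def r_closed(s):
    """Swiss-roll arc length r(s) = 0.5 * (s * sqrt(1 + s^2) + arcsinh(s))."""
    return 0.5 * (s * np.sqrt(1.0 + s * s) + np.arcsinh(s))

def r_trapezoid(s, n=20000):
    """Trapezoid-rule approximation of the integral of sqrt(1 + a^2) over [0, s]."""
    a = np.linspace(0.0, s, n)
    f = np.sqrt(1.0 + a * a)
    h = a[1] - a[0]
    return h * (f.sum() - 0.5 * (f[0] + f[-1]))
```

Agreement of the two confirms that (r, t) is an isometric parameterization, i.e., the coordinates an ideal embedding should recover up to a rigid motion.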
In the top row of the right figure of Figure 4, we plot the
Figure 4: Left figure: the 3D swiss roll and the generating coordinates (top row); the 2D coordinates computed by LTSA with the best neighborhood size k = 15 (bottom left); and the 2D coordinates computed by adaptive LTSA (bottom right). Right figure: the coordinates computed by LTSA for the orthogonally embedded 100-D data set {x_i^Q} (a) and the affinely embedded 100-D data set {x_i^H} (b), and the coordinates computed by the adaptive LTSA for {x_i^Q} (c) and {x_i^H} (d).

coordinates computed by LTSA for x^Q (shown in (a)) and x^H (shown in (b)) with the best-fit neighborhood size k = 15. We can see that the deformations (stretching and compression) are quite prominent. In the bottom row of the right figure of Figure 4, we plot the coordinates computed by the adaptive LTSA for x^Q (shown in (c)) and x^H (shown in (d)) with initial neighborhood size k = 15. It is clear that the adaptive LTSA gives a much better result.

References

[1] M. Brand. Charting a manifold. Advances in Neural Information Processing Systems 15, MIT Press, 2003.

[2] D. Donoho and C. Grimes. Hessian eigenmaps: new tools for nonlinear dimensionality reduction. Proceedings of the National Academy of Sciences, 100:5591-5596, 2003.

[3] S. Roweis and L. Saul. Nonlinear dimensionality reduction by locally linear embedding. Science, 290:2323-2326, 2000.

[4] L. Saul and S. Roweis. Think globally, fit locally: unsupervised learning of nonlinear manifolds. Journal of Machine Learning Research, 4:119-155, 2003.

[5] Y. W. Teh and S. Roweis. Automatic alignment of local representations. Advances in Neural Information Processing Systems 15, MIT Press, 2003.

[6] J. Tenenbaum, V. de Silva and J. Langford. A global geometric framework for nonlinear dimensionality reduction. Science, 290:2319-2323, 2000.

[7] Z. Zhang and H. Zha. Principal manifolds and nonlinear dimensionality reduction via tangent space alignment. SIAM J. Scientific Computing, 26:313-338, 2004.

[8] J. Wang, Z. Zhang and H. Zha. Adaptive manifold learning. Technical Report CSE-04-012, Dept. CSE, Pennsylvania State University, 2004.
More informationMeshless Surfaces. presented by Niloy J. Mitra. An Nguyen
Meshless Surfaces presented by Nloy J. Mtra An Nguyen Outlne Mesh-Independent Surface Interpolaton D. Levn Outlne Mesh-Independent Surface Interpolaton D. Levn Pont Set Surfaces M. Alexa, J. Behr, D. Cohen-Or,
More informationGeneral viscosity iterative method for a sequence of quasi-nonexpansive mappings
Avalable onlne at www.tjnsa.com J. Nonlnear Sc. Appl. 9 (2016), 5672 5682 Research Artcle General vscosty teratve method for a sequence of quas-nonexpansve mappngs Cuje Zhang, Ynan Wang College of Scence,
More informationLossy Compression. Compromise accuracy of reconstruction for increased compression.
Lossy Compresson Compromse accuracy of reconstructon for ncreased compresson. The reconstructon s usually vsbly ndstngushable from the orgnal mage. Typcally, one can get up to 0:1 compresson wth almost
More informationGlobal Sensitivity. Tuesday 20 th February, 2018
Global Senstvty Tuesday 2 th February, 28 ) Local Senstvty Most senstvty analyses [] are based on local estmates of senstvty, typcally by expandng the response n a Taylor seres about some specfc values
More informationLinear Feature Engineering 11
Lnear Feature Engneerng 11 2 Least-Squares 2.1 Smple least-squares Consder the followng dataset. We have a bunch of nputs x and correspondng outputs y. The partcular values n ths dataset are x y 0.23 0.19
More information2.3 Nilpotent endomorphisms
s a block dagonal matrx, wth A Mat dm U (C) In fact, we can assume that B = B 1 B k, wth B an ordered bass of U, and that A = [f U ] B, where f U : U U s the restrcton of f to U 40 23 Nlpotent endomorphsms
More informationChapter 11: Simple Linear Regression and Correlation
Chapter 11: Smple Lnear Regresson and Correlaton 11-1 Emprcal Models 11-2 Smple Lnear Regresson 11-3 Propertes of the Least Squares Estmators 11-4 Hypothess Test n Smple Lnear Regresson 11-4.1 Use of t-tests
More informationSupport Vector Machines CS434
Support Vector Machnes CS434 Lnear Separators Many lnear separators exst that perfectly classfy all tranng examples Whch of the lnear separators s the best? Intuton of Margn Consder ponts A, B, and C We
More informationSupplement: Proofs and Technical Details for The Solution Path of the Generalized Lasso
Supplement: Proofs and Techncal Detals for The Soluton Path of the Generalzed Lasso Ryan J. Tbshran Jonathan Taylor In ths document we gve supplementary detals to the paper The Soluton Path of the Generalzed
More informationCOMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS
Avalable onlne at http://sck.org J. Math. Comput. Sc. 3 (3), No., 6-3 ISSN: 97-537 COMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS
More informationAnnexes. EC.1. Cycle-base move illustration. EC.2. Problem Instances
ec Annexes Ths Annex frst llustrates a cycle-based move n the dynamc-block generaton tabu search. It then dsplays the characterstcs of the nstance sets, followed by detaled results of the parametercalbraton
More informationAn efficient algorithm for multivariate Maclaurin Newton transformation
Annales UMCS Informatca AI VIII, 2 2008) 5 14 DOI: 10.2478/v10065-008-0020-6 An effcent algorthm for multvarate Maclaurn Newton transformaton Joanna Kapusta Insttute of Mathematcs and Computer Scence,
More informationInner Product. Euclidean Space. Orthonormal Basis. Orthogonal
Inner Product Defnton 1 () A Eucldean space s a fnte-dmensonal vector space over the reals R, wth an nner product,. Defnton 2 (Inner Product) An nner product, on a real vector space X s a symmetrc, blnear,
More informationProf. Dr. I. Nasser Phys 630, T Aug-15 One_dimensional_Ising_Model
EXACT OE-DIMESIOAL ISIG MODEL The one-dmensonal Isng model conssts of a chan of spns, each spn nteractng only wth ts two nearest neghbors. The smple Isng problem n one dmenson can be solved drectly n several
More informationThe equation of motion of a dynamical system is given by a set of differential equations. That is (1)
Dynamcal Systems Many engneerng and natural systems are dynamcal systems. For example a pendulum s a dynamcal system. State l The state of the dynamcal system specfes t condtons. For a pendulum n the absence
More informationRegularized Discriminant Analysis for Face Recognition
1 Regularzed Dscrmnant Analyss for Face Recognton Itz Pma, Mayer Aladem Department of Electrcal and Computer Engneerng, Ben-Guron Unversty of the Negev P.O.Box 653, Beer-Sheva, 845, Israel. Abstract Ths
More informationLecture 21: Numerical methods for pricing American type derivatives
Lecture 21: Numercal methods for prcng Amercan type dervatves Xaoguang Wang STAT 598W Aprl 10th, 2014 (STAT 598W) Lecture 21 1 / 26 Outlne 1 Fnte Dfference Method Explct Method Penalty Method (STAT 598W)
More informationLecture 12: Classification
Lecture : Classfcaton g Dscrmnant functons g The optmal Bayes classfer g Quadratc classfers g Eucldean and Mahalanobs metrcs g K Nearest Neghbor Classfers Intellgent Sensor Systems Rcardo Guterrez-Osuna
More informationU.C. Berkeley CS294: Spectral Methods and Expanders Handout 8 Luca Trevisan February 17, 2016
U.C. Berkeley CS94: Spectral Methods and Expanders Handout 8 Luca Trevsan February 7, 06 Lecture 8: Spectral Algorthms Wrap-up In whch we talk about even more generalzatons of Cheeger s nequaltes, and
More informationEEE 241: Linear Systems
EEE : Lnear Systems Summary #: Backpropagaton BACKPROPAGATION The perceptron rule as well as the Wdrow Hoff learnng were desgned to tran sngle layer networks. They suffer from the same dsadvantage: they
More informationA Novel Biometric Feature Extraction Algorithm using Two Dimensional Fisherface in 2DPCA subspace for Face Recognition
A Novel ometrc Feature Extracton Algorthm usng wo Dmensonal Fsherface n 2DPA subspace for Face Recognton R. M. MUELO, W.L. WOO, and S.S. DLAY School of Electrcal, Electronc and omputer Engneerng Unversty
More informationMath 217 Fall 2013 Homework 2 Solutions
Math 17 Fall 013 Homework Solutons Due Thursday Sept. 6, 013 5pm Ths homework conssts of 6 problems of 5 ponts each. The total s 30. You need to fully justfy your answer prove that your functon ndeed has
More informationMore metrics on cartesian products
More metrcs on cartesan products If (X, d ) are metrc spaces for 1 n, then n Secton II4 of the lecture notes we defned three metrcs on X whose underlyng topologes are the product topology The purpose of
More informationStructure from Motion. Forsyth&Ponce: Chap. 12 and 13 Szeliski: Chap. 7
Structure from Moton Forsyth&once: Chap. 2 and 3 Szelsk: Chap. 7 Introducton to Structure from Moton Forsyth&once: Chap. 2 Szelsk: Chap. 7 Structure from Moton Intro he Reconstructon roblem p 3?? p p 2
More informationThe Minimum Universal Cost Flow in an Infeasible Flow Network
Journal of Scences, Islamc Republc of Iran 17(2): 175-180 (2006) Unversty of Tehran, ISSN 1016-1104 http://jscencesutacr The Mnmum Unversal Cost Flow n an Infeasble Flow Network H Saleh Fathabad * M Bagheran
More informationFeb 14: Spatial analysis of data fields
Feb 4: Spatal analyss of data felds Mappng rregularly sampled data onto a regular grd Many analyss technques for geophyscal data requre the data be located at regular ntervals n space and/or tme. hs s
More informationSpectral Graph Theory and its Applications September 16, Lecture 5
Spectral Graph Theory and ts Applcatons September 16, 2004 Lecturer: Danel A. Spelman Lecture 5 5.1 Introducton In ths lecture, we wll prove the followng theorem: Theorem 5.1.1. Let G be a planar graph
More informationModule 9. Lecture 6. Duality in Assignment Problems
Module 9 1 Lecture 6 Dualty n Assgnment Problems In ths lecture we attempt to answer few other mportant questons posed n earler lecture for (AP) and see how some of them can be explaned through the concept
More informationChapter 13: Multiple Regression
Chapter 13: Multple Regresson 13.1 Developng the multple-regresson Model The general model can be descrbed as: It smplfes for two ndependent varables: The sample ft parameter b 0, b 1, and b are used to
More informationKernel Methods and SVMs Extension
Kernel Methods and SVMs Extenson The purpose of ths document s to revew materal covered n Machne Learnng 1 Supervsed Learnng regardng support vector machnes (SVMs). Ths document also provdes a general
More informationRandić Energy and Randić Estrada Index of a Graph
EUROPEAN JOURNAL OF PURE AND APPLIED MATHEMATICS Vol. 5, No., 202, 88-96 ISSN 307-5543 www.ejpam.com SPECIAL ISSUE FOR THE INTERNATIONAL CONFERENCE ON APPLIED ANALYSIS AND ALGEBRA 29 JUNE -02JULY 20, ISTANBUL
More informationSalmon: Lectures on partial differential equations. Consider the general linear, second-order PDE in the form. ,x 2
Salmon: Lectures on partal dfferental equatons 5. Classfcaton of second-order equatons There are general methods for classfyng hgher-order partal dfferental equatons. One s very general (applyng even to
More informationSupplementary material: Margin based PU Learning. Matrix Concentration Inequalities
Supplementary materal: Margn based PU Learnng We gve the complete proofs of Theorem and n Secton We frst ntroduce the well-known concentraton nequalty, so the covarance estmator can be bounded Then we
More informationNorm Bounds for a Transformed Activity Level. Vector in Sraffian Systems: A Dual Exercise
ppled Mathematcal Scences, Vol. 4, 200, no. 60, 2955-296 Norm Bounds for a ransformed ctvty Level Vector n Sraffan Systems: Dual Exercse Nkolaos Rodousaks Department of Publc dmnstraton, Panteon Unversty
More informationLeast-Squares Fitting of a Hyperplane
Least-Squares Fttng of a Hyperplane Robert K. Monot October 20, 2002 Abstract A method s developed for fttng a hyperplane to a set of data by least-squares, allowng for ndependent uncertantes n all coordnates
More informationWorkshop: Approximating energies and wave functions Quantum aspects of physical chemistry
Workshop: Approxmatng energes and wave functons Quantum aspects of physcal chemstry http://quantum.bu.edu/pltl/6/6.pdf Last updated Thursday, November 7, 25 7:9:5-5: Copyrght 25 Dan Dll (dan@bu.edu) Department
More informationA New Evolutionary Computation Based Approach for Learning Bayesian Network
Avalable onlne at www.scencedrect.com Proceda Engneerng 15 (2011) 4026 4030 Advanced n Control Engneerng and Informaton Scence A New Evolutonary Computaton Based Approach for Learnng Bayesan Network Yungang
More informationOnline Classification: Perceptron and Winnow
E0 370 Statstcal Learnng Theory Lecture 18 Nov 8, 011 Onlne Classfcaton: Perceptron and Wnnow Lecturer: Shvan Agarwal Scrbe: Shvan Agarwal 1 Introducton In ths lecture we wll start to study the onlne learnng
More informationElectronic Quantum Monte Carlo Calculations of Energies and Atomic Forces for Diatomic and Polyatomic Molecules
RESERVE HIS SPACE Electronc Quantum Monte Carlo Calculatons of Energes and Atomc Forces for Datomc and Polyatomc Molecules Myung Won Lee 1, Massmo Mella 2, and Andrew M. Rappe 1,* 1 he Maknen heoretcal
More information