Similarities, Distances and Manifold Learning


1 Similarities, Distances and Manifold Learning. Prof. Richard C. Wilson, Dept. of Computer Science, University of York

2 Part I: Euclidean Space (Position, Similarity and Distance; Manifold Learning in Euclidean space; Some famous techniques). Part II: Non-Euclidean Manifolds (Assessing Data; Nature and Properties of Manifolds; Data Manifolds; Learning some special types of manifolds). Part III: Advanced Techniques (Methods for intrinsically curved manifolds). Thanks to Edwin Hancock, Eliza Xu, Bob Duin for contributions, and support from the EU SIMBAD project

3 Part I: Euclidean Space

4 Position. The main arena for pattern recognition and machine learning problems is vector space: a set of n well-defined features collected into a vector $\mathbf{x} \in \mathbb{R}^n$. Also defined are addition of vectors and multiplication by a scalar. Feature vector = position.

5 Similarity. To make meaningful progress, we need a notion of similarity: the inner product $\langle x, y \rangle$, which can be considered to be a similarity between x and y. In Euclidean space, position, similarity and distance are all neatly connected: position $x$; similarity $\langle x, y \rangle$; distance (squared) $d^2(x, y) = \langle x - y,\, x - y \rangle$.

6 The Golden Trio. In Euclidean space, the concepts of position, similarity and distance are elegantly connected: Position X, Distance D, Similarity K.

7 Point position matrix. In a normal manifold learning problem, we have a set of samples $\{x_1, x_2, \ldots, x_m\}$. These can be collected together in a matrix $\mathbf{X}$ whose rows are the points: $\mathbf{X} = (x_1^T;\, x_2^T;\, \ldots;\, x_m^T)$. I use this convention, but others may write them vertically.

8 Centring. A common and important operation is centring: moving the mean to the origin. Centred points behave better. $JX/m$ is the mean matrix, where $J$ is the all-ones matrix, so $X - JX/m$ is the centred point matrix. This can be done with the centring matrix $C = I - J/m$: $X_c = CX$. $C$ is symmetric ($C = C^T$).
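The centring operation above can be sketched in a few lines of NumPy; the data matrix here is hypothetical, chosen only to illustrate the operation.

```python
import numpy as np

# Hypothetical 4-point, 2-feature data matrix (rows are points, as in the slides)
X = np.array([[1.0, 2.0], [3.0, 0.0], [5.0, 4.0], [7.0, 2.0]])
m = len(X)

J = np.ones((m, m))          # all-ones matrix
C = np.eye(m) - J / m        # centring matrix C = I - J/m
Xc = C @ X                   # centred points: the mean is moved to the origin
```

Note that C is not only symmetric but also idempotent ($C^2 = C$): centring already-centred points changes nothing.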

9 Position-Similarity. The similarity matrix K is defined as $K_{ij} = \langle x_i, x_j \rangle$. From the definition of X, we simply get $K = XX^T$. The Gram matrix is the similarity matrix of the centred points: $K_c = CXX^TC^T = CKC$, i.e. a centring operation on K. $K_c$ is really a kernel matrix for the points (linear kernel).

10 Position-Similarity. To go from K to X, we need to consider the eigendecomposition of K: $K = U \Lambda U^T = XX^T$. As long as we can take the square root of $\Lambda$, we can find X as $X = U \Lambda^{1/2}$.

11 Kernel embedding. Kernel embedding finds a Euclidean manifold from object similarities: $K = U \Lambda U^T$, $X = U \Lambda^{1/2}$. It embeds a kernel matrix into a set of points in Euclidean space (the points are automatically centred). K must have no negative eigenvalues, i.e. it is a kernel matrix (Mercer condition).

12 Similarity-Distance. From similarity K to (squared) distance $D_s$: $D_{s,ij} = d^2(x_i, x_j) = \langle x_i - x_j,\, x_i - x_j \rangle = \langle x_i, x_i \rangle + \langle x_j, x_j \rangle - 2\langle x_i, x_j \rangle$, so $D_{s,ij} = K_{ii} + K_{jj} - 2K_{ij}$. We can easily determine $D_s$ from K.
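The identity $D_{s,ij} = K_{ii} + K_{jj} - 2K_{ij}$ is easy to check numerically; this minimal sketch uses random illustrative points, not data from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))          # 5 points in R^3 (rows)
K = X @ X.T                              # similarity matrix (linear kernel)

# D_s,ij = K_ii + K_jj - 2 K_ij
k = np.diag(K)
Ds = k[:, None] + k[None, :] - 2.0 * K

# Direct squared Euclidean distances for comparison
Ds_direct = ((X[:, None] - X[None]) ** 2).sum(-1)
```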

13 Similarity-Distance. What about finding K from $D_s$? Since $D_{s,ij} = K_{ii} + K_{jj} - 2K_{ij}$, we might imagine that $K = -\frac{1}{2} D_s$ is a suitable choice, but that is not centred; the relationship is actually $K = -\frac{1}{2} C D_s C$.

14 Classic MDS. Classic Multidimensional Scaling embeds a (squared) distance matrix into Euclidean space. Using what we have so far, the algorithm is simple: compute the kernel $K = -\frac{1}{2} C D_s C$; eigendecompose the kernel $K = U \Lambda U^T$; embed the kernel $X = U \Lambda^{1/2}$. This is MDS.
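The three steps above can be sketched directly; the test data here is hypothetical, and the round trip is exact because the input distances are genuinely Euclidean.

```python
import numpy as np

def classic_mds(D, dim=2):
    """Classic MDS: embed a squared-distance matrix D into R^dim."""
    m = D.shape[0]
    C = np.eye(m) - np.ones((m, m)) / m
    K = -0.5 * C @ D @ C                    # kernel from squared distances
    lam, U = np.linalg.eigh(K)              # eigenvalues in ascending order
    order = np.argsort(lam)[::-1][:dim]     # keep the largest eigenvalues
    lam_top = np.clip(lam[order], 0.0, None)
    return U[:, order] * np.sqrt(lam_top)   # X = U Lambda^{1/2}

# Round trip: the embedding reproduces the input distances (up to rotation)
rng = np.random.default_rng(1)
X = rng.standard_normal((6, 2))
D = ((X[:, None] - X[None]) ** 2).sum(-1)
Y = classic_mds(D, dim=2)
D2 = ((Y[:, None] - Y[None]) ** 2).sum(-1)
```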

15 The Golden Trio. MDS takes distances D to positions X; kernel embedding takes similarities K to positions X; the two sides are linked by $K = -\frac{1}{2} C D_s C$ and $D_{s,ij} = K_{ii} + K_{jj} - 2K_{ij}$.

16 Kernel methods. A kernel is a function k(i, j) which computes an inner product, $k(i, j) = \langle x_i, x_j \rangle$, but without needing to know the actual points (the space is implicit). Using a kernel function we can directly compute K without knowing X.

17 Kernel methods. The implied space may be very high dimensional, but a true kernel will always produce a positive semidefinite K, and the implied space will be Euclidean. Many (most?) PR algorithms can be kernelized, i.e. made to use K rather than X or D. The trick is to note that any interesting vector should lie in the space spanned by the examples we are given. Hence it can be written as a linear combination $u = \alpha_1 x_1 + \alpha_2 x_2 + \cdots + \alpha_m x_m = X^T \alpha$: look for $\alpha$ instead of $u$.

18 Kernel PCA. What about PCA? PCA solves the following problem: $u^* = \arg\max_{u} u^T \Sigma u = \arg\max_{u} \frac{1}{n} u^T X^T X u$, subject to $|u| = 1$. Let's kernelize: substituting $u = X^T \alpha$ gives $\frac{1}{n} u^T X^T X u = \frac{1}{n} \alpha^T X X^T X X^T \alpha = \frac{1}{n} \alpha^T K^2 \alpha$.

19 Kernel PCA. $K^2$ has the same eigenvectors as K, so the eigenvectors of kernel PCA are the same as the eigenvectors of K. The eigenvalues of PCA are related to the eigenvalues of K by $\lambda_i^{\mathrm{PCA}} = \frac{1}{n} \left(\lambda_i^{K}\right)^2$. Kernel PCA is a kernel embedding with an externally provided kernel matrix.

20 Kernel PCA. So kernel PCA gives the same solution as kernel embedding; only the eigenvalues are modified a bit. MDS uses the kernel and kernel embedding. Kernel embedding, MDS and PCA are essentially the same thing in Euclidean space: all give the same answer for a set of points in Euclidean space.

21 Some useful observations. Your similarity matrix is Euclidean iff it has no negative eigenvalues (i.e. it is a kernel matrix and PSD). By similar reasoning, your distance matrix is Euclidean iff the similarity matrix derived from it is PSD. If the feature space is small but the number of samples is large, the covariance matrix is small and it is better to do normal PCA (on the covariance matrix). If the feature space is large and the number of samples is small, the kernel matrix will be small and it is better to do kernel embedding.

22 Part II: Non-Euclidean Manifolds

23 Non-linear data. Much of the data in computer vision lies in a high-dimensional feature space but is constrained in some way. The space of all images of a face is a subspace of the space of all possible images; the subspace is highly non-linear but low dimensional (described by a few parameters).

24 Non-linear data. This cannot be exploited by linear subspace methods like PCA, which assume that the subspace is a Euclidean space as well. A classic example is the swiss roll data.

25 Flat Manifolds. There are fundamentally different types of data. For the swiss roll, the embedding of the data into the high-dimensional space is highly curved. This is called extrinsic curvature: the curvature of the manifold with respect to the embedding space. Now imagine that this manifold was a piece of paper: you could unroll the paper into a flat plane without distorting it. It has no intrinsic curvature; in fact it is homeomorphic to Euclidean space.

26 Curved manifold. This manifold is different: it must be stretched to map it onto a plane, so it has non-zero intrinsic curvature. A flatlander living on this manifold can tell that it is curved, for example by measuring the ratio of the radius to the circumference of a circle. In the first case, we might still hope to find a Euclidean embedding; we can never find a distortion-free Euclidean embedding of the second (in the sense that the distances will always have errors).

27 Intrinsically Euclidean Manifolds. We cannot use the previous methods on the second type of manifold, but there is still hope for the first. The manifold is embedded in Euclidean space, but Euclidean distance is not the correct way to measure distance: the Euclidean distance shortcuts the manifold. The geodesic distance calculates the shortest path along the manifold.

28 Geodesics. The geodesic generalizes the concept of distance to curved manifolds: the shortest path joining two points which lies completely within the manifold. If we can correctly compute the geodesic distances, and the manifold is intrinsically flat, we should get Euclidean distances which we can plug into our Euclidean geometry machine: geodesic distances give D, then similarity K, then position X.

29 ISOMAP. ISOMAP is exactly such an algorithm. Approximate geodesic distances are computed for the points from a nearest-neighbours graph. For neighbours, the Euclidean distance approximates the geodesic distance; for non-neighbours, the geodesic distance is approximated by the shortest distance in the graph. Once we have distances D, we can use MDS to find the Euclidean embedding.

30 ISOMAP: neighbourhood graph, then shortest path algorithm, then MDS. ISOMAP is distance-preserving: embedded distances should be close to geodesic distances.
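The three-stage pipeline can be sketched with SciPy's graph shortest-path routine; the k-nearest-neighbour rule and the demo points are illustrative choices, not taken from the slides.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path

def isomap(X, n_neighbors=2, dim=2):
    """Sketch of ISOMAP: kNN graph -> graph shortest paths -> classic MDS."""
    m = X.shape[0]
    d = np.sqrt(((X[:, None] - X[None]) ** 2).sum(-1))
    W = np.full((m, m), np.inf)              # inf = no edge (dense csgraph convention)
    for i in range(m):
        nbrs = np.argsort(d[i])[1:n_neighbors + 1]
        W[i, nbrs] = d[i, nbrs]              # Euclidean distance to neighbours
        W[nbrs, i] = d[i, nbrs]
    G = shortest_path(W, method="D", directed=False)   # approximate geodesics
    Ds = G ** 2                              # MDS works on squared distances
    C = np.eye(m) - np.ones((m, m)) / m
    K = -0.5 * C @ Ds @ C
    lam, U = np.linalg.eigh(K)
    order = np.argsort(lam)[::-1][:dim]
    return U[:, order] * np.sqrt(np.clip(lam[order], 0.0, None))

# Demo: four collinear points; graph geodesics equal the straight-line
# distances, so the 1-d embedding reproduces the original spacing.
Y = isomap(np.array([[0.0, 0], [1, 0], [2, 0], [3, 0]]), n_neighbors=2, dim=1)
DY = np.abs(Y[:, 0][:, None] - Y[:, 0][None, :])
```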

31 Laplacian Eigenmap. The Laplacian Eigenmap is another graph-based method of embedding non-linear manifolds into Euclidean space. As with ISOMAP, form a neighbourhood graph for the datapoints, then find the graph Laplacian as follows. The adjacency matrix A is $A_{ij} = e^{-d_{ij}^2/t}$ if i and j are connected, 0 otherwise. The degree matrix D is the diagonal matrix $D_{ii} = \sum_j A_{ij}$. The normalized graph Laplacian is $L = I - D^{-1/2} A D^{-1/2}$.

32 Laplacian Eigenmap. We find the Laplacian eigenmap embedding using the eigendecomposition of L: $L = U \Lambda U^T$. The embedded positions are $X = D^{-1/2} U$. It is similar to ISOMAP, but structure-preserving rather than distance-preserving.
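A minimal sketch of the eigenmap step, assuming an adjacency matrix has already been built (the 4-node cycle graph with unit weights is purely illustrative):

```python
import numpy as np

def laplacian_eigenmap(A, dim=2):
    """Sketch: normalized Laplacian L = I - D^{-1/2} A D^{-1/2};
    embedding X = D^{-1/2} U, discarding the trivial constant eigenvector."""
    deg = A.sum(axis=1)
    Dih = np.diag(deg ** -0.5)               # D^{-1/2}
    L = np.eye(len(A)) - Dih @ A @ Dih
    lam, U = np.linalg.eigh(L)               # ascending; lam[0] ~ 0
    return Dih @ U[:, 1:dim + 1]             # skip the trivial eigenvector

# Demo on a 4-node cycle graph with unit weights
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
Y = laplacian_eigenmap(A, dim=2)
```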

33 Locally-Linear Embedding. Locally-linear embedding (LLE) is another classic method which also begins with a neighbourhood graph. We reconstruct each point i (in the original data) from a weighted sum of the neighbouring points: $\hat{x}_i = \sum_j W_{ij} x_j$, where $W_{ij} = 0$ for any point j not in the neighbourhood of i (and for i = j). We find the weights by minimising the reconstruction error $\min \sum_i |\hat{x}_i - x_i|^2$, subject to the constraints that the weights are non-negative and sum to 1: $W_{ij} \ge 0$, $\sum_j W_{ij} = 1$. This gives a relatively simple closed-form solution.

34 Locally-Linear Embedding. These weights encode how well a point j represents a point i, and can be interpreted as the adjacency between i and j. A low-dimensional embedding is found by then finding points $y_i$ to minimise the error $\min \sum_i |\hat{y}_i - y_i|^2$, with $\hat{y}_i = \sum_j W_{ij} y_j$. In other words, we find a low-dimensional embedding which preserves the adjacency relationships. The solution to this embedding problem turns out to be simply the eigenvectors of the matrix $M = (I - W)^T (I - W)$. LLE is scale-free: the final points have covariance matrix I (unit scale).
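Both LLE stages can be sketched as below. Note the hedges: this version uses only the sum-to-one constraint on the weights (the most common formulation; the non-negativity constraint on the slide is a variant), a standard regularisation of the local Gram matrix, and hypothetical demo points.

```python
import numpy as np

def lle(X, n_neighbors=2, dim=1, reg=1e-3):
    """Sketch of locally-linear embedding (weights constrained to sum to one)."""
    m = X.shape[0]
    d2 = ((X[:, None] - X[None]) ** 2).sum(-1)
    W = np.zeros((m, m))
    for i in range(m):
        nbrs = np.argsort(d2[i])[1:n_neighbors + 1]
        Z = X[nbrs] - X[i]                   # neighbours in local coordinates
        G = Z @ Z.T                          # local Gram matrix
        G = G + reg * np.trace(G) * np.eye(len(nbrs))  # regularise for stability
        w = np.linalg.solve(G, np.ones(len(nbrs)))
        W[i, nbrs] = w / w.sum()             # reconstruction weights sum to 1
    M = (np.eye(m) - W).T @ (np.eye(m) - W)
    lam, U = np.linalg.eigh(M)               # ascending; lam[0] ~ 0 (constant vector)
    return U[:, 1:dim + 1]                   # skip the constant eigenvector

# Demo: six points on a parabola (a 1-d curve in 2-d space)
Xd = np.array([[0.0, 0.0], [1, 1], [2, 4], [3, 9], [4, 16], [5, 25]])
Y = lle(Xd, n_neighbors=2, dim=1)
```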

35 Comparison. LLE might seem like quite a different process to the previous two, but it is actually very similar: we can interpret it as producing a kernel matrix followed by scale-free kernel embedding, $K = (\lambda_{\max} - 1) I + W + W^T - W^T W$, then $K = U \Lambda U^T$, $X = U$.
ISOMAP: neighbourhood graph representation; similarity matrix from geodesic distances; embedding $X = U \Lambda^{1/2}$.
Laplacian Eigenmap: neighbourhood graph; graph Laplacian; embedding $X = D^{-1/2} U$.
LLE: neighbourhood graph; reconstruction weights; embedding $X = U$.

36 Comparison. ISOMAP is the only method which directly computes and uses the geodesic distances; the other two depend indirectly on the distances through local structure. LLE is scale-free, so the original distance scale is lost, but the local structure is preserved. Computing the necessary local dimensionality to find the correct nearest neighbours is a problem for all such methods.

37 Part II: Indefinite Similarities

38 Non-Euclidean data. Data is Euclidean iff K is PSD. Unless you are using a kernel function, this is often not true. Why does this happen?

39 What type of data do I have? Our starting point is a distance matrix. However, we do not know a priori if our measurements are representable on a manifold, so we will call them dissimilarities. The starting point to answer the question "What type of data do I have?" will be a matrix of dissimilarities D between objects. Types of dissimilarities: Euclidean (no intrinsic curvature); non-Euclidean, metric (curved manifold); non-metric (no point-like manifold representation).

40 Causes. Example: the chicken pieces data, with distance computed by alignment. A global alignment of everything could find Euclidean distances, but only local alignments are practical.

41 Causes. Dissimilarities may also be non-metric. The data is metric if it obeys the metric conditions: 1. $D_{ij} \ge 0$ (non-negativity); 2. $D_{ij} = 0$ iff $i = j$ (identity of indiscernibles); 3. $D_{ij} = D_{ji}$ (symmetry); 4. $D_{ij} \le D_{ik} + D_{kj}$ (triangle inequality). Reasonable dissimilarities should meet 1 and 2.

42 Causes. Symmetry, $D_{ij} = D_{ji}$, may not hold by definition: in alignment, aligning i to j may find a better solution than aligning j to i.

43 Causes. Triangle violations: $D_{ij} \le D_{ik} + D_{kj}$ can fail for extended objects, where a path through an intermediate object k gives $D_{ik} + D_{kj} < D_{ij}$. Finally, noise in the measurement of D can cause all of these effects.

44 Tests (1). Find the similarity matrix $K = -\frac{1}{2} C D_s C$. The data is Euclidean iff K is positive semidefinite (no negative eigenvalues); then K is a kernel, with an explicit embedding from kernel embedding, and we can use K in a kernel algorithm. Negative eigenfraction (NEF): $\mathrm{NEF} = \sum_{\lambda_i < 0} |\lambda_i| \,/\, \sum_i |\lambda_i|$. It lies between 0 and 0.5, and is 0 for Euclidean similarities.
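The NEF test is a one-liner on top of the centred kernel; both dissimilarity matrices below are small hypothetical examples, one Euclidean and one with a triangle violation.

```python
import numpy as np

def negative_eigenfraction(Ds):
    """NEF: |negative eigenvalue| mass of K = -1/2 C Ds C over total mass."""
    m = Ds.shape[0]
    C = np.eye(m) - np.ones((m, m)) / m
    K = -0.5 * C @ Ds @ C
    lam = np.linalg.eigvalsh(K)
    return np.abs(lam[lam < 0]).sum() / np.abs(lam).sum()

# Euclidean squared distances: NEF is (numerically) zero
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
Ds_good = ((X[:, None] - X[None]) ** 2).sum(-1)

# Squared dissimilarities whose distances (1, 1, 4) violate the triangle
# inequality: NEF is strictly positive
Ds_bad = np.array([[0.0, 1.0, 16.0], [1.0, 0.0, 1.0], [16.0, 1.0, 0.0]])
```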

45 Tests (2). 3. $D_{ij} = D_{ji}$ (symmetry): the mean and maximum asymmetry are easy to check by looking at pairs. 4. $D_{ij} \le D_{ik} + D_{kj}$ (triangle inequality): count the number of violations and the maximum violation. Check these for your data (the third involves checking all triples, which is possibly expensive). Metric data is embeddable on a (curved) Riemannian manifold.

46 Determining the causes. The shape of the negative eigenvalue spectrum is characteristic of the cause. [Figure: negative eigenvalue plots for noise, extended objects, and a spherical manifold.]

47 Corrections. If the data is non-metric or non-Euclidean, we can correct it. Symmetry violations: average, $D_{ij} \leftarrow \frac{1}{2}(D_{ij} + D_{ji})$; for min-cost distances, $D_{ij} \leftarrow \min(D_{ij}, D_{ji})$ may be more appropriate. Triangle violations: constant offset, $D_{ij} \leftarrow D_{ij} + c$ (for $i \ne j$); this will also remove non-Euclidean behaviour for large enough c. Euclidean violations: discard negative eigenvalues. Even when the violations are caused by noise, some information is still lost. There are many other approaches*. [* On Euclidean corrections for non-Euclidean dissimilarities, Duin, Pekalska, Harol, Lee and Bunke, S+SSPR 08]
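The symmetrisation and constant-offset corrections can be sketched as follows; the dissimilarity matrix is hypothetical, and the smallest valid offset c is found by brute force over all triples.

```python
import numpy as np

# Hypothetical dissimilarity matrix: asymmetric, with a triangle violation
D = np.array([[0.0, 2.0, 9.0],
              [2.0, 0.0, 3.0],
              [9.5, 3.0, 0.0]])
n = len(D)

# Symmetry correction by averaging (min(D_ij, D_ji) may suit min-cost distances)
Ds = 0.5 * (D + D.T)

# Triangle correction: add a constant offset c to every off-diagonal entry.
# After the offset, D_ij + c <= (D_ik + c) + (D_kj + c) requires
# c >= D_ij - D_ik - D_kj for every triple.
c = 0.0
for i in range(n):
    for j in range(n):
        for k in range(n):
            if i != j:
                c = max(c, Ds[i, j] - Ds[i, k] - Ds[k, j])
Dc = Ds + c * (1.0 - np.eye(n))   # corrected, now metric, dissimilarities
```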

48 Part III: Techniques for non-Euclidean Embeddings

49 Known Manifolds. Sometimes we have data which lies on a known but non-Euclidean manifold. Examples in computer vision: surface normals, rotation matrices, flow tensors (DT-MRI). This is not manifold learning, as we already know what the manifold is. What tools do we need to be able to process data like this? As before, distances are the key.

50 Example: 2D direction. Consider the direction of an edge in an image, encoded as a unit vector $x = (x_1, x_2)^T$. The average of the direction vectors, $\frac{1}{n}\sum_i x_i$, isn't even a direction vector (it is not unit length), let alone the correct average direction. The normal definition of the mean is not correct, because the manifold is curved.

51 Tangent space. The tangent space ($T_P$) is the Euclidean space which is parallel to the manifold (M) at a particular point (P). The tangent space is a very useful tool because it is Euclidean.

52 Exponential Map. The exponential map $\mathrm{Exp}_P : T_P \to M$, $A = \mathrm{Exp}_P(X)$, maps a point X on the tangent plane onto a point A on the manifold. P is the centre of the mapping and is at the origin of the tangent space. The mapping is one-to-one in a local region of P. The most important property of the mapping is that distances to the centre P are preserved: $d_{T_P}(X, P) = d_M(A, P)$. The geodesic distance on the manifold equals the Euclidean distance on the tangent plane (for distances to the centre only).

53 Exponential map. The log map goes the other way, from manifold to tangent plane: $\mathrm{Log}_P : M \to T_P$, $X = \mathrm{Log}_P(A)$.

54 Exponential Map. Example on the circle: embed the circle in the complex plane. The manifold representing the circle is a complex number with magnitude 1, which can be written $x + iy = e^{i\theta}$; the centre point is $P = e^{i\theta_P}$.

55 In this case it turns out that the map is related to the normal exp and log functions: $X = \mathrm{Log}_P(A) = P \log\!\left(\frac{A}{P}\right)$ and $A = \mathrm{Exp}_P(X) = P \exp\!\left(\frac{X}{P}\right)$.

56 Intrinsic mean. The mean of a set of samples is usually defined as the sum of the samples divided by their number. This is only true in Euclidean space. A more general formula is $\bar{x} = \arg\min_x \sum_i d_g(x, x_i)^2$, which minimises the (geodesic) distances from the mean to the samples (equivalent in Euclidean space).

57 Intrinsic mean. We can compute this intrinsic mean using the exponential map. If we knew what the mean M was, we could use it as the centre of a map: $X_i = \mathrm{Log}_M(A_i)$. From the properties of the Exp-map, the distances are the same: $d_e(X_i, M) = d_g(A_i, M)$. So the mean on the tangent plane is equal to the mean on the manifold.

58 Intrinsic mean. Start with a guess at the mean and move towards the correct answer. This gives the following algorithm: guess at a mean $M_0$; then 1. map onto the tangent plane using $M_k$; 2. compute the mean on the tangent plane to get the new estimate $M_{k+1} = \mathrm{Exp}_{M_k}\!\left( \frac{1}{n} \sum_i \mathrm{Log}_{M_k}(A_i) \right)$.

59 Intrinsic Mean. For many manifolds this procedure will converge to the intrinsic mean, but convergence is not always guaranteed. Other statistics and probability distributions on manifolds are problematic: we can hypothesise a normal distribution on the tangent plane, but distortions are inevitable.

60 Some useful manifolds and exponential maps. Directional vectors (surface normals etc.): $a$, with $|a| = 1$. Log map: $x = (a - p\cos\theta)\,\frac{\theta}{\sin\theta}$. Exp map: $a = p\cos\theta + x\,\frac{\sin\theta}{\theta}$. Here a and p are unit vectors, $\theta$ is the angle between them, and x lies in an (n-1)-dimensional space.
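Combining the direction-vector maps above with the iterative mean algorithm of slide 58 gives a compact sketch; the two demo directions are hypothetical, chosen so the intrinsic mean is obvious by symmetry.

```python
import numpy as np

def sphere_log(p, a):
    """Log map for unit vectors: tangent vector at p pointing towards a."""
    theta = np.arccos(np.clip(p @ a, -1.0, 1.0))
    if theta < 1e-12:
        return np.zeros_like(p)
    return (a - p * np.cos(theta)) * theta / np.sin(theta)

def sphere_exp(p, x):
    """Exp map for unit vectors: move from p along tangent vector x."""
    theta = np.linalg.norm(x)
    if theta < 1e-12:
        return p
    return p * np.cos(theta) + (x / theta) * np.sin(theta)

def intrinsic_mean(points, iters=20):
    """Iterative intrinsic mean: average in the tangent plane, map back."""
    m = points[0]                        # initial guess: the first sample
    for _ in range(iters):
        t = np.mean([sphere_log(m, a) for a in points], axis=0)
        m = sphere_exp(m, t)
    return m

# Two directions at +/-45 degrees: the intrinsic mean direction is (1, 0)
a1 = np.array([np.cos(np.pi / 4), np.sin(np.pi / 4)])
a2 = np.array([np.cos(np.pi / 4), -np.sin(np.pi / 4)])
mean_dir = intrinsic_mean([a1, a2])
```

Note how the naive Euclidean mean of a1 and a2 would have length less than 1, while the intrinsic mean stays on the manifold.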

61 Some useful manifolds and maps. Symmetric positive definite matrices (covariance, flow tensors etc.): Log map $X = P^{1/2} \log\!\left(P^{-1/2} A P^{-1/2}\right) P^{1/2}$; Exp map $A = P^{1/2} \exp\!\left(P^{-1/2} X P^{-1/2}\right) P^{1/2}$. A is symmetric positive definite, X is just symmetric; log is the matrix log, defined as a generalized matrix function.
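These maps (the affine-invariant form, one common choice) can be sketched with SciPy's matrix functions; the base point P and target A below are hypothetical SPD matrices.

```python
import numpy as np
from scipy.linalg import expm, logm, sqrtm

def spd_log(P, A):
    """Log map at P: X = P^{1/2} logm(P^{-1/2} A P^{-1/2}) P^{1/2}."""
    Ph = np.real(sqrtm(P))                   # P^{1/2}
    Pih = np.linalg.inv(Ph)                  # P^{-1/2}
    return Ph @ np.real(logm(Pih @ A @ Pih)) @ Ph

def spd_exp(P, X):
    """Exp map at P: A = P^{1/2} expm(P^{-1/2} X P^{-1/2}) P^{1/2}."""
    Ph = np.real(sqrtm(P))
    Pih = np.linalg.inv(Ph)
    return Ph @ expm(Pih @ X @ Pih) @ Ph

P = np.array([[2.0, 0.3], [0.3, 1.0]])       # base point (SPD)
A = np.array([[1.5, -0.2], [-0.2, 2.5]])     # another SPD matrix
X = spd_log(P, A)                            # tangent vector: a symmetric matrix
```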

62 Some useful manifolds and maps. Orthogonal matrices (rotation matrices, eigenvector matrices): $A$, with $AA^T = I$. Log map: $X = \log(P^T A)$. Exp map: $A = P \exp(X)$. A is orthogonal, X is antisymmetric ($X + X^T = 0$). These are the matrix exp and log functions as before. In fact there are multiple solutions to the matrix log; only one is the required real antisymmetric matrix, and it is not easy to find: the rest are complex.


64 Embedding on $S^n$. On $S^2$ (the surface of a sphere in 3D) the following parameterisation is well known: $x = (r\sin\theta\cos\phi,\ r\sin\theta\sin\phi,\ r\cos\theta)$. The distance between two points (the length of the geodesic) is $d_{xy} = r\cos^{-1}\!\left( \sin\theta_x \sin\theta_y \cos(\phi_x - \phi_y) + \cos\theta_x \cos\theta_y \right)$.

65 More Spherical Geometry. On a sphere, the distance is the arc-length between the two points. It is much neater to use the inner product: $\langle x, y \rangle = |x||y|\cos\theta_{xy} = r^2 \cos\theta_{xy}$, so $d_{xy} = r\,\theta_{xy} = r\cos^{-1}\frac{\langle x, y \rangle}{r^2}$. And this works in any number of dimensions.

66 Spherical Embedding. Say we had the distances between some objects ($d_{ij}$), measured on the surface of a [hyper]sphere of dimension n. The sphere (and objects) can be embedded into an (n+1)-dimensional space. Let X be the matrix of point positions; then $Z = XX^T$ is a kernel matrix, with $Z_{ij} = \langle x_i, x_j \rangle = r^2 \cos\frac{d_{ij}}{r}$. We can compute Z from D and find the spherical embedding!

67 Spherical Embedding. But wait, we don't know what r is! The distances D are non-Euclidean, and if we use the wrong radius, Z is not a kernel matrix (it has negative eigenvalues). Use this to find the radius: choose r to minimise the negative eigenvalues, $r^* = \arg\min_r \sum_{\lambda_i(Z(r)) < 0} |\lambda_i(Z(r))|$.
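The radius search can be sketched as a simple grid search over the negative eigenvalue mass; the five sphere points below are hypothetical, with their exact geodesic distances computed from the known radius so the test is self-consistent.

```python
import numpy as np

def neg_eig_mass(D, r):
    """Total |negative eigenvalue| mass of Z(r) = r^2 cos(D / r)."""
    Z = r ** 2 * np.cos(D / r)
    lam = np.linalg.eigvalsh(Z)
    return np.abs(lam[lam < 0]).sum()

# Five points on a sphere of radius 2, with exact geodesic distances
true_r = 2.0
U = np.array([[0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.6, 0.8, 0.0],
              [1.0, 1.0, 1.0]])
U = U / np.linalg.norm(U, axis=1, keepdims=True)    # unit directions
D = true_r * np.arccos(np.clip(U @ U.T, -1.0, 1.0)) # geodesic distances

# Crude grid search for r* = argmin_r of the negative eigenvalue mass
radii = np.linspace(1.0, 3.0, 201)
r_star = radii[int(np.argmin([neg_eig_mass(D, r) for r in radii]))]
```

At the true radius, Z is exactly the Gram matrix of the points, so its negative eigenvalue mass vanishes.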

68 Example: Texture Mapping. As an alternative to unwrapping an object onto a plane and texture-mapping the plane, embed it onto a sphere and texture-map the sphere. [Figure: plane vs sphere texture mapping.]

69 Backup slides

70 Laplacian and related processes. As well as embedding objects onto manifolds, we can model many interesting processes on manifolds. Example: the way heat flows across a manifold can be very informative. The heat equation is $\frac{\partial u}{\partial t} = \nabla^2 u$, where $\nabla^2$ is the Laplacian; in 3D Euclidean space it is $\nabla^2 = \frac{\partial^2}{\partial x^2} + \frac{\partial^2}{\partial y^2} + \frac{\partial^2}{\partial z^2}$. On a sphere of radius r it is $\nabla^2 = \frac{1}{r^2 \sin\theta}\frac{\partial}{\partial\theta}\!\left(\sin\theta\,\frac{\partial}{\partial\theta}\right) + \frac{1}{r^2 \sin^2\theta}\frac{\partial^2}{\partial\phi^2}$.

71 Heat flow. Heat flow allows us to do interesting things on a manifold. Smoothing: heat flow is a diffusion process, so it will smooth the data. Characterising the manifold: heat content, heat kernel coefficients, and so on. The Laplacian depends on the geometry of the manifold, which we may not know and which may be hard to calculate explicitly; hence the graph Laplacian.

72 Graph Laplacian. Given a set of datapoints on the manifold, describe them by a graph: vertices are datapoints, edges are the adjacency relation. Adjacency matrix (for example): $A_{ij} = \exp(-d_{ij}^2/\sigma^2)$. Then the graph Laplacian is $L = V - A$, where V is the degree matrix, $V_{ii} = \sum_j A_{ij}$. The graph Laplacian is a discrete approximation of the manifold Laplacian.

73 Heat Kernel. Using the graph Laplacian, we can easily implement heat-flow methods on the manifold using the heat kernel: $\frac{du}{dt} = -Lu$ (heat equation), $H = \exp(-Lt)$ (heat kernel). We can diffuse a function on the manifold by $f' = Hf$.
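The heat-kernel diffusion above is a few lines with SciPy's matrix exponential; the 4-node path graph is a hypothetical example.

```python
import numpy as np
from scipy.linalg import expm

# Graph Laplacian L = V - A of a 4-node path graph with unit edge weights
A = np.diag(np.ones(3), 1) + np.diag(np.ones(3), -1)
L = np.diag(A.sum(axis=1)) - A

H = expm(-L * 1.0)                      # heat kernel H = exp(-Lt) at t = 1
f = np.array([1.0, 0.0, 0.0, 0.0])      # heat initially concentrated on node 0
f_diffused = H @ f                      # diffuse the function across the graph
```

Since the Laplacian's rows sum to zero, the heat kernel's rows sum to one: diffusion redistributes heat but conserves the total.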


More information

More metrics on cartesian products

More metrics on cartesian products More metrcs on cartesan products If (X, d ) are metrc spaces for 1 n, then n Secton II4 of the lecture notes we defned three metrcs on X whose underlyng topologes are the product topology The purpose of

More information

Unified Subspace Analysis for Face Recognition

Unified Subspace Analysis for Face Recognition Unfed Subspace Analyss for Face Recognton Xaogang Wang and Xaoou Tang Department of Informaton Engneerng The Chnese Unversty of Hong Kong Shatn, Hong Kong {xgwang, xtang}@e.cuhk.edu.hk Abstract PCA, LDA

More information

Grover s Algorithm + Quantum Zeno Effect + Vaidman

Grover s Algorithm + Quantum Zeno Effect + Vaidman Grover s Algorthm + Quantum Zeno Effect + Vadman CS 294-2 Bomb 10/12/04 Fall 2004 Lecture 11 Grover s algorthm Recall that Grover s algorthm for searchng over a space of sze wors as follows: consder the

More information

Pattern Classification

Pattern Classification Pattern Classfcaton All materals n these sldes ere taken from Pattern Classfcaton (nd ed) by R. O. Duda, P. E. Hart and D. G. Stork, John Wley & Sons, 000 th the permsson of the authors and the publsher

More information

COMPLEX NUMBERS AND QUADRATIC EQUATIONS

COMPLEX NUMBERS AND QUADRATIC EQUATIONS COMPLEX NUMBERS AND QUADRATIC EQUATIONS INTRODUCTION We know that x 0 for all x R e the square of a real number (whether postve, negatve or ero) s non-negatve Hence the equatons x, x, x + 7 0 etc are not

More information

Chapter 11: Simple Linear Regression and Correlation

Chapter 11: Simple Linear Regression and Correlation Chapter 11: Smple Lnear Regresson and Correlaton 11-1 Emprcal Models 11-2 Smple Lnear Regresson 11-3 Propertes of the Least Squares Estmators 11-4 Hypothess Test n Smple Lnear Regresson 11-4.1 Use of t-tests

More information

APPENDIX A Some Linear Algebra

APPENDIX A Some Linear Algebra APPENDIX A Some Lnear Algebra The collecton of m, n matrces A.1 Matrces a 1,1,..., a 1,n A = a m,1,..., a m,n wth real elements a,j s denoted by R m,n. If n = 1 then A s called a column vector. Smlarly,

More information

4DVAR, according to the name, is a four-dimensional variational method.

4DVAR, according to the name, is a four-dimensional variational method. 4D-Varatonal Data Assmlaton (4D-Var) 4DVAR, accordng to the name, s a four-dmensonal varatonal method. 4D-Var s actually a drect generalzaton of 3D-Var to handle observatons that are dstrbuted n tme. The

More information

Chat eld, C. and A.J.Collins, Introduction to multivariate analysis. Chapman & Hall, 1980

Chat eld, C. and A.J.Collins, Introduction to multivariate analysis. Chapman & Hall, 1980 MT07: Multvarate Statstcal Methods Mke Tso: emal mke.tso@manchester.ac.uk Webpage for notes: http://www.maths.manchester.ac.uk/~mkt/new_teachng.htm. Introducton to multvarate data. Books Chat eld, C. and

More information

From Biot-Savart Law to Divergence of B (1)

From Biot-Savart Law to Divergence of B (1) From Bot-Savart Law to Dvergence of B (1) Let s prove that Bot-Savart gves us B (r ) = 0 for an arbtrary current densty. Frst take the dvergence of both sdes of Bot-Savart. The dervatve s wth respect to

More information

3.1 Expectation of Functions of Several Random Variables. )' be a k-dimensional discrete or continuous random vector, with joint PMF p (, E X E X1 E X

3.1 Expectation of Functions of Several Random Variables. )' be a k-dimensional discrete or continuous random vector, with joint PMF p (, E X E X1 E X Statstcs 1: Probablty Theory II 37 3 EPECTATION OF SEVERAL RANDOM VARIABLES As n Probablty Theory I, the nterest n most stuatons les not on the actual dstrbuton of a random vector, but rather on a number

More information

Some Comments on Accelerating Convergence of Iterative Sequences Using Direct Inversion of the Iterative Subspace (DIIS)

Some Comments on Accelerating Convergence of Iterative Sequences Using Direct Inversion of the Iterative Subspace (DIIS) Some Comments on Acceleratng Convergence of Iteratve Sequences Usng Drect Inverson of the Iteratve Subspace (DIIS) C. Davd Sherrll School of Chemstry and Bochemstry Georga Insttute of Technology May 1998

More information

A Quantum Gauss-Bonnet Theorem

A Quantum Gauss-Bonnet Theorem A Quantum Gauss-Bonnet Theorem Tyler Fresen November 13, 2014 Curvature n the plane Let Γ be a smooth curve wth orentaton n R 2, parametrzed by arc length. The curvature k of Γ s ± Γ, where the sgn s postve

More information

Lecture 10 Support Vector Machines. Oct

Lecture 10 Support Vector Machines. Oct Lecture 10 Support Vector Machnes Oct - 20-2008 Lnear Separators Whch of the lnear separators s optmal? Concept of Margn Recall that n Perceptron, we learned that the convergence rate of the Perceptron

More information

Causal Diamonds. M. Aghili, L. Bombelli, B. Pilgrim

Causal Diamonds. M. Aghili, L. Bombelli, B. Pilgrim Causal Damonds M. Aghl, L. Bombell, B. Plgrm Introducton The correcton to volume of a causal nterval due to curvature of spacetme has been done by Myrhem [] and recently by Gbbons & Solodukhn [] and later

More information

a b a In case b 0, a being divisible by b is the same as to say that

a b a In case b 0, a being divisible by b is the same as to say that Secton 6.2 Dvsblty among the ntegers An nteger a ε s dvsble by b ε f there s an nteger c ε such that a = bc. Note that s dvsble by any nteger b, snce = b. On the other hand, a s dvsble by only f a = :

More information

1 GSW Iterative Techniques for y = Ax

1 GSW Iterative Techniques for y = Ax 1 for y = A I m gong to cheat here. here are a lot of teratve technques that can be used to solve the general case of a set of smultaneous equatons (wrtten n the matr form as y = A), but ths chapter sn

More information

CHALMERS, GÖTEBORGS UNIVERSITET. SOLUTIONS to RE-EXAM for ARTIFICIAL NEURAL NETWORKS. COURSE CODES: FFR 135, FIM 720 GU, PhD

CHALMERS, GÖTEBORGS UNIVERSITET. SOLUTIONS to RE-EXAM for ARTIFICIAL NEURAL NETWORKS. COURSE CODES: FFR 135, FIM 720 GU, PhD CHALMERS, GÖTEBORGS UNIVERSITET SOLUTIONS to RE-EXAM for ARTIFICIAL NEURAL NETWORKS COURSE CODES: FFR 35, FIM 72 GU, PhD Tme: Place: Teachers: Allowed materal: Not allowed: January 2, 28, at 8 3 2 3 SB

More information

Random Walks on Digraphs

Random Walks on Digraphs Random Walks on Dgraphs J. J. P. Veerman October 23, 27 Introducton Let V = {, n} be a vertex set and S a non-negatve row-stochastc matrx (.e. rows sum to ). V and S defne a dgraph G = G(V, S) and a drected

More information

Module 3 LOSSY IMAGE COMPRESSION SYSTEMS. Version 2 ECE IIT, Kharagpur

Module 3 LOSSY IMAGE COMPRESSION SYSTEMS. Version 2 ECE IIT, Kharagpur Module 3 LOSSY IMAGE COMPRESSION SYSTEMS Verson ECE IIT, Kharagpur Lesson 6 Theory of Quantzaton Verson ECE IIT, Kharagpur Instructonal Objectves At the end of ths lesson, the students should be able to:

More information

However, since P is a symmetric idempotent matrix, of P are either 0 or 1 [Eigen-values

However, since P is a symmetric idempotent matrix, of P are either 0 or 1 [Eigen-values Fall 007 Soluton to Mdterm Examnaton STAT 7 Dr. Goel. [0 ponts] For the general lnear model = X + ε, wth uncorrelated errors havng mean zero and varance σ, suppose that the desgn matrx X s not necessarly

More information

Lecture 20: Lift and Project, SDP Duality. Today we will study the Lift and Project method. Then we will prove the SDP duality theorem.

Lecture 20: Lift and Project, SDP Duality. Today we will study the Lift and Project method. Then we will prove the SDP duality theorem. prnceton u. sp 02 cos 598B: algorthms and complexty Lecture 20: Lft and Project, SDP Dualty Lecturer: Sanjeev Arora Scrbe:Yury Makarychev Today we wll study the Lft and Project method. Then we wll prove

More information

THEOREMS OF QUANTUM MECHANICS

THEOREMS OF QUANTUM MECHANICS THEOREMS OF QUANTUM MECHANICS In order to develop methods to treat many-electron systems (atoms & molecules), many of the theorems of quantum mechancs are useful. Useful Notaton The matrx element A mn

More information

Structure and Drive Paul A. Jensen Copyright July 20, 2003

Structure and Drive Paul A. Jensen Copyright July 20, 2003 Structure and Drve Paul A. Jensen Copyrght July 20, 2003 A system s made up of several operatons wth flow passng between them. The structure of the system descrbes the flow paths from nputs to outputs.

More information

Chapter 9: Statistical Inference and the Relationship between Two Variables

Chapter 9: Statistical Inference and the Relationship between Two Variables Chapter 9: Statstcal Inference and the Relatonshp between Two Varables Key Words The Regresson Model The Sample Regresson Equaton The Pearson Correlaton Coeffcent Learnng Outcomes After studyng ths chapter,

More information

Solution 1 for USTC class Physics of Quantum Information

Solution 1 for USTC class Physics of Quantum Information Soluton 1 for 018 019 USTC class Physcs of Quantum Informaton Shua Zhao, Xn-Yu Xu and Ka Chen Natonal Laboratory for Physcal Scences at Mcroscale and Department of Modern Physcs, Unversty of Scence and

More information

Solutions to Problem Set 6

Solutions to Problem Set 6 Solutons to Problem Set 6 Problem 6. (Resdue theory) a) Problem 4.7.7 Boas. n ths problem we wll solve ths ntegral: x sn x x + 4x + 5 dx: To solve ths usng the resdue theorem, we study ths complex ntegral:

More information

For now, let us focus on a specific model of neurons. These are simplified from reality but can achieve remarkable results.

For now, let us focus on a specific model of neurons. These are simplified from reality but can achieve remarkable results. Neural Networks : Dervaton compled by Alvn Wan from Professor Jtendra Malk s lecture Ths type of computaton s called deep learnng and s the most popular method for many problems, such as computer vson

More information

Kernel Methods and SVMs Extension

Kernel Methods and SVMs Extension Kernel Methods and SVMs Extenson The purpose of ths document s to revew materal covered n Machne Learnng 1 Supervsed Learnng regardng support vector machnes (SVMs). Ths document also provdes a general

More information

Quantum Mechanics I - Session 4

Quantum Mechanics I - Session 4 Quantum Mechancs I - Sesson 4 Aprl 3, 05 Contents Operators Change of Bass 4 3 Egenvectors and Egenvalues 5 3. Denton....................................... 5 3. Rotaton n D....................................

More information

2E Pattern Recognition Solutions to Introduction to Pattern Recognition, Chapter 2: Bayesian pattern classification

2E Pattern Recognition Solutions to Introduction to Pattern Recognition, Chapter 2: Bayesian pattern classification E395 - Pattern Recognton Solutons to Introducton to Pattern Recognton, Chapter : Bayesan pattern classfcaton Preface Ths document s a soluton manual for selected exercses from Introducton to Pattern Recognton

More information

2.3 Nilpotent endomorphisms

2.3 Nilpotent endomorphisms s a block dagonal matrx, wth A Mat dm U (C) In fact, we can assume that B = B 1 B k, wth B an ordered bass of U, and that A = [f U ] B, where f U : U U s the restrcton of f to U 40 23 Nlpotent endomorphsms

More information

Lecture 12: Classification

Lecture 12: Classification Lecture : Classfcaton g Dscrmnant functons g The optmal Bayes classfer g Quadratc classfers g Eucldean and Mahalanobs metrcs g K Nearest Neghbor Classfers Intellgent Sensor Systems Rcardo Guterrez-Osuna

More information

Stanford University Graph Partitioning and Expanders Handout 3 Luca Trevisan May 8, 2013

Stanford University Graph Partitioning and Expanders Handout 3 Luca Trevisan May 8, 2013 Stanford Unversty Graph Parttonng and Expanders Handout 3 Luca Trevsan May 8, 03 Lecture 3 In whch we analyze the power method to approxmate egenvalues and egenvectors, and we descrbe some more algorthmc

More information

MEM 255 Introduction to Control Systems Review: Basics of Linear Algebra

MEM 255 Introduction to Control Systems Review: Basics of Linear Algebra MEM 255 Introducton to Control Systems Revew: Bascs of Lnear Algebra Harry G. Kwatny Department of Mechancal Engneerng & Mechancs Drexel Unversty Outlne Vectors Matrces MATLAB Advanced Topcs Vectors A

More information

Important Instructions to the Examiners:

Important Instructions to the Examiners: Summer 0 Examnaton Subject & Code: asc Maths (70) Model Answer Page No: / Important Instructons to the Examners: ) The Answers should be examned by key words and not as word-to-word as gven n the model

More information

Lossy Compression. Compromise accuracy of reconstruction for increased compression.

Lossy Compression. Compromise accuracy of reconstruction for increased compression. Lossy Compresson Compromse accuracy of reconstructon for ncreased compresson. The reconstructon s usually vsbly ndstngushable from the orgnal mage. Typcally, one can get up to 0:1 compresson wth almost

More information

Error Bars in both X and Y

Error Bars in both X and Y Error Bars n both X and Y Wrong ways to ft a lne : 1. y(x) a x +b (σ x 0). x(y) c y + d (σ y 0) 3. splt dfference between 1 and. Example: Prmordal He abundance: Extrapolate ft lne to [ O / H ] 0. [ He

More information

BOUNDEDNESS OF THE RIESZ TRANSFORM WITH MATRIX A 2 WEIGHTS

BOUNDEDNESS OF THE RIESZ TRANSFORM WITH MATRIX A 2 WEIGHTS BOUNDEDNESS OF THE IESZ TANSFOM WITH MATIX A WEIGHTS Introducton Let L = L ( n, be the functon space wth norm (ˆ f L = f(x C dx d < For a d d matrx valued functon W : wth W (x postve sem-defnte for all

More information

= = = (a) Use the MATLAB command rref to solve the system. (b) Let A be the coefficient matrix and B be the right-hand side of the system.

= = = (a) Use the MATLAB command rref to solve the system. (b) Let A be the coefficient matrix and B be the right-hand side of the system. Chapter Matlab Exercses Chapter Matlab Exercses. Consder the lnear system of Example n Secton.. x x x y z y y z (a) Use the MATLAB command rref to solve the system. (b) Let A be the coeffcent matrx and

More information

Digital Signal Processing

Digital Signal Processing Dgtal Sgnal Processng Dscrete-tme System Analyss Manar Mohasen Offce: F8 Emal: manar.subh@ut.ac.r School of IT Engneerng Revew of Precedent Class Contnuous Sgnal The value of the sgnal s avalable over

More information

Bezier curves. Michael S. Floater. August 25, These notes provide an introduction to Bezier curves. i=0

Bezier curves. Michael S. Floater. August 25, These notes provide an introduction to Bezier curves. i=0 Bezer curves Mchael S. Floater August 25, 211 These notes provde an ntroducton to Bezer curves. 1 Bernsten polynomals Recall that a real polynomal of a real varable x R, wth degree n, s a functon of the

More information

Complex Numbers Alpha, Round 1 Test #123

Complex Numbers Alpha, Round 1 Test #123 Complex Numbers Alpha, Round Test #3. Wrte your 6-dgt ID# n the I.D. NUMBER grd, left-justfed, and bubble. Check that each column has only one number darkened.. In the EXAM NO. grd, wrte the 3-dgt Test

More information

Some Reading. Clustering and Unsupervised Learning. Some Data. K-Means Clustering. CS 536: Machine Learning Littman (Wu, TA)

Some Reading. Clustering and Unsupervised Learning. Some Data. K-Means Clustering. CS 536: Machine Learning Littman (Wu, TA) Some Readng Clusterng and Unsupervsed Learnng CS 536: Machne Learnng Lttman (Wu, TA) Not sure what to suggest for K-Means and sngle-lnk herarchcal clusterng. Klenberg (00). An mpossblty theorem for clusterng

More information

Homework Notes Week 7

Homework Notes Week 7 Homework Notes Week 7 Math 4 Sprng 4 #4 (a Complete the proof n example 5 that s an nner product (the Frobenus nner product on M n n (F In the example propertes (a and (d have already been verfed so we

More information

Support Vector Machines. Vibhav Gogate The University of Texas at dallas

Support Vector Machines. Vibhav Gogate The University of Texas at dallas Support Vector Machnes Vbhav Gogate he Unversty of exas at dallas What We have Learned So Far? 1. Decson rees. Naïve Bayes 3. Lnear Regresson 4. Logstc Regresson 5. Perceptron 6. Neural networks 7. K-Nearest

More information

Maximum Likelihood Estimation (MLE)

Maximum Likelihood Estimation (MLE) Maxmum Lkelhood Estmaton (MLE) Ken Kreutz-Delgado (Nuno Vasconcelos) ECE 175A Wnter 01 UCSD Statstcal Learnng Goal: Gven a relatonshp between a feature vector x and a vector y, and d data samples (x,y

More information

Support Vector Machines CS434

Support Vector Machines CS434 Support Vector Machnes CS434 Lnear Separators Many lnear separators exst that perfectly classfy all tranng examples Whch of the lnear separators s the best? + + + + + + + + + Intuton of Margn Consder ponts

More information

1 Derivation of Point-to-Plane Minimization

1 Derivation of Point-to-Plane Minimization 1 Dervaton of Pont-to-Plane Mnmzaton Consder the Chen-Medon (pont-to-plane) framework for ICP. Assume we have a collecton of ponts (p, q ) wth normals n. We want to determne the optmal rotaton and translaton

More information

1 Vectors over the complex numbers

1 Vectors over the complex numbers Vectors for quantum mechancs 1 D. E. Soper 2 Unversty of Oregon 5 October 2011 I offer here some background for Chapter 1 of J. J. Sakura, Modern Quantum Mechancs. 1 Vectors over the complex numbers What

More information

Difference Equations

Difference Equations Dfference Equatons c Jan Vrbk 1 Bascs Suppose a sequence of numbers, say a 0,a 1,a,a 3,... s defned by a certan general relatonshp between, say, three consecutve values of the sequence, e.g. a + +3a +1

More information

Formulas for the Determinant

Formulas for the Determinant page 224 224 CHAPTER 3 Determnants e t te t e 2t 38 A = e t 2te t e 2t e t te t 2e 2t 39 If 123 A = 345, 456 compute the matrx product A adj(a) What can you conclude about det(a)? For Problems 40 43, use

More information

Using T.O.M to Estimate Parameter of distributions that have not Single Exponential Family

Using T.O.M to Estimate Parameter of distributions that have not Single Exponential Family IOSR Journal of Mathematcs IOSR-JM) ISSN: 2278-5728. Volume 3, Issue 3 Sep-Oct. 202), PP 44-48 www.osrjournals.org Usng T.O.M to Estmate Parameter of dstrbutons that have not Sngle Exponental Famly Jubran

More information

Physics 53. Rotational Motion 3. Sir, I have found you an argument, but I am not obliged to find you an understanding.

Physics 53. Rotational Motion 3. Sir, I have found you an argument, but I am not obliged to find you an understanding. Physcs 53 Rotatonal Moton 3 Sr, I have found you an argument, but I am not oblged to fnd you an understandng. Samuel Johnson Angular momentum Wth respect to rotatonal moton of a body, moment of nerta plays

More information

Math1110 (Spring 2009) Prelim 3 - Solutions

Math1110 (Spring 2009) Prelim 3 - Solutions Math 1110 (Sprng 2009) Solutons to Prelm 3 (04/21/2009) 1 Queston 1. (16 ponts) Short answer. Math1110 (Sprng 2009) Prelm 3 - Solutons x a 1 (a) (4 ponts) Please evaluate lm, where a and b are postve numbers.

More information