Dimensionality Reduction Notes 2

Jelani Nelson
minilek@seas.harvard.edu

August 11, 2015

1 Optimality theorems for JL

Yesterday we saw for MJL that we could achieve target dimension m = O(ε^{-2} log N), and for DJL we could achieve m = O(ε^{-2} log(1/δ)). The following theorems tell us that not much improvement is possible for MJL, and that for DJL we have the optimal bound.

Theorem 1 ([Alo03]). For any N > 1 and ε < 1/2, there exist N + 1 points in R^N such that achieving the MJL guarantee with distortion 1 + ε requires

    m ≳ min{N, ε^{-2} (log N)/log(1/ε)}.

The log(1/ε) loss in the lower bound can be removed if the map must be linear.

Theorem 2 ([LN14]). For any N > 1 and ε < 1/2, there exist N^{O(1)} points in R^N such that achieving the MJL guarantee with distortion 1 + ε using a linear map requires m ≳ min{N, ε^{-2} log N}.

For DJL, the upper bound is optimal.

Theorem 3 ([JW13, KMN11]). For any ε, δ < 1/2, any DJL distribution must have m ≳ min{n, ε^{-2} log(1/δ)}.
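For contrast with these lower bounds, the matching upper bound m = O(ε^{-2} log N) is easy to observe empirically. The following sketch is my own illustration, not part of the original notes: it embeds N random unit vectors with a dense Rademacher map (one standard JL distribution) and measures the worst pairwise distortion; all sizes are illustrative rather than tuned constants.

```python
import random, math

random.seed(0)

N, d, m = 30, 200, 400   # illustrative sizes; m on the order of eps^-2 log N

# N random points on the unit sphere in R^d
pts = []
for _ in range(N):
    p = [random.gauss(0.0, 1.0) for _ in range(d)]
    s = math.sqrt(sum(v * v for v in p))
    pts.append([v / s for v in p])

# dense Rademacher JL map: entries +/- 1/sqrt(m)
Pi = [[random.choice((-1.0, 1.0)) / math.sqrt(m) for _ in range(d)]
      for _ in range(m)]
proj = [[sum(Pi[r][c] * p[c] for c in range(d)) for r in range(m)] for p in pts]

def dist(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

# worst multiplicative distortion over all pairs of points
worst = 0.0
for i in range(N):
    for j in range(i + 1, N):
        ratio = dist(proj[i], proj[j]) / dist(pts[i], pts[j])
        worst = max(worst, abs(ratio - 1.0))

assert worst < 0.5   # loose sanity check; typical values here are much smaller
```

Shrinking m below roughly ε^{-2} log N makes the worst-case distortion blow up, which is exactly the regime the lower bounds above address.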

2 Example application: deterministic ℓ_1 point query and heavy hitters

Yesterday's notes give an example application of JL to k-means clustering. Today we give another application.

In the ℓ_1 point query problem a vector x ∈ R^n is updated in the turnstile streaming model. A query is an index i ∈ [n], and the response to the query should be a value x̃_i such that |x̃_i − x_i| ≤ ε‖x‖_1. We show an argument of [NNW14] that the JL lemma implies the existence of a fixed deterministic Π ∈ R^{m×n} with m ≲ ε^{-2} log n such that such an x̃ can be recovered from Πx.

Definition 1. We say that a matrix Π with columns Π_1, ..., Π_n is ε-incoherent if (1) ‖Π_i‖_2 = 1 for all i, and (2) for all i ≠ j, |⟨Π_i, Π_j⟩| ≤ ε.

Theorem 4. If Π ∈ R^{m×n} is ε-incoherent, then there is a polynomial-time recovery algorithm A_Π such that, given any y = Πx, if we define x̃ = A_Π(y) then |x̃_i − x_i| ≤ ε‖x‖_1 for all i.

Proof. The recovery algorithm will be A_Π(y) = Π^T y = Π^T Πx. Thus

    x̃_i = e_i^T Π^T Πx = Σ_{j=1}^n ⟨Π_i, Π_j⟩ x_j = x_i + Σ_{j≠i} ⟨Π_i, Π_j⟩ x_j = x_i ± ε‖x‖_1.

Now we show the existence of such Π with small m.

Lemma 1. For all ε ∈ (0, 1/2), there is an ε-incoherent Π with m ≲ ε^{-2} log n.

Proof. Consider the set of vectors {0, e_1, ..., e_n}. By the JL lemma, there exists Π' with O(ε^{-2} log n) rows, having columns Π'_i such that (1) ‖Π'_i‖_2 = ‖Π'e_i‖_2 = 1 ± ε/3, and (2) ‖Π'_i − Π'_j‖_2 = ‖Π'e_i − Π'e_j‖_2 = (1 ± ε/3)√2 for all i ≠ j. Let Π be the matrix whose ith column is Π'_i/‖Π'_i‖_2. Then ‖Π_i‖_2 = 1 for all i, as desired. Furthermore,

    2(1 ± ε/3)^2 = ‖Π'_i − Π'_j‖_2^2 = ‖Π'_i‖_2^2 + ‖Π'_j‖_2^2 − 2⟨Π'_i, Π'_j⟩.

Note ‖Π'_i‖_2^2 and ‖Π'_j‖_2^2 are both 1 ± O(ε), implying ⟨Π'_i, Π'_j⟩ = O(ε), and hence also ⟨Π_i, Π_j⟩ = O(ε) after normalizing the columns. The lemma follows by applying this argument with ε scaled down by a constant.
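As a concrete illustration of Theorem 4 (my own sketch, not from the notes), the code below builds Π with independent random-sign columns scaled by 1/√m. Such columns have unit ℓ_2 norm exactly and are pairwise nearly orthogonal with high probability for m on the order of ε^{-2} log n, so they serve as an ε-incoherent matrix; recovery is just x̃ = Π^T Πx. All sizes are illustrative.

```python
import random, math

random.seed(0)

n, m = 200, 1024   # m on the order of eps^-2 log n; sizes are illustrative
eps = 0.25

# Columns of Pi: independent random signs scaled by 1/sqrt(m), so each
# column has unit l2 norm exactly (condition (1) of Definition 1).
Pi = [[random.choice((-1.0, 1.0)) / math.sqrt(m) for _ in range(n)]
      for _ in range(m)]

def col(j):
    return [Pi[r][j] for r in range(m)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# empirical coherence over some column pairs (condition (2) of Definition 1)
coherence = max(abs(dot(col(i), col(j)))
                for i in range(30) for j in range(30) if i != j)
assert coherence <= eps

# a sparse vector, as might result from turnstile updates, and its sketch
x = [0.0] * n
x[3], x[7] = 5.0, -2.0
y = [dot(Pi[r], x) for r in range(m)]   # y = Pi x

# recovery A_Pi(y) = Pi^T y; Theorem 4 promises |x~_i - x_i| <= eps * ||x||_1
x_tilde = [dot(col(j), y) for j in range(n)]
l1 = sum(abs(v) for v in x)
assert max(abs(a - b) for a, b in zip(x_tilde, x)) <= eps * l1
```

Note that the guarantee is deterministic once a single good Π is fixed; the randomness here only plays the role of the existence argument in Lemma 1.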

3 Faster JL

Typically we have some high-dimensional computational geometry problem, and we use JL to speed up our algorithm in two steps: (1) apply a JL map Π to reduce the problem to low dimension m, then (2) solve the lower-dimensional problem. As m is made smaller, typically (2) becomes faster. However, ideally we would also like step (1) to be as fast as possible. In this section, we investigate two approaches to speed up the computation of Πx.

One of the analyses will make use of the following Bernstein bound.

Theorem 5 (Bernstein's inequality). Let X_1, ..., X_n be independent random variables that are each at most K in magnitude almost surely, and where Σ_{i=1}^n E(X_i − E X_i)^2 = σ^2. Then for all p ≥ 1,

    ‖Σ_{i=1}^n X_i − E Σ_{i=1}^n X_i‖_p ≲ σ√p + Kp.

Proof. Let r_1, ..., r_n be independent Rademachers. Then

    ‖Σ_i (X_i − E X_i)‖_p ≲ ‖Σ_i r_i X_i‖_p                                  (symmetrization)
                          ≲ √p · ‖(Σ_i X_i^2)^{1/2}‖_p                       (Khintchine)    (1)
                          = √p · ‖Σ_i X_i^2‖_{p/2}^{1/2}
                          ≤ √p · ‖Σ_i X_i^2‖_p^{1/2}
                          ≤ σ√p + √p · ‖Σ_i X_i^2 − E Σ_i X_i^2‖_p^{1/2}     (triangle inequality)
                          ≲ σ√p + √p · ‖Σ_i r_i X_i^2‖_p^{1/2}               (symmetrization)
                          ≲ σ√p + p^{3/4} · ‖(Σ_i X_i^4)^{1/2}‖_p^{1/2}      (Khintchine)
                          ≤ σ√p + p^{3/4} · K^{1/2} · ‖(Σ_i X_i^2)^{1/2}‖_p^{1/2}    (2)

Defining E = ‖(Σ_i X_i^2)^{1/2}‖_p^{1/2} and comparing (1) with (2), for some constant C > 0,

    E^2 − C p^{1/4} K^{1/2} E − Cσ ≤ 0.

Thus E must be smaller than the larger root of this quadratic equation, implying our desired upper bound on E^2.

3.1 Sparse JL

One natural way to speed up JL is to make Π sparse. If Π has s non-zero entries per column, then Πx can be computed in time O(s · ‖x‖_0), where ‖x‖_0 = |{i : x_i ≠ 0}|. The goal is then to make s and m as small as possible.

The following matrix Π was introduced in [CCF04], and it was analyzed for DJL in [TZ12]. In this construction, one picks a hash function h : [n] → [m] from a pairwise independent family, and a function σ : [n] → {−1, 1} from a 4-wise independent family. Then for each i ∈ [n], Π_{h(i),i} = σ(i), and the rest of the ith column is 0. It was shown in [TZ12] that this distribution provides DJL for m ≳ 1/(ε^2 δ). Note that s = 1 as described here. The analysis is simply via Chebyshev's inequality, after doing an expectation and variance calculation.

The reason for the poor dependence of m on the failure probability δ is that we use Chebyshev's inequality. This was avoided yesterday by using Hanson-Wright, i.e. a bound on the p-norms of quadratic forms. Recall that a bound on p-norms gives tail bounds via Markov's inequality, and if one unrolls yesterday's proof fully, one finds that it obtained failure probability δ by using the Hanson-Wright p-norm bound with p = Θ(log(1/δ)). That is to say, the improvement came from bounding a higher moment than p = 2 (i.e. Chebyshev).

To improve the dependence of m on 1/δ, we allow ourselves to increase s. Here we analyze the Sparse JL Transform (SJLT) [KN14]. This is a JL distribution over Π having exactly s non-zero entries per column. As previously, we assume x ∈ R^n has ‖x‖_2 = 1. Our random Π ∈ R^{m×n} satisfies Π_{r,i} = η_{r,i} σ_{r,i}/√s for some integer 1 ≤ s ≤ m. The σ_{r,i} are independent Rademachers. The η_{r,i} are Bernoulli random variables satisfying:

• For all r, i: E η_{r,i} = s/m.

• For any i: Σ_{r=1}^m η_{r,i} = s. That is, each column of Π has exactly s non-zero entries.

• The η_{r,i} are negatively correlated. That is, for any subset S of [m] × [n], we have

    E ∏_{(r,i) ∈ S} η_{r,i} ≤ ∏_{(r,i) ∈ S} E η_{r,i} = (s/m)^{|S|}.

We would like to show the following, which is the main theorem of [KN14].

Theorem 6. As long as m ≳ ε^{-2} log(1/δ) and s ≳ εm,

    ∀ x : ‖x‖_2 = 1,  P_Π(|‖Πx‖_2^2 − 1| > ε) < δ.    (3)

Proof. Abusing notation and treating σ as an mn-dimensional vector,

    Z := ‖Πx‖_2^2 − 1 = (1/s) Σ_{r=1}^m Σ_{i≠j} η_{r,i} η_{r,j} σ_{r,i} σ_{r,j} x_i x_j = σ^T A_{x,η} σ.

Thus by Hanson-Wright, ‖Z‖_p ≲ √p · ‖A_{x,η}‖_F + p · ‖A_{x,η}‖ for each fixed η, and hence, taking p-norms over η as well,

    ‖Z‖_p ≲ √p · ‖ ‖A_{x,η}‖_F ‖_p + p · ‖ ‖A_{x,η}‖ ‖_p.

A_{x,η} is a block diagonal matrix with m blocks, where the rth block is (1/s) x^{(r)} (x^{(r)})^T but with the diagonal zeroed out. Here x^{(r)} is the vector with (x^{(r)})_i = η_{r,i} x_i. Now we just need to bound ‖‖A_{x,η}‖_F‖_p and ‖‖A_{x,η}‖‖_p.

Since A_{x,η} is block-diagonal, its operator norm is the largest operator norm of any block. The largest eigenvalue of the rth block is at most (1/s) · max{‖x^{(r)}‖_2^2, ‖x^{(r)}‖_∞^2} ≤ 1/s, and thus ‖A_{x,η}‖ ≤ 1/s with probability 1.

Next, define Q_{i,j} = Σ_{r=1}^m η_{r,i} η_{r,j}, so that

    ‖A_{x,η}‖_F^2 = (1/s^2) Σ_{i≠j} x_i^2 x_j^2 Q_{i,j}.

We will show, for p ≈ s^2/m, that ‖Q_{i,j}‖_p ≲ p for all i ≠ j, where we take the p-norm over η. Therefore for this p,

    ‖ ‖A_{x,η}‖_F ‖_p = ‖ ‖A_{x,η}‖_F^2 ‖_{p/2}^{1/2}
                      ≤ ((1/s^2) Σ_{i≠j} x_i^2 x_j^2 ‖Q_{i,j}‖_p)^{1/2}    (triangle inequality)

    ≲ √p / s ≲ 1/√m,

since ‖Q_{i,j}‖_p ≲ p, Σ_{i≠j} x_i^2 x_j^2 ≤ 1, and p ≈ s^2/m. Then by Markov's inequality and the settings of p, s, m,

    P(|‖Πx‖_2^2 − 1| > ε) = P(|σ^T A_{x,η} σ| > ε) ≤ ε^{-p} ‖Z‖_p^p ≤ ε^{-p} C^p ((p/m)^{p/2} + (p/s)^p) < δ.

We now show ‖Q_{i,j}‖_p ≲ p, for which we use Bernstein's inequality. Suppose η_{a_1,i}, ..., η_{a_s,i} are all 1, where a_1 < a_2 < ... < a_s. Now, note Q_{i,j} can be written as Σ_{t=1}^s Y_t, where Y_t is an indicator random variable for the event that η_{a_t,j} = 1. The Y_t are not independent, but for any integer p ≥ 1 their pth moment is upper bounded by the case that the Y_t are independent Bernoullis, each of expectation s/m (this can be seen by simply expanding (Σ_t Y_t)^p and comparing with the independent Bernoulli case monomial by monomial in the expansion). Thus Bernstein applies, and as desired we have

    ‖Q_{i,j}‖_p = ‖Σ_t Y_t‖_p ≲ √(s^2/m) · √p + p ≲ p,

using p ≳ s^2/m in the last step.

There are two natural distributions where η satisfies the conditions for the SJLT. In the first, the columns are independent, and for each column i the vector (η_{1,i}, ..., η_{m,i}) is chosen uniformly at random from the C(m, s) vectors in {0,1}^m having weight exactly s. A second distribution is the CountSketch of [CCF04]. In this distribution, we assume s divides m, and the rows are partitioned arbitrarily into s blocks, each of equal size m/s (e.g. the first m/s rows, then the next m/s rows, etc.). For each column i and each block b ∈ {0, ..., s−1}, with corresponding η(b,i) = (η_{bm/s+1,i}, ..., η_{(b+1)m/s,i}), we set η(b,i) = e_j ∈ R^{m/s} for a uniformly random j ∈ [m/s]. This is done independently across all (b, i) pairs.

3.2 FFT-based approach

Another approach for obtaining fast JL was investigated by Ailon and Chazelle [AC09]. This approach gives a running time to compute Πx of roughly O(n log n), which is faster than the sparse JL approach when x is sufficiently dense. Although we did not cover this approach in lecture today, I am including a description here. They called their transformation the Fast Johnson-Lindenstrauss Transform (FJLT). A construction similar to theirs, which we will analyze here, is the m × n matrix Π defined as

    Π = (1/√m) · S H D    (4)

where S is an m × n sampling matrix with replacement (each row has a 1 in a uniformly random location and zeroes elsewhere, and the rows are independent), H is an unnormalized bounded orthonormal system, and D = diag(α) for a vector α of n independent Rademachers. An unnormalized bounded orthonormal system is a matrix H ∈ R^{n×n} such that H^T H = nI and max_{i,j} |H_{i,j}| ≤ 1. For example, H can be the unnormalized Fourier matrix or Hadamard matrix. The original FJLT replaced S with a random sparse matrix P, which has certain advantages; see Remark 1.

The motivation for the construction (4) is speed: D can be applied in O(n) time, H in O(n log n) time (e.g. using the Fast Fourier Transform), and S in O(m) time. Thus, overall, applying Π to any fixed vector x takes O(n log n) time. Compare this with using a dense matrix of Rademachers, which takes O(mn) time to apply.

We will show that for m ≳ ε^{-2} log(1/δ) log(1/(εδ)), the random Π described in (4) provides DJL. In fact we will analyze a slightly different construction in which S is replaced by an n × n diagonal matrix S_η = diag(η), where the entries of η ∈ {0,1}^n are independent with E η_i = m/n (so Π has m non-zero rows in expectation). The proof to analyze the Π from (4) is essentially identical. The proof we provide here is an adaptation of the proof of a more general theorem [CNW15, Theorem 9] to the current scenario.

Theorem 7. Let x ∈ R^n be an arbitrary unit norm vector, and suppose 0 < ε, δ < 1/2. Also let Π = (1/√m) S_η H D as described above, with a number of rows equal to m ≳ ε^{-2} log(1/δ) log(1/(εδ)). Then P_Π(|‖Πx‖_2^2 − 1| > ε) < δ.

Proof. We use the moment method. Let η' be an independent copy of η, and let σ ∈ {−1, 1}^n be uniformly random. Write z = HDx, so that ‖Πx‖_2^2 = (1/m) Σ_i η_i z_i^2. Then

    ‖ (1/m) Σ_{i=1}^n η_i z_i^2 − 1 ‖_p = (1/m) · ‖ Σ_i η_i z_i^2 − m ‖_{L^p(η) L^p(α)}    (5)

    ≤ (2/m) · ‖ Σ_i σ_i η_i z_i^2 ‖_{L^p(η) L^p(α)}    (symmetrization)
    ≲ (√p/m) · ‖ (Σ_i η_i z_i^4)^{1/2} ‖_p    (Khintchine)
    ≤ (√p/m) · ‖ (max_i η_i z_i^2)^{1/2} · (Σ_i η_i z_i^2)^{1/2} ‖_p
    ≤ (√p/m) · ‖ max_i η_i z_i^2 ‖_p^{1/2} · ‖ Σ_i η_i z_i^2 ‖_p^{1/2}    (Cauchy-Schwarz)
    ≤ √(p/m) · ‖ max_i η_i z_i^2 ‖_p^{1/2} · ( ‖ (1/m) Σ_i η_i z_i^2 − 1 ‖_p^{1/2} + 1 )    (triangle inequality)

We will now bound ‖max_i η_i z_i^2‖_p^{1/2}. Define q = max{p, log m} and note p ≤ q. Then

    ‖ max_i η_i z_i^2 ‖_q = ( E_{α,η} max_i η_i z_i^{2q} )^{1/q}
                          ≤ ( Σ_{i=1}^n E_{α,η} η_i z_i^{2q} )^{1/q}
                          ≤ ( n · max_i (E_η η_i)(E_α z_i^{2q}) )^{1/q}    (α, η independent)
                          = ( m · max_i E_α z_i^{2q} )^{1/q}    (6)
                          ≤ 2 · max_i ‖z_i‖_{2q}^2    (m^{1/q} ≤ 2 by choice of q)
                          ≲ q    (Khintchine)    (7)

Eq. (7) uses that H is an unnormalized bounded orthonormal system. Defining E = ‖(1/m) Σ_i η_i z_i^2 − 1‖_p^{1/2} and combining (5), (6), (7), we find that for some constant C > 0,

    E^2 − C √(pq/m) · E − C √(pq/m) ≤ 0,

implying E^2 ≲ max{√(pq/m), pq/m}. By Markov's inequality,

    P(|‖Πx‖_2^2 − 1| > ε) ≤ ε^{-p} · E^{2p},

and thus to achieve the theorem statement it suffices to set p = log(1/δ) and then choose m ≳ ε^{-2} log(1/δ) log(m/δ).

Remark 1. Note that the FJLT as analyzed above provides suboptimal m. If one desires optimal m, one can instead use the embedding matrix Π' Π, where Π is the FJLT and Π' is, say, a dense matrix with Rademacher entries having the optimal m' = O(ε^{-2} log(1/δ)) rows. The downside is that the runtime to apply our embedding worsens by an additive m' · m term. [AC09] slightly improved this additive term (by an ε^{-2} multiplicative factor) by replacing the matrix S with a random sparse matrix P.

Remark 2. The usual analysis for the FJLT, such as the approach in [AC09], would achieve a bound on m of O(ε^{-2} log(1/δ) log(n/δ)). Such analyses operate by, using the notation of the proof of Theorem 7, first conditioning on ‖z‖_∞ ≲ √(log(n/δ)) (which happens with probability at least 1 − δ/2 by the Khintchine inequality), then finishing the proof using Bernstein's inequality. In our proof above, we improved this dependence on n to a dependence on the smaller quantity m by avoiding any such conditioning.

References

[AC09] Nir Ailon and Bernard Chazelle. The fast Johnson-Lindenstrauss transform and approximate nearest neighbors. SIAM J. Comput., 39(1), 2009.

[Alo03] Noga Alon. Problems and results in extremal combinatorics I. Discrete Mathematics, 273(1-3):31-53, 2003.

[CCF04] Moses Charikar, Kevin C. Chen, and Martin Farach-Colton. Finding frequent items in data streams. Theor. Comput. Sci., 312(1):3-15, 2004.

[CNW15] Michael B. Cohen, Jelani Nelson, and David P. Woodruff. Optimal approximate matrix product in terms of stable rank. CoRR, 2015.

[JW13] T. S. Jayram and David P. Woodruff. Optimal bounds for Johnson-Lindenstrauss transforms and streaming problems with subconstant error. ACM Transactions on Algorithms, 9(3):26, 2013.

[KMN11] Daniel M. Kane, Raghu Meka, and Jelani Nelson. Almost optimal explicit Johnson-Lindenstrauss families. In RANDOM, 2011.

[KN14] Daniel M. Kane and Jelani Nelson. Sparser Johnson-Lindenstrauss transforms. Journal of the ACM, 61(1):4, 2014.

[LN14] Kasper Green Larsen and Jelani Nelson. The Johnson-Lindenstrauss lemma is optimal for linear dimensionality reduction. CoRR, 2014.

[NNW14] Jelani Nelson, Huy L. Nguyễn, and David P. Woodruff. On deterministic sketching and streaming for sparse recovery and norm estimation. Linear Algebra and its Applications, Special Issue on Sparse Approximate Solution of Linear Systems, 441, 2014.

[TZ12] Mikkel Thorup and Yin Zhang. Tabulation-based 5-independent hashing with applications to linear probing and second moment estimation. SIAM J. Comput., 41(2), 2012.
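To make the FFT-based construction (4) from Section 3.2 concrete, here is a small runnable sketch of my own (not part of the original notes). It uses an unnormalized Hadamard matrix as the bounded orthonormal system, applied via the fast Walsh-Hadamard transform in O(n log n) time, and checks that ‖Πx‖_2^2 concentrates around ‖x‖_2^2 = 1; the sizes n, m are illustrative and not tuned to the theorem's constants.

```python
import random, math

random.seed(7)

def fwht(a):
    """In-place unnormalized fast Walsh-Hadamard transform; len(a) must be a
    power of two. The underlying matrix H satisfies H^T H = n*I, |H_ij| = 1."""
    h = 1
    while h < len(a):
        for i in range(0, len(a), 2 * h):
            for j in range(i, i + h):
                u, v = a[j], a[j + h]
                a[j], a[j + h] = u + v, u - v
        h *= 2
    return a

n, m = 1024, 256   # n a power of two; m illustrative

# arbitrary unit-norm input x
x = [random.gauss(0.0, 1.0) for _ in range(n)]
nrm = math.sqrt(sum(v * v for v in x))
x = [v / nrm for v in x]

# Pi = (1/sqrt(m)) S H D: random signs D, Hadamard H, then sample m rows
alpha = [random.choice((-1.0, 1.0)) for _ in range(n)]   # D = diag(alpha)
z = fwht([a * v for a, v in zip(alpha, x)])              # z = H D x in O(n log n)
rows = [random.randrange(n) for _ in range(m)]           # S: m samples w/ replacement
Pix = [z[i] / math.sqrt(m) for i in rows]

# ||z||_2^2 = n exactly since H^T H = nI, so E ||Pi x||_2^2 = ||x||_2^2 = 1
est = sum(v * v for v in Pix)
assert abs(sum(v * v for v in z) - n) < 1e-6 * n
assert abs(est - 1.0) < 0.4
```

The randomization by D is what makes the coordinates of z flat (each E z_i^2 = 1), so that uniform sampling by S is an unbiased and well-concentrated estimator of the norm.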


Finding Primitive Roots Pseudo-Deterministically Electronc Colloquum on Computatonal Complexty, Report No 207 (205) Fndng Prmtve Roots Pseudo-Determnstcally Ofer Grossman December 22, 205 Abstract Pseudo-determnstc algorthms are randomzed search algorthms

More information

Linear Approximation with Regularization and Moving Least Squares

Linear Approximation with Regularization and Moving Least Squares Lnear Approxmaton wth Regularzaton and Movng Least Squares Igor Grešovn May 007 Revson 4.6 (Revson : March 004). 5 4 3 0.5 3 3.5 4 Contents: Lnear Fttng...4. Weghted Least Squares n Functon Approxmaton...

More information

Section 8.3 Polar Form of Complex Numbers

Section 8.3 Polar Form of Complex Numbers 80 Chapter 8 Secton 8 Polar Form of Complex Numbers From prevous classes, you may have encountered magnary numbers the square roots of negatve numbers and, more generally, complex numbers whch are the

More information

Lecture 3. Ax x i a i. i i

Lecture 3. Ax x i a i. i i 18.409 The Behavor of Algorthms n Practce 2/14/2 Lecturer: Dan Spelman Lecture 3 Scrbe: Arvnd Sankar 1 Largest sngular value In order to bound the condton number, we need an upper bound on the largest

More information

1 Matrix representations of canonical matrices

1 Matrix representations of canonical matrices 1 Matrx representatons of canoncal matrces 2-d rotaton around the orgn: ( ) cos θ sn θ R 0 = sn θ cos θ 3-d rotaton around the x-axs: R x = 1 0 0 0 cos θ sn θ 0 sn θ cos θ 3-d rotaton around the y-axs:

More information

E Tail Inequalities. E.1 Markov s Inequality. Non-Lecture E: Tail Inequalities

E Tail Inequalities. E.1 Markov s Inequality. Non-Lecture E: Tail Inequalities Algorthms Non-Lecture E: Tal Inequaltes If you hold a cat by the tal you learn thngs you cannot learn any other way. Mar Twan E Tal Inequaltes The smple recursve structure of sp lsts made t relatvely easy

More information

More metrics on cartesian products

More metrics on cartesian products More metrcs on cartesan products If (X, d ) are metrc spaces for 1 n, then n Secton II4 of the lecture notes we defned three metrcs on X whose underlyng topologes are the product topology The purpose of

More information

Supplement to Clustering with Statistical Error Control

Supplement to Clustering with Statistical Error Control Supplement to Clusterng wth Statstcal Error Control Mchael Vogt Unversty of Bonn Matthas Schmd Unversty of Bonn In ths supplement, we provde the proofs that are omtted n the paper. In partcular, we derve

More information

Lecture 3: Probability Distributions

Lecture 3: Probability Distributions Lecture 3: Probablty Dstrbutons Random Varables Let us begn by defnng a sample space as a set of outcomes from an experment. We denote ths by S. A random varable s a functon whch maps outcomes nto the

More information

ANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U)

ANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U) Econ 413 Exam 13 H ANSWERS Settet er nndelt 9 deloppgaver, A,B,C, som alle anbefales å telle lkt for å gøre det ltt lettere å stå. Svar er gtt . Unfortunately, there s a prntng error n the hnt of

More information

THE CHINESE REMAINDER THEOREM. We should thank the Chinese for their wonderful remainder theorem. Glenn Stevens

THE CHINESE REMAINDER THEOREM. We should thank the Chinese for their wonderful remainder theorem. Glenn Stevens THE CHINESE REMAINDER THEOREM KEITH CONRAD We should thank the Chnese for ther wonderful remander theorem. Glenn Stevens 1. Introducton The Chnese remander theorem says we can unquely solve any par of

More information

APPENDIX A Some Linear Algebra

APPENDIX A Some Linear Algebra APPENDIX A Some Lnear Algebra The collecton of m, n matrces A.1 Matrces a 1,1,..., a 1,n A = a m,1,..., a m,n wth real elements a,j s denoted by R m,n. If n = 1 then A s called a column vector. Smlarly,

More information

The Expectation-Maximization Algorithm

The Expectation-Maximization Algorithm The Expectaton-Maxmaton Algorthm Charles Elan elan@cs.ucsd.edu November 16, 2007 Ths chapter explans the EM algorthm at multple levels of generalty. Secton 1 gves the standard hgh-level verson of the algorthm.

More information

Bézier curves. Michael S. Floater. September 10, These notes provide an introduction to Bézier curves. i=0

Bézier curves. Michael S. Floater. September 10, These notes provide an introduction to Bézier curves. i=0 Bézer curves Mchael S. Floater September 1, 215 These notes provde an ntroducton to Bézer curves. 1 Bernsten polynomals Recall that a real polynomal of a real varable x R, wth degree n, s a functon of

More information

Report on Image warping

Report on Image warping Report on Image warpng Xuan Ne, Dec. 20, 2004 Ths document summarzed the algorthms of our mage warpng soluton for further study, and there s a detaled descrpton about the mplementaton of these algorthms.

More information

Difference Equations

Difference Equations Dfference Equatons c Jan Vrbk 1 Bascs Suppose a sequence of numbers, say a 0,a 1,a,a 3,... s defned by a certan general relatonshp between, say, three consecutve values of the sequence, e.g. a + +3a +1

More information

The Multiple Classical Linear Regression Model (CLRM): Specification and Assumptions. 1. Introduction

The Multiple Classical Linear Regression Model (CLRM): Specification and Assumptions. 1. Introduction ECONOMICS 5* -- NOTE (Summary) ECON 5* -- NOTE The Multple Classcal Lnear Regresson Model (CLRM): Specfcaton and Assumptons. Introducton CLRM stands for the Classcal Lnear Regresson Model. The CLRM s also

More information

MATH 241B FUNCTIONAL ANALYSIS - NOTES EXAMPLES OF C ALGEBRAS

MATH 241B FUNCTIONAL ANALYSIS - NOTES EXAMPLES OF C ALGEBRAS MATH 241B FUNCTIONAL ANALYSIS - NOTES EXAMPLES OF C ALGEBRAS These are nformal notes whch cover some of the materal whch s not n the course book. The man purpose s to gve a number of nontrval examples

More information

Poisson brackets and canonical transformations

Poisson brackets and canonical transformations rof O B Wrght Mechancs Notes osson brackets and canoncal transformatons osson Brackets Consder an arbtrary functon f f ( qp t) df f f f q p q p t But q p p where ( qp ) pq q df f f f p q q p t In order

More information

COS 511: Theoretical Machine Learning

COS 511: Theoretical Machine Learning COS 5: Theoretcal Machne Learnng Lecturer: Rob Schapre Lecture #0 Scrbe: José Sões Ferrera March 06, 203 In the last lecture the concept of Radeacher coplexty was ntroduced, wth the goal of showng that

More information

Lecture Notes on Linear Regression

Lecture Notes on Linear Regression Lecture Notes on Lnear Regresson Feng L fl@sdueducn Shandong Unversty, Chna Lnear Regresson Problem In regresson problem, we am at predct a contnuous target value gven an nput feature vector We assume

More information

Example: (13320, 22140) =? Solution #1: The divisors of are 1, 2, 3, 4, 5, 6, 9, 10, 12, 15, 18, 20, 27, 30, 36, 41,

Example: (13320, 22140) =? Solution #1: The divisors of are 1, 2, 3, 4, 5, 6, 9, 10, 12, 15, 18, 20, 27, 30, 36, 41, The greatest common dvsor of two ntegers a and b (not both zero) s the largest nteger whch s a common factor of both a and b. We denote ths number by gcd(a, b), or smply (a, b) when there s no confuson

More information

Norms, Condition Numbers, Eigenvalues and Eigenvectors

Norms, Condition Numbers, Eigenvalues and Eigenvectors Norms, Condton Numbers, Egenvalues and Egenvectors 1 Norms A norm s a measure of the sze of a matrx or a vector For vectors the common norms are: N a 2 = ( x 2 1/2 the Eucldean Norm (1a b 1 = =1 N x (1b

More information

First day August 1, Problems and Solutions

First day August 1, Problems and Solutions FOURTH INTERNATIONAL COMPETITION FOR UNIVERSITY STUDENTS IN MATHEMATICS July 30 August 4, 997, Plovdv, BULGARIA Frst day August, 997 Problems and Solutons Problem. Let {ε n } n= be a sequence of postve

More information

Hashing. Alexandra Stefan

Hashing. Alexandra Stefan Hashng Alexandra Stefan 1 Hash tables Tables Drect access table (or key-ndex table): key => ndex Hash table: key => hash value => ndex Man components Hash functon Collson resoluton Dfferent keys mapped

More information

1 The Mistake Bound Model

1 The Mistake Bound Model 5-850: Advanced Algorthms CMU, Sprng 07 Lecture #: Onlne Learnng and Multplcatve Weghts February 7, 07 Lecturer: Anupam Gupta Scrbe: Bryan Lee,Albert Gu, Eugene Cho he Mstake Bound Model Suppose there

More information

Complete subgraphs in multipartite graphs

Complete subgraphs in multipartite graphs Complete subgraphs n multpartte graphs FLORIAN PFENDER Unverstät Rostock, Insttut für Mathematk D-18057 Rostock, Germany Floran.Pfender@un-rostock.de Abstract Turán s Theorem states that every graph G

More information

11 Tail Inequalities Markov s Inequality. Lecture 11: Tail Inequalities [Fa 13]

11 Tail Inequalities Markov s Inequality. Lecture 11: Tail Inequalities [Fa 13] Algorthms Lecture 11: Tal Inequaltes [Fa 13] If you hold a cat by the tal you learn thngs you cannot learn any other way. Mark Twan 11 Tal Inequaltes The smple recursve structure of skp lsts made t relatvely

More information

Generalized Linear Methods

Generalized Linear Methods Generalzed Lnear Methods 1 Introducton In the Ensemble Methods the general dea s that usng a combnaton of several weak learner one could make a better learner. More formally, assume that we have a set

More information

Dr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur

Dr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur Analyss of Varance and Desgn of Exerments-I MODULE III LECTURE - 2 EXPERIMENTAL DESIGN MODELS Dr. Shalabh Deartment of Mathematcs and Statstcs Indan Insttute of Technology Kanur 2 We consder the models

More information

Outline and Reading. Dynamic Programming. Dynamic Programming revealed. Computing Fibonacci. The General Dynamic Programming Technique

Outline and Reading. Dynamic Programming. Dynamic Programming revealed. Computing Fibonacci. The General Dynamic Programming Technique Outlne and Readng Dynamc Programmng The General Technque ( 5.3.2) -1 Knapsac Problem ( 5.3.3) Matrx Chan-Product ( 5.3.1) Dynamc Programmng verson 1.4 1 Dynamc Programmng verson 1.4 2 Dynamc Programmng

More information

Module 3 LOSSY IMAGE COMPRESSION SYSTEMS. Version 2 ECE IIT, Kharagpur

Module 3 LOSSY IMAGE COMPRESSION SYSTEMS. Version 2 ECE IIT, Kharagpur Module 3 LOSSY IMAGE COMPRESSION SYSTEMS Verson ECE IIT, Kharagpur Lesson 6 Theory of Quantzaton Verson ECE IIT, Kharagpur Instructonal Objectves At the end of ths lesson, the students should be able to:

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 12 10/21/2013. Martingale Concentration Inequalities and Applications

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 12 10/21/2013. Martingale Concentration Inequalities and Applications MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.65/15.070J Fall 013 Lecture 1 10/1/013 Martngale Concentraton Inequaltes and Applcatons Content. 1. Exponental concentraton for martngales wth bounded ncrements.

More information

Exercises of Chapter 2

Exercises of Chapter 2 Exercses of Chapter Chuang-Cheh Ln Department of Computer Scence and Informaton Engneerng, Natonal Chung Cheng Unversty, Mng-Hsung, Chay 61, Tawan. Exercse.6. Suppose that we ndependently roll two standard

More information

Deriving the X-Z Identity from Auxiliary Space Method

Deriving the X-Z Identity from Auxiliary Space Method Dervng the X-Z Identty from Auxlary Space Method Long Chen Department of Mathematcs, Unversty of Calforna at Irvne, Irvne, CA 92697 chenlong@math.uc.edu 1 Iteratve Methods In ths paper we dscuss teratve

More information

a b a In case b 0, a being divisible by b is the same as to say that

a b a In case b 0, a being divisible by b is the same as to say that Secton 6.2 Dvsblty among the ntegers An nteger a ε s dvsble by b ε f there s an nteger c ε such that a = bc. Note that s dvsble by any nteger b, snce = b. On the other hand, a s dvsble by only f a = :

More information

FACTORIZATION IN KRULL MONOIDS WITH INFINITE CLASS GROUP

FACTORIZATION IN KRULL MONOIDS WITH INFINITE CLASS GROUP C O L L O Q U I U M M A T H E M A T I C U M VOL. 80 1999 NO. 1 FACTORIZATION IN KRULL MONOIDS WITH INFINITE CLASS GROUP BY FLORIAN K A I N R A T H (GRAZ) Abstract. Let H be a Krull monod wth nfnte class

More information

COMPLEX NUMBERS AND QUADRATIC EQUATIONS

COMPLEX NUMBERS AND QUADRATIC EQUATIONS COMPLEX NUMBERS AND QUADRATIC EQUATIONS INTRODUCTION We know that x 0 for all x R e the square of a real number (whether postve, negatve or ero) s non-negatve Hence the equatons x, x, x + 7 0 etc are not

More information

Negative Binomial Regression

Negative Binomial Regression STATGRAPHICS Rev. 9/16/2013 Negatve Bnomal Regresson Summary... 1 Data Input... 3 Statstcal Model... 3 Analyss Summary... 4 Analyss Optons... 7 Plot of Ftted Model... 8 Observed Versus Predcted... 10 Predctons...

More information

ISSN: ISO 9001:2008 Certified International Journal of Engineering and Innovative Technology (IJEIT) Volume 3, Issue 1, July 2013

ISSN: ISO 9001:2008 Certified International Journal of Engineering and Innovative Technology (IJEIT) Volume 3, Issue 1, July 2013 ISSN: 2277-375 Constructon of Trend Free Run Orders for Orthogonal rrays Usng Codes bstract: Sometmes when the expermental runs are carred out n a tme order sequence, the response can depend on the run

More information