Supplement to Clustering with Statistical Error Control


Michael Vogt (University of Bonn) and Matthias Schmid (University of Bonn)

In this supplement, we provide the proofs that are omitted in the paper. In particular, we derive the theorems from Section 4. Throughout the supplement, we use the symbol C to denote a universal real constant which may take a different value on each occurrence.

Auxiliary results

In the proofs of the theorems, we frequently make use of the following uniform convergence result.

Lemma S.1. Let Z_s = {Z_{st} : 1 ≤ t ≤ T} be sequences of real-valued random variables for 1 ≤ s ≤ S with the following properties: for each s, the random variables Z_{s1}, ..., Z_{sT} are independent of each other, and E[Z_{st}] = 0 and E[|Z_{st}|^φ] ≤ C < ∞ for some φ > 2 and C > 0 that depend neither on s nor on t. Suppose that S = T^q with 0 ≤ q < φ/2 − 1. Then

  P( max_{1≤s≤S} | (1/T) Σ_{t=1}^T Z_{st} | > T^{−1/2+η} ) = o(1),

where the constant η > 0 can be chosen as small as desired.

Proof of Lemma S.1. Define the truncation level τ_{S,T} = (ST)^{1/[2(1+δ)(q+1)]} with some sufficiently small δ > 0. In particular, let δ > 0 be so small that 2(1+δ)(q+1) < φ. Moreover, set

  Z_{st}^≤ = Z_{st} 1(|Z_{st}| ≤ τ_{S,T}) − E[ Z_{st} 1(|Z_{st}| ≤ τ_{S,T}) ]
  Z_{st}^> = Z_{st} 1(|Z_{st}| > τ_{S,T}) − E[ Z_{st} 1(|Z_{st}| > τ_{S,T}) ]

and write Z_{st} = Z_{st}^≤ + Z_{st}^>.
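As a quick numerical sanity check of Lemma S.1 (an illustration only, not part of the argument): for S = T^q rows of T independent centred variables with enough moments, the largest absolute sample mean stays below T^{−1/2+η}. All concrete values here (T, q, η, the Gaussian choice for Z_{st}) are illustrative assumptions.

```python
# Illustration of Lemma S.1: max over s <= S of |(1/T) sum_t Z_st|
# stays below T^(-1/2+eta).  Gaussian Z_st chosen for convenience;
# the lemma only requires phi > 2 moments.
import numpy as np

rng = np.random.default_rng(0)
T = 2000
q = 0.5                          # S = T^q with q < phi/2 - 1
S = int(T ** q)                  # here S = 44
eta = 0.25                       # small eta > 0
Z = rng.standard_normal((S, T))  # E[Z_st] = 0, all moments finite
max_abs_mean = np.abs(Z.mean(axis=1)).max()
threshold = T ** (-0.5 + eta)
print(max_abs_mean < threshold)  # True with overwhelming probability
```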

In what follows, we show that

  P( max_{1≤s≤S} | (1/T) Σ_{t=1}^T Z_{st}^> | > C T^{−1/2+η} ) = o(1)        (S.1)
  P( max_{1≤s≤S} | (1/T) Σ_{t=1}^T Z_{st}^≤ | > C T^{−1/2+η} ) = o(1)        (S.2)

for any fixed constant C > 0. Combining (S.1) and (S.2) immediately yields the statement of Lemma S.1.

We start with the proof of (S.1): It holds that

  P( max_{1≤s≤S} | (1/T) Σ_{t=1}^T Z_{st}^> | > C T^{−1/2+η} ) ≤ Q_1^> + Q_2^>,

where

  Q_1^> := Σ_{s=1}^S P( | (1/T) Σ_{t=1}^T Z_{st} 1(|Z_{st}| > τ_{S,T}) | > (C/2) T^{−1/2+η} )
        ≤ Σ_{s=1}^S P( |Z_{st}| > τ_{S,T} for some 1 ≤ t ≤ T )
        ≤ Σ_{s=1}^S Σ_{t=1}^T E[ |Z_{st}|^φ ] / τ_{S,T}^φ ≤ C S T / τ_{S,T}^φ = o(1)

and

  Q_2^> := Σ_{s=1}^S P( | (1/T) Σ_{t=1}^T E[ Z_{st} 1(|Z_{st}| > τ_{S,T}) ] | > (C/2) T^{−1/2+η} ) = 0

for S and T sufficiently large, since

  | E[ Z_{st} 1(|Z_{st}| > τ_{S,T}) ] | ≤ τ_{S,T}^{−(φ−1)} E[ |Z_{st}|^φ 1(|Z_{st}| > τ_{S,T}) ] = o( T^{−1/2+η} ).

This yields (S.1).
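The proof of (S.2) rests on two elementary facts: Markov's exponential bound and the inequality exp(x) ≤ 1 + x + x² for |x| ≤ 1/2, applied to the truncated (hence bounded) variables. Both can be checked numerically; the constants below are illustrative choices, not those of the proof.

```python
# Check of the two elementary inequalities behind the bound (S.3):
# (i)  exp(x) <= 1 + x + x^2 for |x| <= 1/2,
# (ii) Chernoff: P(sum_t Z_t > T*delta) <= exp(lam^2*T*E[Z^2] - lam*T*delta)
#      for bounded centred Z with lam*|Z| <= 1/2.
import numpy as np

rng = np.random.default_rng(1)

# (i) on a grid (small tolerance for floating point)
x = np.linspace(-0.5, 0.5, 1001)
ok = bool(np.all(np.exp(x) <= 1 + x + x**2 + 1e-12))

# (ii) with uniform variables bounded by tau, mimicking the truncation step
tau = 2.0
lam = 1 / (4 * tau)                       # so lam * |Z| <= 1/2
Z = rng.uniform(-tau, tau, size=(5000, 200))
T = Z.shape[1]
delta = 0.3
empirical = np.mean(Z.mean(axis=1) > delta)
chernoff = np.exp(lam**2 * T * np.mean(Z**2) - lam * T * delta)
print(ok, empirical <= chernoff)
```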

We next turn to the proof of (S.2): We apply the crude bound

  P( max_{1≤s≤S} | (1/T) Σ_{t=1}^T Z_{st}^≤ | > C T^{−1/2+η} ) ≤ Σ_{s=1}^S P( | (1/T) Σ_{t=1}^T Z_{st}^≤ | > C T^{−1/2+η} )

and show that for any 1 ≤ s ≤ S,

  P( | (1/T) Σ_{t=1}^T Z_{st}^≤ | > C T^{−1/2+η} ) ≤ C_0 T^{−ρ},        (S.3)

where C_0 is a fixed constant and ρ > 0 can be chosen as large as desired by picking η slightly larger than δ/[2(1+δ)]. Since S = O(T^q), this immediately implies (S.2).

To prove (S.3), we make use of the following facts:

(i) For a random variable Z and λ, δ > 0, Markov's inequality says that

  P( ±Z > δ ) ≤ E[ exp(±λZ) ] / exp(λδ).

(ii) Since |Z_{st}^≤| ≤ 2τ_{S,T}, it holds that λ_{S,T} |Z_{st}^≤| ≤ 1/2, where we set λ_{S,T} = 1/(4τ_{S,T}). As exp(x) ≤ 1 + x + x² for |x| ≤ 1/2, this implies that

  E[ exp(±λ_{S,T} Z_{st}^≤) ] ≤ 1 + λ_{S,T}² E[ (Z_{st}^≤)² ] ≤ exp( λ_{S,T}² E[ (Z_{st}^≤)² ] ).

(iii) By definition of λ_{S,T}, it holds that

  λ_{S,T}² = (1/16) (ST)^{−1/[(1+δ)(q+1)]} = (1/16) T^{−(q+1)/[(1+δ)(q+1)]} = (1/16) T^{−1/(1+δ)}.

Using (i)–(iii) and writing E[(Z_{st}^≤)²] ≤ C_Z < ∞, we obtain that

  P( | (1/T) Σ_{t=1}^T Z_{st}^≤ | > C T^{−1/2+η} )
    ≤ P( Σ_{t=1}^T Z_{st}^≤ > C T^{1/2+η} ) + P( −Σ_{t=1}^T Z_{st}^≤ > C T^{1/2+η} )
    ≤ exp( −λ_{S,T} C T^{1/2+η} ) { E[ exp( λ_{S,T} Σ_{t=1}^T Z_{st}^≤ ) ] + E[ exp( −λ_{S,T} Σ_{t=1}^T Z_{st}^≤ ) ] }

    ≤ 2 exp( −λ_{S,T} C T^{1/2+η} ) exp( λ_{S,T}² T E[ (Z_{st}^≤)² ] )
    = 2 exp( C_Z λ_{S,T}² T − C λ_{S,T} T^{1/2+η} )
    ≤ 2 exp( (C_Z/16) T^{δ/(1+δ)} − (C/4) T^{δ/[2(1+δ)] + η} ) ≤ C_0 T^{−ρ},

where ρ > 0 can be chosen arbitrarily large if we pick η slightly larger than δ/[2(1+δ)].

Proof of Theorem 4.1. We first prove that

  P( Ĥ^{[K_0]} ≤ q(α) ) = α + o(1).        (S.4)

To do so, we derive a stochastic expansion of the individual statistics ĥ_i^{[K_0]}.

Lemma S.2. It holds that ĥ_i^{[K_0]} = h_i^{[K_0]} + R_i^{[K_0]}, where

  h_i^{[K_0]} = (1/√p) Σ_{j=1}^p { ε_{ij}²/σ² − 1 } / κ

and the remainder R_i^{[K_0]} has the property that

  P( max_{1≤i≤n} | R_i^{[K_0]} | > p^{−ξ} ) = o(1)        (S.5)

for some ξ > 0.

The proof of Lemma S.2 as well as those of the subsequent Lemmas S.3–S.5 are postponed until the proof of Theorem 4.1 is complete. With the help of Lemma S.2, we can bound the probability of interest as follows: Since P(α) := P( Ĥ^{[K_0]} ≤ q(α) ) = P( max_{1≤i≤n} ĥ_i^{[K_0]} ≤ q(α) ) and

  max_{1≤i≤n} h_i^{[K_0]} − max_{1≤i≤n} |R_i^{[K_0]}| ≤ max_{1≤i≤n} ĥ_i^{[K_0]} ≤ max_{1≤i≤n} h_i^{[K_0]} + max_{1≤i≤n} |R_i^{[K_0]}|,

it holds that P_<(α) ≤ P(α) ≤ P_>(α), where

  P_<(α) = P( max_{1≤i≤n} h_i^{[K_0]} ≤ q(α) − max_{1≤i≤n} |R_i^{[K_0]}| )
  P_>(α) = P( max_{1≤i≤n} h_i^{[K_0]} ≤ q(α) + max_{1≤i≤n} |R_i^{[K_0]}| ).

As the remainder has the property (S.5), we further obtain that

  P_−(α) + o(1) ≤ P(α) ≤ P_+(α) + o(1),        (S.6)

where

  P_−(α) = P( max_{1≤i≤n} h_i^{[K_0]} ≤ q(α) − p^{−ξ} )
  P_+(α) = P( max_{1≤i≤n} h_i^{[K_0]} ≤ q(α) + p^{−ξ} ).

With the help of strong approximation theory, we can derive the following result on the asymptotic behaviour of the probabilities P_−(α) and P_+(α).

Lemma S.3. It holds that

  P_−(α) = α + o(1)
  P_+(α) = α + o(1).

Together with (S.6), this immediately yields that P(α) = α + o(1), thus completing the proof of (S.4).

We next show that for any K < K_0,

  P( Ĥ^{[K]} ≤ q(α) ) = o(1).        (S.7)

Consider a fixed K < K_0 and let S ∈ {Ĝ_k^{[K]} : 1 ≤ k ≤ K} be any cluster with the following property:

  #S ≥ n_min := min_{1≤k≤K_0} #G_k, and S contains elements from at least two different classes G_k and G_{k′}.        (S.8)

It is not difficult to see that a cluster with the property (S.8) must always exist under our conditions. By C ⊆ {Ĝ_k^{[K]} : 1 ≤ k ≤ K}, we denote the collection of clusters that have the property (S.8). With this notation at hand, we can derive the following stochastic expansion of the individual statistics ĥ_i^{[K]}.

Lemma S.4. For any i ∈ S and S ∈ C, it holds that

  ĥ_i^{[K]} = (1/(κσ²√p)) Σ_{j=1}^p d_{ij}² + r_i^{[K]},

where d_{ij} = μ_{ij} − (1/#S) Σ_{i′∈S} μ_{i′j} and the remainder r_i^{[K]} has the property that

  P( max_{S∈C} max_{i∈S} | r_i^{[K]} | > p^{1/2−ξ} ) = o(1)        (S.9)

for some small ξ > 0.

Using (S.9) and the fact that

  { Ĥ^{[K]} ≤ q(α) } ⊆ { max_{S∈C} max_{i∈S} ĥ_i^{[K]} ≤ q(α) },

we obtain that

  P( Ĥ^{[K]} ≤ q(α) )
    ≤ P( max_{S∈C} max_{i∈S} ĥ_i^{[K]} ≤ q(α) )
    ≤ P( max_{S∈C} max_{i∈S} { (1/(κσ²√p)) Σ_{j=1}^p d_{ij}² } ≤ q(α) + max_{S∈C} max_{i∈S} |r_i^{[K]}| )
    ≤ P( max_{S∈C} max_{i∈S} { (1/(κσ²√p)) Σ_{j=1}^p d_{ij}² } ≤ q(α) + p^{1/2−ξ} ) + o(1).        (S.10)

The arguments from the proof of Lemma S.3 imply that q(α) ≤ C log n for some fixed constant C > 0 and sufficiently large n. Moreover, we can prove the following result.

Lemma S.5. It holds that

  min_{S∈C} max_{i∈S} (1/p) Σ_{j=1}^p d_{ij}² ≥ c

for some fixed constant c > 0.

Since q(α) ≤ C log n and log n/√p = o(1) by (C3), Lemma S.5 allows us to infer that

  P( max_{S∈C} max_{i∈S} { (1/(κσ²√p)) Σ_{j=1}^p d_{ij}² } ≤ q(α) + p^{1/2−ξ} ) = o(1).

Together with (S.10), this yields that P( Ĥ^{[K]} ≤ q(α) ) = o(1).
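To see why the statistic explodes for K < K_0, as Lemma S.5 formalizes: a cluster S that mixes two classes has signal deviations d_{ij} whose mean square over j stays bounded away from zero, so (1/√p) Σ_j d_{ij}² grows like √p while the critical value grows only logarithmically in n. A minimal numerical sketch with hypothetical signal vectors (not taken from the paper):

```python
# Sketch of the separation argument behind (S.7): merge two classes
# with distinct signals into one cluster S and compute
# d_ij = mu_ij - (1/#S) * sum_{i' in S} mu_i'j.
import numpy as np

p = 400
m1 = np.zeros(p)                         # signal vector of class 1
m2 = np.ones(p)                          # class 2; squared distance = p
mu = np.vstack([np.tile(m1, (10, 1)),    # merged cluster S: 10 + 10 rows
                np.tile(m2, (10, 1))])
n_S = mu.shape[0]
d = mu - mu.mean(axis=0)                 # d_ij: deviation from cluster centroid
per_i = (d**2).sum(axis=1) / p           # (1/p) sum_j d_ij^2 for each i
scaled = (d**2).sum(axis=1).max() / np.sqrt(p)   # order sqrt(p), >> log n
print(per_i.min(), scaled)
```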

Proof of Lemma S.2. Let n_k = #G_k and write ε̄_i = (1/p) Σ_{j=1}^p ε_{ij} along with μ̄_i = (1/p) Σ_{j=1}^p μ_{ij}. Since P( {Ĝ_k^{[K_0]} : 1 ≤ k ≤ K_0} = {G_k : 1 ≤ k ≤ K_0} ) → 1 by (3.1), we can ignore the estimation error in the clusters Ĝ_k^{[K_0]} and replace them by the true classes G_k. For i ∈ G_k, we thus get

  ĥ_i^{[K_0]} = h_i^{[K_0]} + R_{i,A}^{[K_0]} + R_{i,B}^{[K_0]} − R_{i,C}^{[K_0]} + R_{i,D}^{[K_0]},

where

  R_{i,A}^{[K_0]} = ( κσ²/(κ̂σ̂²) − 1 ) (1/√p) Σ_{j=1}^p { ε_{ij}²/σ² − 1 }/κ
  R_{i,B}^{[K_0]} = ( √p/(κ̂σ̂²) ) ( σ² − σ̂² )
  R_{i,C}^{[K_0]} = ( 2/(κ̂σ̂²) ) (1/√p) Σ_{j=1}^p ε_{ij} { ε̄_i + (1/n_k) Σ_{i′∈G_k} ( ε_{i′j} − ε̄_{i′} ) }
  R_{i,D}^{[K_0]} = ( 1/(κ̂σ̂²) ) (1/√p) Σ_{j=1}^p { ε̄_i + (1/n_k) Σ_{i′∈G_k} ( ε_{i′j} − ε̄_{i′} ) }².

We now show that max_{i∈G_k} |R_{i,ℓ}^{[K_0]}| = o_p(p^{−ξ}) for any k and ℓ = A, ..., D. This implies that

  max_{1≤i≤n} |R_{i,ℓ}^{[K_0]}| = max_{1≤k≤K_0} max_{i∈G_k} |R_{i,ℓ}^{[K_0]}| = o_p(p^{−ξ})    for ℓ = A, ..., D,

which in turn yields the statement of Lemma S.2. Throughout the proof, we use the symbol η > 0 to denote a sufficiently small constant which results from applying Lemma S.1. By assumption, σ̂² = σ² + O_p(p^{−(1/2+δ)}) and κ̂ = κ + O_p(p^{−δ}) for some δ > 0. Applying Lemma S.1 and choosing ξ > 0 such that ξ < δ − η, we obtain that

  max_{i∈G_k} |R_{i,A}^{[K_0]}| ≤ | κσ²/(κ̂σ̂²) − 1 | max_{i∈G_k} | (1/√p) Σ_{j=1}^p { ε_{ij}²/σ² − 1 }/κ |
    = O_p(p^{−δ}) O_p(p^{η}) = O_p(p^{−(δ−η)}) = o_p(p^{−ξ})

and

  max_{i∈G_k} |R_{i,B}^{[K_0]}| ≤ ( √p/(κ̂σ̂²) ) | σ̂² − σ² | = O_p( √p · p^{−(1/2+δ)} ) = O_p(p^{−δ}) = o_p(p^{−ξ}).

We next show that

  max_{i∈G_k} |R_{i,C}^{[K_0]}| = o_p( p^{−1/4} ).        (S.11)

To do so, we work with the decomposition R_{i,C}^{[K_0]} = { 2/(κ̂σ̂²) } { R_{i,C,1}^{[K_0]} + R_{i,C,2}^{[K_0]} − R_{i,C,3}^{[K_0]} }, where

  R_{i,C,1}^{[K_0]} = (1/√p) Σ_{j=1}^p ε_{ij} ε̄_i
  R_{i,C,2}^{[K_0]} = (1/√p) Σ_{j=1}^p ε_{ij} (1/n_k) Σ_{i′∈G_k} ε_{i′j}
  R_{i,C,3}^{[K_0]} = (1/√p) Σ_{j=1}^p ε_{ij} (1/n_k) Σ_{i′∈G_k} ε̄_{i′}.

With the help of Lemma S.1, we obtain that

  max_{i∈G_k} |R_{i,C,1}^{[K_0]}| = √p max_{i∈G_k} ε̄_i² = O_p( p^{2η}/√p ).        (S.12)

Moreover, since

  R_{i,C,2}^{[K_0]} = (1/(n_k √p)) Σ_{j=1}^p { ε_{ij}² − σ² } + σ² √p / n_k + (1/n_k) Σ_{i′∈G_k, i′≠i} (1/√p) Σ_{j=1}^p ε_{ij} ε_{i′j}

and √p/n = O(n^{−1/4}) under our conditions, it holds that

  max_{i∈G_k} |R_{i,C,2}^{[K_0]}| = O_p( n^{−1/4} ),        (S.13)

since

  max_{i∈G_k} | (1/(n_k √p)) Σ_{j=1}^p { ε_{ij}² − σ² } | = O_p( p^{η}/n )        (S.14)

and

  max_{i∈G_k} | (1/n_k) Σ_{i′∈G_k, i′≠i} (1/√p) Σ_{j=1}^p ε_{ij} ε_{i′j} | = O_p( n^{−1/4} ).        (S.15)

(S.14) is an immediate consequence of Lemma S.1. (S.15) follows upon observing that for any constant C_0 > 0,

  P( max_{i∈G_k} | (1/n_k) Σ_{i′∈G_k, i′≠i} (1/√p) Σ_{j=1}^p ε_{ij} ε_{i′j} | > C_0 n^{−1/4} )
    ≤ Σ_{i∈G_k} P( | (1/n_k) Σ_{i′∈G_k, i′≠i} (1/√p) Σ_{j=1}^p ε_{ij} ε_{i′j} | > C_0 n^{−1/4} )

    ≤ Σ_{i∈G_k} E[ ( (1/n_k) Σ_{i′∈G_k, i′≠i} (1/√p) Σ_{j=1}^p ε_{ij} ε_{i′j} )^4 ] / ( C_0^4 n^{−1} )
    = Σ_{i∈G_k} { (1/(n_k^4 p²)) Σ_{i′_1,...,i′_4 ∈ G_k \ {i}} Σ_{j_1,...,j_4 = 1}^p E[ ε_{i j_1} ··· ε_{i j_4} ε_{i′_1 j_1} ··· ε_{i′_4 j_4} ] } / ( C_0^4 n^{−1} )
    ≤ C / C_0^4,

the last inequality resulting from the fact that the mean E[ ε_{i j_1} ··· ε_{i j_4} ε_{i′_1 j_1} ··· ε_{i′_4 j_4} ] can only be non-zero if some of the index pairs (i′_l, j_l) for l = 1, ..., 4 are identical. Since C_0 can be chosen as large as desired, this yields (S.15). Finally, with the help of Lemma S.1, we get that

  max_{i∈G_k} |R_{i,C,3}^{[K_0]}| ≤ √p ( max_{i∈G_k} |ε̄_i| ) (1/n_k) Σ_{i′∈G_k} |ε̄_{i′}| = O_p( p^{2η}/√p ).        (S.16)

Combining (S.12), (S.13) and (S.16), we arrive at the statement (S.11) on the remainder R_{i,C}^{[K_0]}.

We finally show that

  max_{i∈G_k} |R_{i,D}^{[K_0]}| = O_p( p^{2η}/√p + √p/n ).        (S.17)

For the proof, we write R_{i,D}^{[K_0]} = { 1/(κ̂σ̂²) } { R_{i,D,1}^{[K_0]} + R_{i,D,2}^{[K_0]} }, where

  R_{i,D,1}^{[K_0]} = √p ε̄_i²
  R_{i,D,2}^{[K_0]} = (1/√p) Σ_{j=1}^p { (1/n_k) Σ_{i′∈G_k} ( ε_{i′j} − ε̄_{i′} ) }²

(the cross term vanishes since Σ_{j=1}^p (1/n_k) Σ_{i′∈G_k} (ε_{i′j} − ε̄_{i′}) = 0). With the help of Lemma S.1, we obtain that

  max_{i∈G_k} |R_{i,D,1}^{[K_0]}| = O_p( p^{2η}/√p ).        (S.18)

Moreover, straightforward calculations yield that

  max_{i∈G_k} |R_{i,D,2}^{[K_0]}| = O_p( √p/n ).        (S.19)

(S.17) now follows upon combining (S.18) and (S.19).

Proof of Lemma S.3. We make use of the following three results:

(R1) Let {W_i : 1 ≤ i ≤ n} be independent random variables with a standard normal distribution and define a_n = 1/√(2 log n) together with

  b_n = √(2 log n) − [ log log n + log(4π) ] / ( 2 √(2 log n) ).

Then for any w ∈ R,

  lim_{n→∞} P( max_{1≤i≤n} W_i ≤ a_n w + b_n ) = exp( −exp(−w) ).

In particular, for w(α ± ε) = −log( −log(α ± ε) ), we get

  lim_{n→∞} P( max_{1≤i≤n} W_i ≤ a_n w(α ± ε) + b_n ) = α ± ε.

(R2) The next result is known as Khintchine's theorem. Let F_n be distribution functions and G a non-degenerate distribution function. Moreover, let α_n > 0 and β_n ∈ R be such that F_n(α_n x + β_n) → G(x) for any continuity point x of G. Then for constants α′_n > 0 and β′_n ∈ R and a non-degenerate distribution function G′, it holds that F_n(α′_n x + β′_n) → G′(x) at any continuity point x of G′ if and only if α′_n/α_n → a, (β′_n − β_n)/α_n → b and G′(x) = G(ax + b) for some constants a > 0 and b ∈ R.

(R3) The final result exploits strong approximation theory and is a direct consequence of the so-called KMT theorems; see Komlós et al. (1975, 1976): Write h_i^{[K_0]} = (1/√p) Σ_{j=1}^p X_{ij} with X_{ij} = { ε_{ij}²/σ² − 1 }/κ and let F denote the distribution function of X_{ij}. It is possible to construct i.i.d. random variables { X̃_{ij} : 1 ≤ i ≤ n, 1 ≤ j ≤ p } with the distribution function F and independent standard normal random variables { Z_{ij} : 1 ≤ i ≤

n, 1 ≤ j ≤ p } such that h̃_i^{[K_0]} = (1/√p) Σ_{j=1}^p X̃_{ij} and W̃_i = (1/√p) Σ_{j=1}^p Z_{ij} have the following property:

  P( | h̃_i^{[K_0]} − W̃_i | > C p^{−1/2+δ} ) ≤ p^{−θ/2+δ}

for some arbitrarily small but fixed δ > 0 and some constant C > 0 that does not depend on i, p and n.

We now proceed as follows: We show that for any w ∈ R,

  P( max_{1≤i≤n} h_i^{[K_0]} ≤ a_n w + b_n ) → exp( −exp(−w) ).        (S.20)

This in particular implies that

  P( max_{1≤i≤n} h_i^{[K_0]} ≤ w_n(α ± ε) ) → α ± ε,        (S.21)

where w_n(α ± ε) = a_n w(α ± ε) + b_n with a_n, b_n and w(α ± ε) as defined in (R1). The proof of (S.20) is postponed until the arguments for Lemma S.3 are complete.

The statement (S.21) in particular holds in the special case that ε_{ij} ~ N(0, σ²). In this case, q(α) is the α-quantile of max_{1≤i≤n} h_i^{[K_0]}. Hence, we have

  P( max_{1≤i≤n} h_i^{[K_0]} ≤ w_n(α − ε) ) → α − ε
  P( max_{1≤i≤n} h_i^{[K_0]} ≤ q(α) ) = α
  P( max_{1≤i≤n} h_i^{[K_0]} ≤ w_n(α + ε) ) → α + ε,

which implies that

  w_n(α − ε) ≤ q(α) ≤ w_n(α + ε)        (S.22)

for sufficiently large n.

Since p^{−ξ}/a_n = p^{−ξ} √(2 log n) = o(1) by (C3), we can use (S.20) together with (R2) to obtain that

  P( max_{1≤i≤n} h_i^{[K_0]} ≤ w_n(α ± ε) ± p^{−ξ} ) → α ± ε.        (S.23)

As w_n(α − ε) − p^{−ξ} ≤ q(α) − p^{−ξ} and q(α) + p^{−ξ} ≤ w_n(α + ε) + p^{−ξ} for sufficiently large n, it holds that

  P_{−,ε}(α) := P( max_{1≤i≤n} h_i^{[K_0]} ≤ w_n(α − ε) − p^{−ξ} ) ≤ P_−(α) = P( max_{1≤i≤n} h_i^{[K_0]} ≤ q(α) − p^{−ξ} )
  P_+(α) = P( max_{1≤i≤n} h_i^{[K_0]} ≤ q(α) + p^{−ξ} ) ≤ P( max_{1≤i≤n} h_i^{[K_0]} ≤ w_n(α + ε) + p^{−ξ} ) =: P_{+,ε}(α)

for large n. Moreover, since P_{−,ε}(α) → α − ε and P_{+,ε}(α) → α + ε for any fixed ε > 0 by (S.23), we can conclude that P_−(α) = α + o(1) and P_+(α) = α + o(1), which is the statement of Lemma S.3.

It remains to prove (S.20): Using the notation from (R3) and the shorthand w_n = a_n w + b_n, we can write

  P( max_{1≤i≤n} h_i^{[K_0]} ≤ w_n ) = P( max_{1≤i≤n} h̃_i^{[K_0]} ≤ w_n ) = Π_{i=1}^n π_i        (S.24)

with π_i = P( h̃_i^{[K_0]} ≤ w_n ). The probabilities π_i can be decomposed into two parts as follows:

  π_i = P( W̃_i + { h̃_i^{[K_0]} − W̃_i } ≤ w_n ) = π_i^≤ + π_i^>,

where

  π_i^≤ = P( W̃_i + { h̃_i^{[K_0]} − W̃_i } ≤ w_n , | h̃_i^{[K_0]} − W̃_i | ≤ C p^{−1/2+δ} )
  π_i^> = P( W̃_i + { h̃_i^{[K_0]} − W̃_i } ≤ w_n , | h̃_i^{[K_0]} − W̃_i | > C p^{−1/2+δ} ).

With the help of (R3) and the assumption that n = O(p^{θ/4}), we can show that

  Π_{i=1}^n π_i = Π_{i=1}^n π_i^≤ + R_n,        (S.25)

where R_n is a non-negative remainder term with R_n ≤ n p^{−θ/2+δ} = o(1).
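The Gaussian extreme-value limit (R1), which drives (S.20), is easy to check by simulation: with a_n = 1/√(2 log n) and b_n as above, the normalized maximum of n standard normals is approximately Gumbel. The sample sizes below are illustrative, and convergence in this limit is known to be slow.

```python
# Monte Carlo check of (R1): P(max_i W_i <= a_n*w + b_n) ~ exp(-exp(-w)).
import numpy as np

rng = np.random.default_rng(2)
n = 2000
a_n = 1 / np.sqrt(2 * np.log(n))
b_n = (np.sqrt(2 * np.log(n))
       - (np.log(np.log(n)) + np.log(4 * np.pi)) / (2 * np.sqrt(2 * np.log(n))))

w = 1.0
reps = 2000
maxima = rng.standard_normal((reps, n)).max(axis=1)
empirical = np.mean(maxima <= a_n * w + b_n)
gumbel = np.exp(-np.exp(-w))       # Gumbel limit at w = 1, roughly 0.69
print(empirical, gumbel)
```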

Moreover, the probabilities π_i^≤ can be bounded by

  P( W̃_i ≤ w_n − C p^{−1/2+δ} ) − 2 p^{−θ/2+δ} ≤ π_i^≤ ≤ P( W̃_i ≤ w_n + C p^{−1/2+δ} ),

the lower bound making use of (R3). From this, we obtain that

  Π_n^− − o(1) ≤ Π_{i=1}^n π_i^≤ ≤ Π_n^+,        (S.26)

where

  Π_n^+ = Π_{i=1}^n P( W̃_i ≤ w_n + C p^{−1/2+δ} )
  Π_n^− = Π_{i=1}^n P( W̃_i ≤ w_n − C p^{−1/2+δ} ).

By combining (S.24)–(S.26), we arrive at the intermediate result that

  Π_n^− + o(1) ≤ P( max_{1≤i≤n} h_i^{[K_0]} ≤ w_n ) ≤ Π_n^+ + o(1).        (S.27)

Since p^{−1/2+δ}/a_n = p^{−1/2+δ} √(2 log n) = o(1), we can use (R1) together with (R2) to show that

  Π_n^+ → exp( −exp(−w) )    and    Π_n^− → exp( −exp(−w) ).        (S.28)

Plugging (S.28) into (S.27) immediately yields that

  P( max_{1≤i≤n} h_i^{[K_0]} ≤ w_n ) → exp( −exp(−w) ),

which completes the proof.

Proof of Lemma S.4. We use the notation n_S = #S along with ε̄_i = (1/p) Σ_{j=1}^p ε_{ij}, μ̄_i = (1/p) Σ_{j=1}^p μ_{ij} and d̄_i = (1/p) Σ_{j=1}^p d_{ij}. For any i ∈ S and S ∈ C, we can write

  ĥ_i^{[K]} = (1/(κσ²√p)) Σ_{j=1}^p d_{ij}² + R_{i,A}^{[K]} + R_{i,B}^{[K]} + R_{i,C}^{[K]} + R_{i,D}^{[K]} − R_{i,E}^{[K]} + R_{i,F}^{[K]} + R_{i,G}^{[K]},

where

  R_{i,A}^{[K]} = ( κσ²/(κ̂σ̂²) − 1 ) (1/(κσ²√p)) Σ_{j=1}^p d_{ij}²

  R_{i,B}^{[K]} = ( κσ²/(κ̂σ̂²) − 1 ) (1/√p) Σ_{j=1}^p { ε_{ij}²/σ² − 1 }/κ
  R_{i,C}^{[K]} = (1/√p) Σ_{j=1}^p { ε_{ij}²/σ² − 1 }/κ
  R_{i,D}^{[K]} = ( √p/(κ̂σ̂²) ) ( σ² − σ̂² )
  R_{i,E}^{[K]} = ( 2/(κ̂σ̂²) ) (1/√p) Σ_{j=1}^p ε_{ij} { ε̄_i + (1/n_S) Σ_{i′∈S} ( ε_{i′j} − ε̄_{i′} ) }
  R_{i,F}^{[K]} = ( 1/(κ̂σ̂²) ) (1/√p) Σ_{j=1}^p { ε̄_i + (1/n_S) Σ_{i′∈S} ( ε_{i′j} − ε̄_{i′} ) }²
  R_{i,G}^{[K]} = ( 2/(κ̂σ̂²) ) (1/√p) Σ_{j=1}^p { ε_{ij} − ε̄_i − (1/n_S) Σ_{i′∈S} ( ε_{i′j} − ε̄_{i′} ) } d_{ij}.

We now show that max_{S∈C} max_{i∈S} |R_{i,ℓ}^{[K]}| = o_p(p^{1/2−ξ}) for ℓ = A, ..., G. This immediately yields the statement of Lemma S.4. Throughout the proof, η > 0 denotes a sufficiently small constant that results from applying Lemma S.1. With the help of Lemma S.1 and our assumptions on σ̂² and κ̂, it is straightforward to see that

  max_{S∈C} max_{i∈S} |R_{i,ℓ}^{[K]}| ≤ max_{1≤i≤n} |R_{i,ℓ}^{[K]}| = o_p(p^{1/2−ξ})    for ℓ = A, B, C, D

with some sufficiently small ξ > 0. We next show that

  max_{S∈C} max_{i∈S} |R_{i,E}^{[K]}| = O_p( p^{η} + √p/n ).        (S.29)

To do so, we write R_{i,E}^{[K]} = { 2/(κ̂σ̂²) } { R_{i,E,1} + R_{i,E,2} − R_{i,E,3} }, where

  R_{i,E,1} = (1/√p) Σ_{j=1}^p ε_{ij} ε̄_i
  R_{i,E,2} = (1/√p) Σ_{j=1}^p ε_{ij} (1/n_S) Σ_{i′∈S} ε_{i′j}
  R_{i,E,3} = (1/√p) Σ_{j=1}^p ε_{ij} (1/n_S) Σ_{i′∈S} ε̄_{i′}.

Lemma S.1 yields that max_{S∈C} max_{i∈S} |R_{i,E,1}| ≤ max_{1≤i≤n} √p ε̄_i² = O_p( p^{2η}/√p ). Moreover, it holds that

  max_{S∈C} max_{i∈S} |R_{i,E,2}| = O_p( p^{η} + √p/n ),

since

  R_{i,E,2} = (1/(n_S √p)) Σ_{j=1}^p { ε_{ij}² − σ² } + σ² √p / n_S + (1/n_S) Σ_{i′∈S, i′≠i} (1/√p) Σ_{j=1}^p ε_{ij} ε_{i′j},

  max_{1≤i≤n} | (1/√p) Σ_{j=1}^p { ε_{ij}² − σ² } | = O_p( p^{η} )

and

  max_{1≤i<i′≤n} | (1/p) Σ_{j=1}^p ε_{ij} ε_{i′j} | = O_p( p^{−1/2+η} ),

which follows upon applying Lemma S.1 (to the n(n−1)/2 sequences {ε_{ij} ε_{i′j} : 1 ≤ j ≤ p}). Finally,

  max_{S∈C} max_{i∈S} |R_{i,E,3}| ≤ √p ( max_{1≤i≤n} |ε̄_i| ) ( max_{1≤i′≤n} |ε̄_{i′}| ) = O_p( p^{2η}/√p ),

which can again be seen by applying Lemma S.1. Putting everything together, we arrive at (S.29). Similar arguments show that

  max_{S∈C} max_{i∈S} |R_{i,F}^{[K]}| = O_p( p^{η} + √p/n )        (S.30)

as well.

To analyze the term R_{i,G}^{[K]}, we denote the signal vector of the group G_k by m_k = (m_{k,1}, ..., m_{k,p})′ and write

  (1/n_S) Σ_{i′∈S} μ_{i′j} = Σ_{k=1}^{K_0} λ_{S,k} m_{k,j}    with λ_{S,k} = #(S ∩ G_k)/n_S.

With this notation, we get R_{i,G}^{[K]} = { 2/(κ̂σ̂²) } { R_{i,G,1} − R_{i,G,2} − R_{i,G,3} − R_{i,G,4} + R_{i,G,5} }, where

  R_{i,G,1} = (1/√p) Σ_{j=1}^p μ_{ij} ε_{ij}
  R_{i,G,2} = Σ_{k=1}^{K_0} λ_{S,k} (1/√p) Σ_{j=1}^p m_{k,j} ε_{ij}
  R_{i,G,3} = (1/√p) ( ε̄_i − (1/n_S) Σ_{i′∈S} ε̄_{i′} ) Σ_{j=1}^p d_{ij}

  R_{i,G,4} = (1/n_S) Σ_{i′∈S} (1/√p) Σ_{j=1}^p μ_{ij} ε_{i′j}
  R_{i,G,5} = (1/n_S) Σ_{i′∈S} Σ_{k=1}^{K_0} λ_{S,k} (1/√p) Σ_{j=1}^p m_{k,j} ε_{i′j}.

With the help of Lemma S.1, it can be shown that

  max_{S∈C} max_{i∈S} |R_{i,G,ℓ}| = O_p( p^{η} )    for ℓ = 1, ..., 5.

For example, it holds that

  max_{S∈C} max_{i∈S} |R_{i,G,4}| ≤ max_{1≤i,i′≤n} √p | (1/p) Σ_{j=1}^p μ_{ij} ε_{i′j} | = O_p( p^{η} ).

As a result, we obtain that

  max_{S∈C} max_{i∈S} |R_{i,G}^{[K]}| = O_p( p^{η} ).        (S.31)

This completes the proof.

Proof of Lemma S.5. Let S ∈ C. In particular, suppose that S ∩ G_k ≠ ∅ and S ∩ G_{k′} ≠ ∅ for some k ≠ k′. We show the following claim: there exists some i ∈ S such that

  (1/p) Σ_{j=1}^p d_{ij}² ≥ c,        (S.32)

where c = δ_0²/4 with δ_0 the separation constant defined in the assumptions of the paper. From this, the statement of Lemma S.5 immediately follows.

For the proof of (S.32), we denote the Euclidean distance between vectors v = (v_1, ..., v_p)′ and w = (w_1, ..., w_p)′ by d(v, w) = { Σ_{j=1}^p (v_j − w_j)² }^{1/2}. Moreover, as in Lemma S.4, we use the notation

  (1/n_S) Σ_{i′∈S} μ_{i′j} = Σ_{k′′=1}^{K_0} λ_{S,k′′} m_{k′′,j},

where n_S = #S, λ_{S,k′′} = #(S ∩ G_{k′′})/n_S and m_{k′′} = (m_{k′′,1}, ..., m_{k′′,p})′ is the signal vector of the class G_{k′′}. Take any i ∈ S ∩ G_k. If

  { Σ_{j=1}^p d_{ij}² }^{1/2} = d( μ_i, Σ_{k′′=1}^{K_0} λ_{S,k′′} m_{k′′} ) = d( m_k, Σ_{k′′=1}^{K_0} λ_{S,k′′} m_{k′′} ) ≥ (δ_0/2) √p,

the proof is finished, as (S.32) is satisfied for this i. Next consider the case that d( m_k, Σ_{k′′=1}^{K_0} λ_{S,k′′} m_{k′′} ) < (δ_0/2) √p. By the separation assumption, it holds that d(m_k, m_{k′}) ≥ δ_0 √p for k ≠ k′. Hence, by the triangle inequality,

  δ_0 √p ≤ d( m_k, m_{k′} ) ≤ d( m_k, Σ_{k′′=1}^{K_0} λ_{S,k′′} m_{k′′} ) + d( Σ_{k′′=1}^{K_0} λ_{S,k′′} m_{k′′}, m_{k′} )
         < (δ_0/2) √p + d( Σ_{k′′=1}^{K_0} λ_{S,k′′} m_{k′′}, m_{k′} ),

implying that

  d( Σ_{k′′=1}^{K_0} λ_{S,k′′} m_{k′′}, m_{k′} ) > (δ_0/2) √p.

This shows that the claim (S.32) is fulfilled for any i ∈ S ∩ G_{k′}.

Proof of Theorem 4.2. By Theorem 4.1,

  P( K̂_0 > K_0 ) = P( Ĥ^{[K]} > q(α) for all K ≤ K_0 )
    = P( Ĥ^{[K_0]} > q(α) ) − P( Ĥ^{[K_0]} > q(α), Ĥ^{[K]} ≤ q(α) for some K < K_0 )
    = P( Ĥ^{[K_0]} > q(α) ) + o(1) = 1 − α + o(1)

and

  P( K̂_0 < K_0 ) = P( Ĥ^{[K]} ≤ q(α) for some K < K_0 ) ≤ Σ_{K=1}^{K_0−1} P( Ĥ^{[K]} ≤ q(α) ) = o(1).

Moreover,

  P( {Ĝ_k : 1 ≤ k ≤ K̂_0} = {G_k : 1 ≤ k ≤ K_0} )
    = P( {Ĝ_k : 1 ≤ k ≤ K̂_0} = {G_k : 1 ≤ k ≤ K_0}, K̂_0 = K_0 )
      + P( {Ĝ_k : 1 ≤ k ≤ K̂_0} = {G_k : 1 ≤ k ≤ K_0}, K̂_0 ≠ K_0 )
    = α + o(1),

since

  P( {Ĝ_k : 1 ≤ k ≤ K̂_0} ≠ {G_k : 1 ≤ k ≤ K_0}, K̂_0 = K_0 )
    = P( {Ĝ_k^{[K_0]} : 1 ≤ k ≤ K_0} ≠ {G_k : 1 ≤ k ≤ K_0}, K̂_0 = K_0 )
    ≤ P( {Ĝ_k^{[K_0]} : 1 ≤ k ≤ K_0} ≠ {G_k : 1 ≤ k ≤ K_0} ) = o(1)

by the consistency property (3.1), the second summand vanishes when K̂_0 ≠ K_0, and P( K̂_0 = K_0 ) = α + o(1).

Proof of Theorem 4.3. With the help of Lemma S.1, we can show that

  ρ̂(i, i′) = 2σ² + (1/p) Σ_{j=1}^p ( μ_{ij} − μ_{i′j} )² + o_p(1)        (S.33)

uniformly over i and i′. This together with our conditions allows us to prove the following claim:

  With probability tending to one, the starting indices i_1, ..., i_K belong to K different classes in the case that K ≤ K_0 and to K_0 different classes in the case that K > K_0.        (S.34)

Now let K = K_0. With the help of (S.33) and (S.34), the starting values C_1^{[K_0]}, ..., C_{K_0}^{[K_0]} can be shown to have the property that

  P( { C_k^{[K_0]} : 1 ≤ k ≤ K_0 } = { G_k : 1 ≤ k ≤ K_0 } ) → 1.        (S.35)

Together with Lemma S.1, (S.35) yields that

  ρ̂(i, C_k^{[K_0]}) = σ² + (1/p) Σ_{j=1}^p ( μ_{ij} − m_{k,j} )² + o_p(1)

uniformly over i and k. Combined with our conditions, this in turn implies that the k-means algorithm converges already after the first iteration step with probability tending to one and that the Ĝ_k^{[K_0]} are consistent estimators of the classes G_k in the sense of (3.1).

Proof of (3.6). Suppose that (C1)–(C3) along with (3.5) are satisfied. As already noted in Section 3.4, the k-means estimators {Ĝ_k^A : 1 ≤ k ≤ K} can be shown to satisfy (3.4), that is,

  Ĝ_k^A ⊆ G_{k′} for some 1 ≤ k′ ≤ K_0        (S.36)

for any k = 1, ..., K. This can be proven by very similar arguments as the consistency property (3.1). We thus omit the details.

Let E_A be the event that "Ĝ_k^A ⊆ G_{k′} for some 1 ≤ k′ ≤ K_0" holds for all clusters Ĝ_k^A with k = 1, ..., K. E_A can be regarded as the event that the partition {Ĝ_k^A : 1 ≤ k ≤ K} is a refinement of the class structure {G_k : 1 ≤ k ≤ K_0}. By (S.36), the event E_A occurs with probability tending to one. Now consider the estimator

  σ̂²_RSS = (np)^{−1} Σ_{k=1}^K Σ_{i∈Ĝ_k^A} Σ_{j=1}^p ( Ŷ_{ij}^B − (1/#Ĝ_k^A) Σ_{i′∈Ĝ_k^A} Ŷ_{i′j}^B )².

Since the random variables Ŷ_{ij}^B are independent of the estimators Ĝ_k^A, it is not difficult to verify the following: for any δ > 0, there exists a constant C_δ > 0 that does not depend on {Ĝ_k^A : 1 ≤ k ≤ K} such that on the event E_A,

  P( | σ̂²_RSS − σ² | > C_δ / √p  |  {Ĝ_k^A : 1 ≤ k ≤ K} ) ≤ δ.

From this, the first statement of (3.6) easily follows. The second statement can be obtained by similar arguments.
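The one-step convergence argument in the proof of Theorem 4.3, namely that consistent starting values plus well-separated signals make the first k-means assignment step recover the classes exactly, can be illustrated with a minimal simulation. The dimensions, separation and Gaussian noise are illustrative assumptions, not the paper's setting.

```python
# Sketch of the first k-means iteration in the proof of Theorem 4.3:
# K0 = 2 well-separated classes in dimension p; starting centres are one
# observation from each class; a single assignment step recovers G_1, G_2.
import numpy as np

rng = np.random.default_rng(3)
p, n_per = 500, 25
m1 = np.zeros(p)                          # signal vector m_1
m2 = np.full(p, 1.0)                      # signal vector m_2; ||m1-m2||^2 = p
X = np.vstack([m1 + rng.standard_normal((n_per, p)),
               m2 + rng.standard_normal((n_per, p))])
labels_true = np.repeat([0, 1], n_per)

centres = X[[0, n_per]]                   # one starting value per class
dists = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
labels_hat = dists.argmin(axis=1)         # first k-means assignment step
print(np.array_equal(labels_hat, labels_true))
```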

References

Komlós, J., Major, P. and Tusnády, G. (1975). An approximation of partial sums of independent RV's, and the sample DF. I. Zeitschrift für Wahrscheinlichkeitstheorie und Verwandte Gebiete, 32, 111-131.

Komlós, J., Major, P. and Tusnády, G. (1976). An approximation of partial sums of independent RV's, and the sample DF. II. Zeitschrift für Wahrscheinlichkeitstheorie und Verwandte Gebiete, 34, 33-58.


More information

1. Inference on Regression Parameters a. Finding Mean, s.d and covariance amongst estimates. 2. Confidence Intervals and Working Hotelling Bands

1. Inference on Regression Parameters a. Finding Mean, s.d and covariance amongst estimates. 2. Confidence Intervals and Working Hotelling Bands Content. Inference on Regresson Parameters a. Fndng Mean, s.d and covarance amongst estmates.. Confdence Intervals and Workng Hotellng Bands 3. Cochran s Theorem 4. General Lnear Testng 5. Measures of

More information

CSci 6974 and ECSE 6966 Math. Tech. for Vision, Graphics and Robotics Lecture 21, April 17, 2006 Estimating A Plane Homography

CSci 6974 and ECSE 6966 Math. Tech. for Vision, Graphics and Robotics Lecture 21, April 17, 2006 Estimating A Plane Homography CSc 6974 and ECSE 6966 Math. Tech. for Vson, Graphcs and Robotcs Lecture 21, Aprl 17, 2006 Estmatng A Plane Homography Overvew We contnue wth a dscusson of the major ssues, usng estmaton of plane projectve

More information

EEE 241: Linear Systems

EEE 241: Linear Systems EEE : Lnear Systems Summary #: Backpropagaton BACKPROPAGATION The perceptron rule as well as the Wdrow Hoff learnng were desgned to tran sngle layer networks. They suffer from the same dsadvantage: they

More information

Solutions to exam in SF1811 Optimization, Jan 14, 2015

Solutions to exam in SF1811 Optimization, Jan 14, 2015 Solutons to exam n SF8 Optmzaton, Jan 4, 25 3 3 O------O -4 \ / \ / The network: \/ where all lnks go from left to rght. /\ / \ / \ 6 O------O -5 2 4.(a) Let x = ( x 3, x 4, x 23, x 24 ) T, where the varable

More information

Exercise Solutions to Real Analysis

Exercise Solutions to Real Analysis xercse Solutons to Real Analyss Note: References refer to H. L. Royden, Real Analyss xersze 1. Gven any set A any ɛ > 0, there s an open set O such that A O m O m A + ɛ. Soluton 1. If m A =, then there

More information

1 GSW Iterative Techniques for y = Ax

1 GSW Iterative Techniques for y = Ax 1 for y = A I m gong to cheat here. here are a lot of teratve technques that can be used to solve the general case of a set of smultaneous equatons (wrtten n the matr form as y = A), but ths chapter sn

More information

Salmon: Lectures on partial differential equations. Consider the general linear, second-order PDE in the form. ,x 2

Salmon: Lectures on partial differential equations. Consider the general linear, second-order PDE in the form. ,x 2 Salmon: Lectures on partal dfferental equatons 5. Classfcaton of second-order equatons There are general methods for classfyng hgher-order partal dfferental equatons. One s very general (applyng even to

More information

U.C. Berkeley CS294: Spectral Methods and Expanders Handout 8 Luca Trevisan February 17, 2016

U.C. Berkeley CS294: Spectral Methods and Expanders Handout 8 Luca Trevisan February 17, 2016 U.C. Berkeley CS94: Spectral Methods and Expanders Handout 8 Luca Trevsan February 7, 06 Lecture 8: Spectral Algorthms Wrap-up In whch we talk about even more generalzatons of Cheeger s nequaltes, and

More information

Estimation: Part 2. Chapter GREG estimation

Estimation: Part 2. Chapter GREG estimation Chapter 9 Estmaton: Part 2 9. GREG estmaton In Chapter 8, we have seen that the regresson estmator s an effcent estmator when there s a lnear relatonshp between y and x. In ths chapter, we generalzed the

More information

ANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U)

ANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U) Econ 413 Exam 13 H ANSWERS Settet er nndelt 9 deloppgaver, A,B,C, som alle anbefales å telle lkt for å gøre det ltt lettere å stå. Svar er gtt . Unfortunately, there s a prntng error n the hnt of

More information

arxiv: v1 [math.ca] 31 Jul 2018

arxiv: v1 [math.ca] 31 Jul 2018 LOWE ASSOUAD TYPE DIMENSIONS OF UNIFOMLY PEFECT SETS IN DOUBLING METIC SPACE HAIPENG CHEN, MIN WU, AND YUANYANG CHANG arxv:80769v [mathca] 3 Jul 08 Abstract In ths paper, we are concerned wth the relatonshps

More information

3.1 ML and Empirical Distribution

3.1 ML and Empirical Distribution 67577 Intro. to Machne Learnng Fall semester, 2008/9 Lecture 3: Maxmum Lkelhood/ Maxmum Entropy Dualty Lecturer: Amnon Shashua Scrbe: Amnon Shashua 1 In the prevous lecture we defned the prncple of Maxmum

More information

Dimensionality Reduction Notes 1

Dimensionality Reduction Notes 1 Dmensonalty Reducton Notes 1 Jelan Nelson mnlek@seas.harvard.edu August 10, 2015 1 Prelmnares Here we collect some notaton and basc lemmas used throughout ths note. Throughout, for a random varable X,

More information

Exam. Econometrics - Exam 1

Exam. Econometrics - Exam 1 Econometrcs - Exam 1 Exam Problem 1: (15 ponts) Suppose that the classcal regresson model apples but that the true value of the constant s zero. In order to answer the followng questons assume just one

More information

Economics 101. Lecture 4 - Equilibrium and Efficiency

Economics 101. Lecture 4 - Equilibrium and Efficiency Economcs 0 Lecture 4 - Equlbrum and Effcency Intro As dscussed n the prevous lecture, we wll now move from an envronment where we looed at consumers mang decsons n solaton to analyzng economes full of

More information

Math 702 Midterm Exam Solutions

Math 702 Midterm Exam Solutions Math 702 Mdterm xam Solutons The terms measurable, measure, ntegrable, and almost everywhere (a.e.) n a ucldean space always refer to Lebesgue measure m. Problem. [6 pts] In each case, prove the statement

More information

18.1 Introduction and Recap

18.1 Introduction and Recap CS787: Advanced Algorthms Scrbe: Pryananda Shenoy and Shjn Kong Lecturer: Shuch Chawla Topc: Streamng Algorthmscontnued) Date: 0/26/2007 We contnue talng about streamng algorthms n ths lecture, ncludng

More information

SOME NEW CHARACTERIZATION RESULTS ON EXPONENTIAL AND RELATED DISTRIBUTIONS. Communicated by Ahmad Reza Soltani. 1. Introduction

SOME NEW CHARACTERIZATION RESULTS ON EXPONENTIAL AND RELATED DISTRIBUTIONS. Communicated by Ahmad Reza Soltani. 1. Introduction Bulletn of the Iranan Mathematcal Socety Vol. 36 No. 1 (21), pp 257-272. SOME NEW CHARACTERIZATION RESULTS ON EXPONENTIAL AND RELATED DISTRIBUTIONS M. TAVANGAR AND M. ASADI Communcated by Ahmad Reza Soltan

More information

Stat260: Bayesian Modeling and Inference Lecture Date: February 22, Reference Priors

Stat260: Bayesian Modeling and Inference Lecture Date: February 22, Reference Priors Stat60: Bayesan Modelng and Inference Lecture Date: February, 00 Reference Prors Lecturer: Mchael I. Jordan Scrbe: Steven Troxler and Wayne Lee In ths lecture, we assume that θ R; n hgher-dmensons, reference

More information

MMA and GCMMA two methods for nonlinear optimization

MMA and GCMMA two methods for nonlinear optimization MMA and GCMMA two methods for nonlnear optmzaton Krster Svanberg Optmzaton and Systems Theory, KTH, Stockholm, Sweden. krlle@math.kth.se Ths note descrbes the algorthms used n the author s 2007 mplementatons

More information

DIFFERENTIAL FORMS BRIAN OSSERMAN

DIFFERENTIAL FORMS BRIAN OSSERMAN DIFFERENTIAL FORMS BRIAN OSSERMAN Dfferentals are an mportant topc n algebrac geometry, allowng the use of some classcal geometrc arguments n the context of varetes over any feld. We wll use them to defne

More information

Lecture 3: Shannon s Theorem

Lecture 3: Shannon s Theorem CSE 533: Error-Correctng Codes (Autumn 006 Lecture 3: Shannon s Theorem October 9, 006 Lecturer: Venkatesan Guruswam Scrbe: Wdad Machmouch 1 Communcaton Model The communcaton model we are usng conssts

More information

Dirichlet s Theorem In Arithmetic Progressions

Dirichlet s Theorem In Arithmetic Progressions Drchlet s Theorem In Arthmetc Progressons Parsa Kavkan Hang Wang The Unversty of Adelade February 26, 205 Abstract The am of ths paper s to ntroduce and prove Drchlet s theorem n arthmetc progressons,

More information

Graph Reconstruction by Permutations

Graph Reconstruction by Permutations Graph Reconstructon by Permutatons Perre Ille and Wllam Kocay* Insttut de Mathémathques de Lumny CNRS UMR 6206 163 avenue de Lumny, Case 907 13288 Marselle Cedex 9, France e-mal: lle@ml.unv-mrs.fr Computer

More information

Math 217 Fall 2013 Homework 2 Solutions

Math 217 Fall 2013 Homework 2 Solutions Math 17 Fall 013 Homework Solutons Due Thursday Sept. 6, 013 5pm Ths homework conssts of 6 problems of 5 ponts each. The total s 30. You need to fully justfy your answer prove that your functon ndeed has

More information

Appendix B. The Finite Difference Scheme

Appendix B. The Finite Difference Scheme 140 APPENDIXES Appendx B. The Fnte Dfference Scheme In ths appendx we present numercal technques whch are used to approxmate solutons of system 3.1 3.3. A comprehensve treatment of theoretcal and mplementaton

More information

The optimal delay of the second test is therefore approximately 210 hours earlier than =2.

The optimal delay of the second test is therefore approximately 210 hours earlier than =2. THE IEC 61508 FORMULAS 223 The optmal delay of the second test s therefore approxmately 210 hours earler than =2. 8.4 The IEC 61508 Formulas IEC 61508-6 provdes approxmaton formulas for the PF for smple

More information

6) Derivatives, gradients and Hessian matrices

6) Derivatives, gradients and Hessian matrices 30C00300 Mathematcal Methods for Economsts (6 cr) 6) Dervatves, gradents and Hessan matrces Smon & Blume chapters: 14, 15 Sldes by: Tmo Kuosmanen 1 Outlne Defnton of dervatve functon Dervatve notatons

More information

Supplement: Proofs and Technical Details for The Solution Path of the Generalized Lasso

Supplement: Proofs and Technical Details for The Solution Path of the Generalized Lasso Supplement: Proofs and Techncal Detals for The Soluton Path of the Generalzed Lasso Ryan J. Tbshran Jonathan Taylor In ths document we gve supplementary detals to the paper The Soluton Path of the Generalzed

More information

APPENDIX A Some Linear Algebra

APPENDIX A Some Linear Algebra APPENDIX A Some Lnear Algebra The collecton of m, n matrces A.1 Matrces a 1,1,..., a 1,n A = a m,1,..., a m,n wth real elements a,j s denoted by R m,n. If n = 1 then A s called a column vector. Smlarly,

More information

4 Analysis of Variance (ANOVA) 5 ANOVA. 5.1 Introduction. 5.2 Fixed Effects ANOVA

4 Analysis of Variance (ANOVA) 5 ANOVA. 5.1 Introduction. 5.2 Fixed Effects ANOVA 4 Analyss of Varance (ANOVA) 5 ANOVA 51 Introducton ANOVA ANOVA s a way to estmate and test the means of multple populatons We wll start wth one-way ANOVA If the populatons ncluded n the study are selected

More information

Numerical Heat and Mass Transfer

Numerical Heat and Mass Transfer Master degree n Mechancal Engneerng Numercal Heat and Mass Transfer 06-Fnte-Dfference Method (One-dmensonal, steady state heat conducton) Fausto Arpno f.arpno@uncas.t Introducton Why we use models and

More information

STAT 3008 Applied Regression Analysis

STAT 3008 Applied Regression Analysis STAT 3008 Appled Regresson Analyss Tutoral : Smple Lnear Regresson LAI Chun He Department of Statstcs, The Chnese Unversty of Hong Kong 1 Model Assumpton To quantfy the relatonshp between two factors,

More information

Sample Average Approximation with Adaptive Importance Sampling

Sample Average Approximation with Adaptive Importance Sampling oname manuscrpt o. wll be nserted by the edtor) Sample Average Approxmaton wth Adaptve Importance Samplng Andreas Wächter Jeremy Staum Alvaro Maggar Mngbn Feng October 9, 07 Abstract We study sample average

More information

P exp(tx) = 1 + t 2k M 2k. k N

P exp(tx) = 1 + t 2k M 2k. k N 1. Subgaussan tals Defnton. Say that a random varable X has a subgaussan dstrbuton wth scale factor σ< f P exp(tx) exp(σ 2 t 2 /2) for all real t. For example, f X s dstrbuted N(,σ 2 ) then t s subgaussan.

More information

Axiomatizations of Pareto Equilibria in Multicriteria Games

Axiomatizations of Pareto Equilibria in Multicriteria Games ames and Economc Behavor 28, 146154 1999. Artcle ID game.1998.0680, avalable onlne at http:www.dealbrary.com on Axomatzatons of Pareto Equlbra n Multcrtera ames Mark Voorneveld,* Dres Vermeulen, and Peter

More information

SUCCESSIVE MINIMA AND LATTICE POINTS (AFTER HENK, GILLET AND SOULÉ) M(B) := # ( B Z N)

SUCCESSIVE MINIMA AND LATTICE POINTS (AFTER HENK, GILLET AND SOULÉ) M(B) := # ( B Z N) SUCCESSIVE MINIMA AND LATTICE POINTS (AFTER HENK, GILLET AND SOULÉ) S.BOUCKSOM Abstract. The goal of ths note s to present a remarably smple proof, due to Hen, of a result prevously obtaned by Gllet-Soulé,

More information

Perfect Competition and the Nash Bargaining Solution

Perfect Competition and the Nash Bargaining Solution Perfect Competton and the Nash Barganng Soluton Renhard John Department of Economcs Unversty of Bonn Adenauerallee 24-42 53113 Bonn, Germany emal: rohn@un-bonn.de May 2005 Abstract For a lnear exchange

More information

2.3 Nilpotent endomorphisms

2.3 Nilpotent endomorphisms s a block dagonal matrx, wth A Mat dm U (C) In fact, we can assume that B = B 1 B k, wth B an ordered bass of U, and that A = [f U ] B, where f U : U U s the restrcton of f to U 40 23 Nlpotent endomorphsms

More information

Lena Boneva and Oliver Linton. January 2017

Lena Boneva and Oliver Linton. January 2017 Appendx to Staff Workng Paper No. 640 A dscrete choce model for large heterogeneous panels wth nteractve fxed effects wth an applcaton to the determnants of corporate bond ssuance Lena Boneva and Olver

More information

Linear, affine, and convex sets and hulls In the sequel, unless otherwise specified, X will denote a real vector space.

Linear, affine, and convex sets and hulls In the sequel, unless otherwise specified, X will denote a real vector space. Lnear, affne, and convex sets and hulls In the sequel, unless otherwse specfed, X wll denote a real vector space. Lnes and segments. Gven two ponts x, y X, we defne xy = {x + t(y x) : t R} = {(1 t)x +

More information

Random Partitions of Samples

Random Partitions of Samples Random Parttons of Samples Klaus Th. Hess Insttut für Mathematsche Stochastk Technsche Unverstät Dresden Abstract In the present paper we construct a decomposton of a sample nto a fnte number of subsamples

More information

A new construction of 3-separable matrices via an improved decoding of Macula s construction

A new construction of 3-separable matrices via an improved decoding of Macula s construction Dscrete Optmzaton 5 008 700 704 Contents lsts avalable at ScenceDrect Dscrete Optmzaton journal homepage: wwwelsevercom/locate/dsopt A new constructon of 3-separable matrces va an mproved decodng of Macula

More information

n α j x j = 0 j=1 has a nontrivial solution. Here A is the n k matrix whose jth column is the vector for all t j=0

n α j x j = 0 j=1 has a nontrivial solution. Here A is the n k matrix whose jth column is the vector for all t j=0 MODULE 2 Topcs: Lnear ndependence, bass and dmenson We have seen that f n a set of vectors one vector s a lnear combnaton of the remanng vectors n the set then the span of the set s unchanged f that vector

More information

Chapter 2 - The Simple Linear Regression Model S =0. e i is a random error. S β2 β. This is a minimization problem. Solution is a calculus exercise.

Chapter 2 - The Simple Linear Regression Model S =0. e i is a random error. S β2 β. This is a minimization problem. Solution is a calculus exercise. Chapter - The Smple Lnear Regresson Model The lnear regresson equaton s: where y + = β + β e for =,..., y and are observable varables e s a random error How can an estmaton rule be constructed for the

More information

The Second Eigenvalue of Planar Graphs

The Second Eigenvalue of Planar Graphs Spectral Graph Theory Lecture 20 The Second Egenvalue of Planar Graphs Danel A. Spelman November 11, 2015 Dsclamer These notes are not necessarly an accurate representaton of what happened n class. The

More information

Matrix Approximation via Sampling, Subspace Embedding. 1 Solving Linear Systems Using SVD

Matrix Approximation via Sampling, Subspace Embedding. 1 Solving Linear Systems Using SVD Matrx Approxmaton va Samplng, Subspace Embeddng Lecturer: Anup Rao Scrbe: Rashth Sharma, Peng Zhang 0/01/016 1 Solvng Lnear Systems Usng SVD Two applcatons of SVD have been covered so far. Today we loo

More information

HMMT February 2016 February 20, 2016

HMMT February 2016 February 20, 2016 HMMT February 016 February 0, 016 Combnatorcs 1. For postve ntegers n, let S n be the set of ntegers x such that n dstnct lnes, no three concurrent, can dvde a plane nto x regons (for example, S = {3,

More information