Boning Yang. March 8, 2018
1 Introduction to Basic Concentration Inequalities. Boning Yang, Florida State University, March 8, 2018.
2 Framework. 1. Concentration inequality. 2. Introduction to entropy. 3. Basic ... 4. ...
3 Concentration inequality. The value of a sum of r.v.s is concentrated near its expectation. How about the non-asymptotic and general-function case? The tail probability of a function of r.v.s is bounded; this includes the Markov case, where no center exists in the event.
4 Mathematical view of entropy. Entropy measures the unpredictability of a r.v. in terms of its state; variance measures it in terms of numerical value. It makes no sense to define the expectation of a state! Unpredictability is related to tail probability to some extent.
5 Shannon's definition from information theory. Let p_j and I(p_j) be the probability and information of event E_j. (1) I(p_j) ≥ 0 and I(1) = 0: the information of any event is non-negative! (2) I(p_1 p_2) = I(p_1) + I(p_2): information of independent events is additive. The target function is I(p) = C log p with C < 0.
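The two axioms can be checked directly for I(p) = −log p; a minimal sketch (the function name `info` is ours, not from the slides):

```python
import math

def info(p):
    """Information of an event with probability p: I(p) = -log p (i.e., C = -1)."""
    return -math.log(p)

p1, p2 = 0.5, 0.25
# Axiom (2): additivity for independent events, I(p1 * p2) = I(p1) + I(p2).
assert abs(info(p1 * p2) - (info(p1) + info(p2))) < 1e-12
# Axiom (1): non-negativity, and a sure event carries no information.
assert info(1.0) == 0.0
assert info(0.1) > info(0.9)   # rarer events carry more information
print("Shannon's axioms hold for I(p) = -log p")
```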
6 Example for I(p_1 p_2) = I(p_1) + I(p_2). Toss a uniform coin twice; E_1: head in the 1st toss, E_2: tail in the 2nd. I(P(E_1 ∩ E_2)) = I(P(E_1)) + I(P(E_2)) when E_1 ⟂ E_2. Toss one coin once, with E_1 = E_2: head in the 1st toss. Not independent! I(P(E_1 ∩ E_2)) = I(P(E_1)) ≠ I(P(E_1)) + I(P(E_2)).
7 What do we learn from Shannon? It may be hard to know the target function in advance, but we can list the properties the target function must satisfy. Is the target function unique? Add properties one by one, according to their relative importance, if needed.
8 Additional comments. We can build confidence by checking induced properties: I(p) = C log p with C < 0 gives I(p) non-increasing w.r.t. p, and I(0) = +∞.
9 Entropy: expected information gain. I(p) = −log p (C = −1 is used by convention). Do N experiments and calculate the average information; the expected average information is H(X) = −Σ_{x∈X} N p(x) log p(x) / N. Entropy: H(X) = −Σ_{x∈X} P(x) log P(x) = E[−log P(X)].
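The definition can be verified numerically; a small sketch in Python (the helper `entropy` is our name, entropy in nats):

```python
import math

def entropy(pmf):
    """Shannon entropy H(X) = -sum_x P(x) log P(x), in nats.
    Terms with P(x) = 0 contribute 0 (continuity convention)."""
    return sum(-p * math.log(p) for p in pmf if p > 0)

# A fair coin carries log 2 nats of expected information per toss.
print(entropy([0.5, 0.5]))   # ≈ 0.6931 (= log 2)
# A deterministic outcome carries none.
print(entropy([1.0, 0.0]))
```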
10 Uniform-coin example: difference between variance and entropy. Y ∈ {head, tail}: the state resulting from the coin toss; X ∈ {1 (when head), 1.01 (when tail)}: the numerical result. Var(X) is small while H(X) = H(Y) is large: a significant difference! Entropy/variance tells the spread of a categorical/numerical r.v.
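The slide's numbers make the contrast concrete; a sketch using the same values 1 and 1.01, each with probability 1/2:

```python
import math

p = [0.5, 0.5]        # fair coin
vals = [1.0, 1.01]    # numerical encoding of head / tail

mean = sum(pi * v for pi, v in zip(p, vals))
var = sum(pi * (v - mean) ** 2 for pi, v in zip(p, vals))
H = sum(-pi * math.log(pi) for pi in p)

print(var)   # ≈ 2.5e-05: the numerical spread is tiny...
print(H)     # ≈ 0.693 nats: ...but the outcome is maximally unpredictable
```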
11 Conditional entropy. {X = x} ∩ {Y = y | X = x} = {X = x, Y = y}. Recall I(P(E_1 ∩ E_2)) = I(P(E_1)) + I(P(E_2)) when E_1 ⟂ E_2; here, I(P(X = x, Y = y)) = I(P(X = x)) + I(P(Y = y | X = x)). Intuitively, info. of the two events = info. of A + info. of B given A.
12 Conditional entropy (continued 1). I(P(X = x, Y = y)) = I(P(X = x)) + I(P(Y = y | X = x)). Taking E_{Y|X=x}: E_{Y|X=x}[−log P(X = x, Y)] = −log P(X = x) + E_{Y|X=x}[−log P(Y | X = x)]. Taking expectation over X: E[−log P(X, Y)] = E[−log P(X)] + E[−log P(Y | X)], i.e., H(X, Y) = H(X) + H(Y | X).
13 Conditional entropy (continued 2). H(Y, X) = H(X, Y) = H(X) + H(Y | X): permutation invariance; the expected information of the pair = that of X + the extra of Y given X. Also H(X) ≥ H(X | Y) ≥ H(X | Y, Z), proof omitted: given Y, X carries less average information, and the same for the second inequality.
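Both the chain rule and "conditioning reduces entropy" can be verified on a small joint distribution; a sketch (the joint pmf below is an arbitrary example of ours):

```python
import math

def H(probs):
    """Entropy of a pmf given as an iterable of probabilities (nats)."""
    return sum(-p * math.log(p) for p in probs if p > 0)

# Joint pmf P(X=x, Y=y) on {0,1} x {0,1} (arbitrary example).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

px = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
py = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}

H_xy = H(joint.values())
H_x = H(px.values())
# H(Y|X) = sum_x P(x) * H(Y | X = x)
H_y_given_x = sum(
    px[x] * H([joint[(x, y)] / px[x] for y in (0, 1)]) for x in (0, 1)
)

assert abs(H_xy - (H_x + H_y_given_x)) < 1e-9   # chain rule H(X,Y)=H(X)+H(Y|X)
assert H_y_given_x <= H(py.values()) + 1e-9     # conditioning reduces entropy
print("chain rule and H(Y|X) <= H(Y) verified")
```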
14 Proof of Han's inequality (induction also works). H(X_1, ..., X_n) ≤ (1/(n−1)) Σ_{i=1}^n H(X^{(i)}), where X^{(i)} = (X_1, ..., X_{i−1}, X_{i+1}, ..., X_n). Hinted by the r.h.s., H(X_1, ..., X_n) = H(X^{(i)}) + H(X_i | X^{(i)}). It suffices to prove Σ_{i=1}^n H(X_i | X^{(i)}) ≤ H(X_1, ..., X_n); assume n = 3. The left side ≤ H(X_1) + H(X_2 | X_1) + H(X_3 | X_2, X_1) = H(X_1, X_2, X_3).
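Han's inequality can be spot-checked numerically for n = 3; a sketch with a randomly drawn joint pmf (the test harness is ours, not from the slides):

```python
import itertools
import math
import random

def H(probs):
    """Entropy of a pmf given as an iterable of probabilities (nats)."""
    return sum(-p * math.log(p) for p in probs if p > 0)

random.seed(0)
states = list(itertools.product([0, 1], repeat=3))   # support of (X1, X2, X3)
w = [random.random() for _ in states]
total = sum(w)
joint = dict(zip(states, [wi / total for wi in w]))  # random joint pmf

def marginal_without(i):
    """pmf of X^(i): drop coordinate i and sum it out."""
    out = {}
    for xs, p in joint.items():
        key = xs[:i] + xs[i + 1:]
        out[key] = out.get(key, 0.0) + p
    return out

lhs = H(joint.values())
rhs = sum(H(marginal_without(i).values()) for i in range(3)) / (3 - 1)
assert lhs <= rhs + 1e-12   # Han's inequality
print(f"H(X1,X2,X3) = {lhs:.4f} <= {rhs:.4f}")
```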
15 Relative entropy (KL-divergence). Let P, Q be p.m.f.s for a r.v. X. D(Q‖P) = Σ_x Q(x) log (Q(x)/P(x)), with the convention P(x) = 0 ⟹ Q(x) = 0. An argument for the definition from coding theory can be found (dubious). It is the amount of information lost when P is used to approximate Q.
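A minimal sketch of the definition (the function name `kl` and the example distributions are ours):

```python
import math

def kl(q, p):
    """D(Q||P) = sum_x Q(x) log(Q(x)/P(x)); requires P(x)=0 => Q(x)=0."""
    assert all(qi == 0 or pi > 0 for qi, pi in zip(q, p)), "Q not dominated by P"
    return sum(qi * math.log(qi / pi) for qi, pi in zip(q, p) if qi > 0)

q = [0.7, 0.2, 0.1]
p = [1 / 3, 1 / 3, 1 / 3]
print(kl(q, p))   # > 0: info lost when the uniform P approximates Q
print(kl(q, q))   # 0: no divergence of a distribution from itself
```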
16 Han's inequality for relative entropy. Let P = P_1 ⊗ ··· ⊗ P_n be the p.m.f. of X^n = (X_1, ..., X_n), so that P^{(i)}(x^{(i)}) = Σ_{x_i} P(x_i, x^{(i)}) = Π_{j≠i} P_j(x_j); Q^{(i)} is defined similarly. Then D(Q‖P) ≤ Σ_{i=1}^n [D(Q‖P) − D(Q^{(i)}‖P^{(i)})], or equivalently, D(Q‖P) ≥ (1/(n−1)) Σ_{i=1}^n D(Q^{(i)}‖P^{(i)}).
17 D(Q‖P) ≥ (1/(n−1)) Σ_{i=1}^n D(Q^{(i)}‖P^{(i)}) (continued 1). Hinted by Han's H(X^n) ≤ (1/(n−1)) Σ_{i=1}^n H(X^{(i)}), choose the 2nd form. D(Q‖P) = Σ_{x^n} Q(x^n) log Q(x^n) − Σ_{x^n} Q(x^n) log P(x^n). Han's inequality for entropy implies Σ_{x^n} Q(x^n) log Q(x^n) ≥ (1/(n−1)) Σ_{i=1}^n [Σ_{x^{(i)}} Q^{(i)}(x^{(i)}) log Q^{(i)}(x^{(i)})] (the inequality flips because of the − sign).
18 Still true when Q isn't a probability measure (continued 2). Goal: Σ_{x^n} Q(x^n) log P(x^n) = (1/(n−1)) Σ_{i=1}^n [Σ_{x^{(i)}} Q^{(i)}(x^{(i)}) log P^{(i)}(x^{(i)})]. Since P is a product, log P(x^n) = log P^{(i)}(x^{(i)}) + log P_i(x_i), and summing over i gives Σ_{i=1}^n log P^{(i)}(x^{(i)}) = (n−1) log P(x^n). Hence Σ_{x^n} Q(x^n) log P(x^n) = (1/(n−1)) Σ_{i=1}^n [Σ_{x^n} Q(x^n) log P^{(i)}(x^{(i)})] = (1/(n−1)) Σ_{i=1}^n [Σ_{x^{(i)}} Q^{(i)}(x^{(i)}) log P^{(i)}(x^{(i)})].
19 D(Q‖P) ≤ Σ_{i=1}^n [D(Q‖P) − D(Q^{(i)}‖P^{(i)})]. Let φ(x) = x log x, Y = f(X^n) ≥ 0, and E^{(i)}Y = E[Y | X^{(i)}]. Lemma 1: E φ(Y) − φ(EY) ≤ Σ_{i=1}^n E[E^{(i)} φ(Y) − φ(E^{(i)} Y)]. The statement for EY = 1 gives the statement for cY with any c > 0, so assume EY = 1. Then E φ(Y) − φ(EY) = E φ(Y) = E[Y log Y] = D(Q‖P) (Han's KL)?
20 Q(X^n) = f(X^n) P(X^n) (continued 1). Σ_{x^n} Q log(Q/P) = D(Q‖P) = E φ(Y) = Σ_{x^n} P f log f. Han's for KL: D(Q‖P) ≤ Σ_{i=1}^n [D(Q‖P) − D(Q^{(i)}‖P^{(i)})]. Goal: D(Q^{(i)}‖P^{(i)}) = E[φ(E^{(i)} Y)] (take expectation over X^{(i)}). D(Q^{(i)}‖P^{(i)}) = Σ_{x^{(i)}} Q^{(i)}(x^{(i)}) log (Q^{(i)}(x^{(i)}) / P^{(i)}(x^{(i)})); P^{(i)}(x^{(i)}) is needed!
21 Goal: D(Q^{(i)}‖P^{(i)}) = E[φ(E^{(i)} Y)] (continued 2). Σ_{x^{(i)}} Q^{(i)}(x^{(i)}) log (Q^{(i)}(x^{(i)})/P^{(i)}(x^{(i)})) = Σ_{x^{(i)}} P^{(i)}(x^{(i)}) · (Q^{(i)}(x^{(i)})/P^{(i)}(x^{(i)})) log (Q^{(i)}(x^{(i)})/P^{(i)}(x^{(i)})). So we need E^{(i)}Y(x^{(i)}) = E[Y | x^{(i)}] = Q^{(i)}(x^{(i)})/P^{(i)}(x^{(i)}). Indeed, E[Y | x^{(i)}] = Σ_{x_i} f(x^n) P(x^n | x^{(i)}) = Σ_{x_i} f(x^n) P(x^n) / P^{(i)}(x^{(i)}) = Σ_{x_i} Q(x^n) / P^{(i)}(x^{(i)}) = Q^{(i)}(x^{(i)}) / P^{(i)}(x^{(i)}).
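Han's inequality for relative entropy can be spot-checked for n = 3 with a product P and an arbitrary Q; a sketch (the harness is ours; note P must be a product measure, while Q need not be):

```python
import itertools
import math
import random

def kl(q, p):
    """D(Q||P) for dicts keyed by outcome, Q dominated by P."""
    return sum(q[k] * math.log(q[k] / p[k]) for k in q if q[k] > 0)

random.seed(1)
n = 3
states = list(itertools.product([0, 1], repeat=n))
# P = P1 x P2 x P3: a product measure on {0,1}^3.
marg = [[0.3, 0.7], [0.5, 0.5], [0.8, 0.2]]
P = {xs: math.prod(m[x] for m, x in zip(marg, xs)) for xs in states}
# Arbitrary (non-product) Q.
w = {xs: random.random() for xs in states}
total = sum(w.values())
Q = {xs: w[xs] / total for xs in states}

def drop(dist, i):
    """Marginal on x^(i): sum out coordinate i."""
    out = {}
    for xs, p in dist.items():
        key = xs[:i] + xs[i + 1:]
        out[key] = out.get(key, 0.0) + p
    return out

lhs = kl(Q, P)
rhs = sum(kl(drop(Q, i), drop(P, i)) for i in range(n)) / (n - 1)
assert lhs >= rhs - 1e-12   # D(Q||P) >= (1/(n-1)) sum_i D(Q^(i)||P^(i))
print(f"D(Q||P) = {lhs:.4f} >= {rhs:.4f}")
```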
22 Some notations and explanations. Let Z = g(X^n), Z_i' = g(X_1, ..., X_i', ..., X_n), and ψ(x) = e^x − x − 1. Z and Z_i' differ only in the i-th component of X^n, where X_i and X_i' are i.i.d. If n = 4, i = 2, and Z = g(3, 5, 7, 8), then Z_2' = g(3, ?, 7, 8). Given x^{(i)} = (x_1, ..., x_{i−1}, x_{i+1}, ..., x_n), X_i ⟂ X_i' implies Z ⟂ Z_i'.
23 L.S.I. and Lemma 2. The log-Sobolev inequality: s E[Z e^{sZ}] − E[e^{sZ}] log E[e^{sZ}] ≤ Σ_{i=1}^n E[e^{sZ} ψ(−s(Z − Z_i'))]. Lemma 2 (proof omitted): for any positive r.v.s Y, Y' > 0, E[Y log Y] − (EY) log(EY) ≤ E[Y log Y − Y log Y' − (Y − Y')]. When proving the L.S.I., Y = e^{sZ} (with Y' = e^{sZ_i'}) is suggested by Lemma 2!
24 Idea of the proof. Target: s E[Z e^{sZ}] − E[e^{sZ}] log E[e^{sZ}] ≤ Σ_{i=1}^n E[e^{sZ} ψ(−s(Z − Z_i'))]. The left side = E φ(Y) − φ(EY) with Y = e^{sZ} and φ(y) = y log y. Lemma 1: E φ(Y) − φ(EY) ≤ Σ_{i=1}^n E[E^{(i)} φ(Y) − φ(E^{(i)} Y)]. Lemma 2 (applied conditionally): E^{(i)} φ(Y) − φ(E^{(i)} Y) ≤ E^{(i)}[Y log Y − Y log Y_i' − (Y − Y_i')].
25 Idea of proof (continued). The right side of Lemma 2 equals E^{(i)}[e^{sZ} ψ(−s(Z − Z_i'))] with Y = e^{sZ}, Y_i' = e^{sZ_i'}. Chaining: E φ(Y) − φ(EY) ≤ Σ_{i=1}^n E[E^{(i)} φ(Y) − φ(E^{(i)} Y)] ≤ Σ_{i=1}^n E[E^{(i)}[e^{sZ} ψ(−s(Z − Z_i'))]], which is exactly s E[Z e^{sZ}] − E[e^{sZ}] log E[e^{sZ}] ≤ Σ_{i=1}^n E[e^{sZ} ψ(−s(Z − Z_i'))].
26 Symmetrized version of the L.S.I. For all s > 0, s E[Z e^{sZ}] − E[e^{sZ}] log E[e^{sZ}] ≤ Σ_{i=1}^n E[e^{sZ} τ(−s(Z − Z_i')) 1_{Z>Z_i'}], where τ(x) = x(e^x − 1). Comparing with E φ(Y) − φ(EY) ≤ Σ_{i=1}^n E[e^{sZ} ψ(−s(Z − Z_i'))], it remains to show E[e^{sZ} ψ(−s(Z − Z_i'))] = E[e^{sZ} τ(−s(Z − Z_i')) 1_{Z>Z_i'}]?
27 E[e^{sZ} ψ(−s(Z − Z_i'))] (symmetrized version!). Split: = E[e^{sZ} ψ(−s(Z − Z_i')) 1_{Z>Z_i'}] + E[e^{sZ} ψ(−s(Z − Z_i')) 1_{Z<Z_i'}]. By exchangeability of (X_i, X_i'), E[e^{sZ} ψ(−s(Z − Z_i')) 1_{Z<Z_i'}] = E[e^{sZ_i'} ψ(−s(Z_i' − Z)) 1_{Z>Z_i'}] = E[e^{sZ} e^{−s(Z−Z_i')} ψ(s(Z − Z_i')) 1_{Z>Z_i'}]. Since ψ(−v) + e^{−v} ψ(v) = τ(−v), E[e^{sZ} ψ(−s(Z − Z_i'))] = E[e^{sZ} τ(−s(Z − Z_i')) 1_{Z>Z_i'}]!
28 An example of application. If Σ_{i=1}^n (Z − Z_i')² ≤ C, then for all t > 0, P{Z > EZ + t} ≤ e^{−t²/4C}. For x > 0, τ(−x) ≤ x², so for all s > 0: s E[Z e^{sZ}] − E[e^{sZ}] log E[e^{sZ}] ≤ E[e^{sZ} Σ_{i=1}^n s² (Z − Z_i')² 1_{Z>Z_i'}] ≤ s² C E[e^{sZ}]. Briefly, s M'(s) − M(s) log M(s) ≤ C s² M(s), where M(s) = E[e^{sZ}].
29 An example of application (continued). After a calculus trick, M(s) ≤ e^{sEZ + s²C}, and Chernoff suggests: P(Z > EZ + t) = P(e^{sZ} > e^{s(EZ+t)}) ≤ E[e^{sZ}] / e^{s(EZ+t)} ≤ e^{s²C − st}. Choosing s = t/(2C) gives the tightest bound, e^{−t²/4C}. Last word...
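The last optimization step can be checked directly by minimizing e^{Cs² − st} over s > 0; a sketch (the values of C and t are arbitrary examples of ours):

```python
import math

C, t = 2.0, 3.0

def chernoff_bound(s):
    """Upper bound e^{C s^2 - s t} on P(Z > EZ + t), valid for every s > 0."""
    return math.exp(C * s * s - s * t)

s_star = t / (2 * C)   # minimizer of the quadratic exponent C s^2 - s t
best = chernoff_bound(s_star)
# The minimized bound equals e^{-t^2 / 4C}.
assert abs(best - math.exp(-t * t / (4 * C))) < 1e-15
# Any other choice of s is at least as loose.
for s in (0.25, 0.5, 1.0, 1.5):
    assert chernoff_bound(s) >= best
print(best)
```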
30 Take home message. Mathematical proof often comes after the correct intuition. Rule 1: start with the simple case. Rule 2: replace × by + (as the log does), rather than the other way around. Try to give definitions in an axiomatic way to convince others.
More informationA how to guide to second quantization method.
Phys. 67 (Graduate Quantum Mechancs Sprng 2009 Prof. Pu K. Lam. Verson 3 (4/3/2009 A how to gude to second quantzaton method. -> Second quantzaton s a mathematcal notaton desgned to handle dentcal partcle
More informationIntroduction to Algorithms
Introducton to Algorthms 6.046J/8.40J Lecture 7 Prof. Potr Indyk Data Structures Role of data structures: Encapsulate data Support certan operatons (e.g., INSERT, DELETE, SEARCH) Our focus: effcency of
More informationModule 2. Random Processes. Version 2 ECE IIT, Kharagpur
Module Random Processes Lesson 6 Functons of Random Varables After readng ths lesson, ou wll learn about cdf of functon of a random varable. Formula for determnng the pdf of a random varable. Let, X be
More informationE Tail Inequalities. E.1 Markov s Inequality. Non-Lecture E: Tail Inequalities
Algorthms Non-Lecture E: Tal Inequaltes If you hold a cat by the tal you learn thngs you cannot learn any other way. Mar Twan E Tal Inequaltes The smple recursve structure of sp lsts made t relatvely easy
More informationEngineering Risk Benefit Analysis
Engneerng Rsk Beneft Analyss.55, 2.943, 3.577, 6.938, 0.86, 3.62, 6.862, 22.82, ESD.72, ESD.72 RPRA 2. Elements of Probablty Theory George E. Apostolaks Massachusetts Insttute of Technology Sprng 2007
More information1 Binary Response Models
Bnary and Ordered Multnomal Response Models Dscrete qualtatve response models deal wth dscrete dependent varables. bnary: yes/no, partcpaton/non-partcpaton lnear probablty model LPM, probt or logt models
More informationSolutions to exam in SF1811 Optimization, Jan 14, 2015
Solutons to exam n SF8 Optmzaton, Jan 4, 25 3 3 O------O -4 \ / \ / The network: \/ where all lnks go from left to rght. /\ / \ / \ 6 O------O -5 2 4.(a) Let x = ( x 3, x 4, x 23, x 24 ) T, where the varable
More informationON A DETERMINATION OF THE INITIAL FUNCTIONS FROM THE OBSERVED VALUES OF THE BOUNDARY FUNCTIONS FOR THE SECOND-ORDER HYPERBOLIC EQUATION
Advanced Mathematcal Models & Applcatons Vol.3, No.3, 2018, pp.215-222 ON A DETERMINATION OF THE INITIAL FUNCTIONS FROM THE OBSERVED VALUES OF THE BOUNDARY FUNCTIONS FOR THE SECOND-ORDER HYPERBOLIC EUATION
More informationCourse 395: Machine Learning - Lectures
Course 395: Machne Learnng - Lectures Lecture 1-2: Concept Learnng (M. Pantc Lecture 3-4: Decson Trees & CC Intro (M. Pantc Lecture 5-6: Artfcal Neural Networks (S.Zaferou Lecture 7-8: Instance ased Learnng
More informationOnline Appendix to Asset Pricing in Large Information Networks
Onlne Appendx to Asset Prcng n Large Informaton Networks Han N. Ozsoylev and Johan Walden The proofs of Propostons 1-12 have been omtted from the man text n the nterest of brevty. Ths Onlne Appendx contans
More informationRules of Probability
( ) ( ) = for all Corollary: Rules of robablty The probablty of the unon of any two events and B s roof: ( Φ) = 0. F. ( B) = ( ) + ( B) ( B) If B then, ( ) ( B). roof: week 2 week 2 2 Incluson / Excluson
More informationLinear Approximation with Regularization and Moving Least Squares
Lnear Approxmaton wth Regularzaton and Movng Least Squares Igor Grešovn May 007 Revson 4.6 (Revson : March 004). 5 4 3 0.5 3 3.5 4 Contents: Lnear Fttng...4. Weghted Least Squares n Functon Approxmaton...
More information2 S. S. DRAGOMIR, N. S. BARNETT, AND I. S. GOMM Theorem. Let V :(d d)! R be a twce derentable varogram havng the second dervatve V :(d d)! R whch s bo
J. KSIAM Vol.4, No., -7, 2 FURTHER BOUNDS FOR THE ESTIMATION ERROR VARIANCE OF A CONTINUOUS STREAM WITH STATIONARY VARIOGRAM S. S. DRAGOMIR, N. S. BARNETT, AND I. S. GOMM Abstract. In ths paper we establsh
More informationβ0 + β1xi and want to estimate the unknown
SLR Models Estmaton Those OLS Estmates Estmators (e ante) v. estmates (e post) The Smple Lnear Regresson (SLR) Condtons -4 An Asde: The Populaton Regresson Functon B and B are Lnear Estmators (condtonal
More information