3 EXPECTATION OF SEVERAL RANDOM VARIABLES

As in Probability Theory I, the interest in most situations lies not in the actual distribution of a random vector, but rather in a number of summary measures which convey certain distributional characteristics such as central location, spread, etc. In this chapter, we generalize the notion of expectation to the multivariate case. What follows are simple extensions of the definitions in your Statistics 11 course.

3.1 Expectation of Functions of Several Random Variables

3.1.1 Definition of Expectation

Def'n: Let $X = (X_1, X_2, \ldots, X_k)'$ be a $k$-dimensional discrete or continuous random vector, with joint PMF $p_{X_1,\ldots,X_k}(x_1,\ldots,x_k)$ or joint PDF $f_{X_1,\ldots,X_k}(x_1,\ldots,x_k)$, respectively.

a. The expectation of the random vector $X$, denoted by $E(X)$, or the joint expectation of the random variables $X_1, \ldots, X_k$, is defined as
$E(X) = (E[X_1], E[X_2], \ldots, E[X_k])'$,
where $E[X_i]$ is the expectation of $X_i$, $i = 1, 2, \ldots, k$.

b. The expectation of a function of the random vector, say $g(X)$, is defined as
$E[g(X)] = \sum_{x \in A} g(x)\, p_X(x)$, for the discrete case, provided this sum is finite, where $A$ is the set of mass points of $X$;
$E[g(X)] = \int_{R^k} g(x)\, f_X(x)\, dx$, for the continuous case, provided $g(X)$ is itself a random variable (or vector) and the integral exists.

Remarks:
1. In other words, the expectation of the random vector $X = (X_1, \ldots, X_k)'$ is simply the vector consisting of the marginal expectations of each random variable in the vector.
2. The integral in definition (b) can be expressed in expanded form as
$E[g(X)] = \int_R \cdots \int_R g(x_1, \ldots, x_k)\, f_{X_1,\ldots,X_k}(x_1, \ldots, x_k)\, dx_1 \cdots dx_k$.
3. For the bivariate case, if $(X, Y)'$ is a discrete or continuous random vector, with joint PMF $p_{X,Y}(x, y)$ or joint PDF $f_{X,Y}(x, y)$, respectively, the expectation of the random vector $(X, Y)'$ is $E[(X, Y)'] = (E[X], E[Y])'$.

If $h(X, Y)$ is a real-valued function $h: R^2 \to R$, and if $Z = h(X, Y)$, then
$E(Z) = E[h(X, Y)] = \sum_x \sum_y h(x, y)\, p_{X,Y}(x, y)$, for the discrete case;
$E(Z) = E[h(X, Y)] = \int \int h(x, y)\, f_{X,Y}(x, y)\, dx\, dy$, for the continuous case.

Special Cases: $Z = h(X, Y) = X + Y$ (or $X - Y$); $Z = h(X, Y) = XY$; $Z = h(X, Y) = \{X - E(X)\}\{Y - E(Y)\}$.

Examples:
1. A fair coin is tossed three times. Define $X$ as the number of heads on the first two tosses and $Y$ as the number of heads on the last two tosses. Obtain the joint expectation of $X$ and $Y$, $E(XY)$ and $E(X + Y)$.
2. Find $E(X)$, $E(Y)$ and $E(XY)$, if $X$ and $Y$ are jointly continuous random variables, with joint PDF given by $f_{X,Y}(x, y) = 2x\, I_{[0,1]}(x)\, I_{[0,1]}(y)$.
3. Find $E(3X + Y + 6Z)$, $E(XYZ)$, and $E(XY)$, if $X$, $Y$ and $Z$ are jointly continuous random variables, with joint PDF given by $f_{X,Y,Z}(x, y, z) = 8xyz\, I_{(0,1)}(x)\, I_{(0,1)}(y)\, I_{(0,1)}(z)$.
4. Suppose $X$ and $Y$ are independent random variables, each having a Uniform distribution over the interval $[0,1]$. If $Z = \max(X, Y)$, find $E(Z)$.
5. Consider a bivariate random vector $(X, Y)'$ having a Trinomial distribution consisting of $n$ trials, with parameters $p_1$ and $p_2$, both between 0 and 1, and $0 < p_1 + p_2 < 1$. Find the joint expectation of $X$ and $Y$ and $E(XY)$.

3.1.2 Basic Properties of Expectation

Result #1 (Expectation of Linear Combinations). Let $X = (X_1, \ldots, X_k)'$ be a $k$-dimensional random vector, and let $g_1(X), g_2(X), \ldots, g_m(X)$ be $m$ real-valued functions of $X$. Then
$E\left[\sum_{i=1}^m a_i g_i(X)\right] = \sum_{i=1}^m a_i E[g_i(X)]$, for any constants $a_1, a_2, \ldots, a_m$.

Some particular cases:
1. $E(X + Y) = E(X) + E(Y)$; $E(X - Y) = E(X) - E(Y)$.
2. $E(aX + b) = a E(X) + b$.
3. Bivariate Case: Let $(X, Y)'$ be a bivariate random vector. Then $E(aX + bY) = a E(X) + b E(Y)$.
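As a check of both the definition and Result #1, Example 1 above (the three coin tosses) is small enough to enumerate directly. The following is a minimal Python sketch (an illustration, not part of the original notes) that computes the expectations from the definition $E[h(X, Y)] = \sum_x \sum_y h(x, y)\, p_{X,Y}(x, y)$.

```python
from itertools import product
from fractions import Fraction

# Example 1 sketch: a fair coin tossed three times; X = heads on the first two
# tosses, Y = heads on the last two tosses.  Each of the 8 outcomes has
# probability 1/8, so expectations follow by direct enumeration.
p = Fraction(1, 8)
EX = EY = EXY = EXplusY = Fraction(0)
for t1, t2, t3 in product((0, 1), repeat=3):
    x, y = t1 + t2, t2 + t3
    EX += x * p
    EY += y * p
    EXY += x * y * p
    EXplusY += (x + y) * p

print(EX, EY, EXY, EXplusY)   # 1, 1, 5/4, 2
```

The output $E(X) = E(Y) = 1$, $E(XY) = 5/4$ and $E(X + Y) = 2$ already hints that $E(XY) \neq E(X)E(Y)$ here, a point taken up in the remarks on independence that follow.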

Result #2 (Expectation of Products). Let $X = (X_1, \ldots, X_k)'$ be a $k$-dimensional random vector, with $X_1, \ldots, X_k$ independent, and let $g_i(X_i)$ be a function of $X_i$ alone, $i = 1, \ldots, k$. Then
$E\left[\prod_{i=1}^k a_i g_i(X_i)\right] = \prod_{i=1}^k a_i E[g_i(X_i)]$, for any constants $a_1, a_2, \ldots, a_k$.

Some particular cases:
1. $E(X_1 X_2 \cdots X_k) = E(X_1) E(X_2) \cdots E(X_k)$, where $X_1, \ldots, X_k$ are independent.
2. Bivariate Case: Let $(X, Y)'$ be a bivariate random vector, with $X$ and $Y$ independent. Then
a. $E(XY) = E(X) E(Y)$
b. $E[g(X) h(Y)] = E[g(X)]\, E[h(Y)]$

Remarks:
1. Result #1 states that the expectation of a linear combination (of rv's) is the linear combination of the expectations (of the rv's). That is, the expectation of a sum and/or difference (of rv's) is the sum and/or difference of the expectations (of the rv's). No assumption regarding the independence of $X_1, \ldots, X_k$ is needed.
2. Result #2 states that the expectation of a product (of rv's) is the product of the expectations (of the rv's), provided the random variables involved are independent.
3. Further, Result #2 provides a necessary (but not sufficient) condition for the independence of $X_1, \ldots, X_k$. That is, if $X_1, \ldots, X_k$ are independent, then the condition below necessarily follows. But the condition below is not sufficient to conclude that $X_1, \ldots, X_k$ are independent. That is, the condition may be satisfied even if the random variables are not independent:
$E\left(\prod_{i=1}^k X_i\right) = \prod_{i=1}^k E(X_i)$.

Examples:
1. A fair coin is tossed three times. Define $X$ as the number of heads on the first two tosses and $Y$ as the number of heads on the last two tosses. From the previous section, $E(X)$, $E(Y)$, and $E(X + Y)$ were derived. Show that $E(X + Y) = E(X) + E(Y)$.
2. Determine if $X$ and $Y$ are independent and show that $E(XY) = E(X) E(Y)$, if $X$ and $Y$ are jointly continuous random variables, with joint PDF given by $f_{X,Y}(x, y) = 2x\, I_{[0,1]}(x)\, I_{[0,1]}(y)$.
3. Using the properties, evaluate $E(3X + Y + 6Z)$, $E(XYZ)$, and $E(XY)$, if $X$, $Y$ and $Z$ are jointly continuous random variables, with joint PDF given by $f_{X,Y,Z}(x, y, z) = 8xyz\, I_{(0,1)}(x)\, I_{(0,1)}(y)\, I_{(0,1)}(z)$.
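For Example 2, the double-integral definition can be evaluated numerically. Below is a small sketch using SciPy's `dblquad` (illustrative only, not part of the original notes); since the density $f(x, y) = 2x$ on the unit square factors into a function of $x$ times a function of $y$, $X$ and $Y$ are independent and the product rule of Result #2 should hold.

```python
from scipy.integrate import dblquad

# Numerical check of Example 2: f(x, y) = 2x on the unit square.
# dblquad integrates func(y, x) with x over (0, 1) and y over (0, 1).
f = lambda y, x: 2 * x
EX,  _ = dblquad(lambda y, x: x * f(y, x), 0, 1, lambda x: 0, lambda x: 1)
EY,  _ = dblquad(lambda y, x: y * f(y, x), 0, 1, lambda x: 0, lambda x: 1)
EXY, _ = dblquad(lambda y, x: x * y * f(y, x), 0, 1, lambda x: 0, lambda x: 1)
print(EX, EY, EXY, EX * EY)   # approximately 2/3, 1/2, 1/3, 1/3
```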

4. Let $X$ be a discrete random variable with PMF given below. Define $Y = X^2$. Verify that $E(XY) = E(X) E(Y)$, even though $X$ and $Y$ are not independent:
$p_X(x) = 0.5\, I_{\{0\}}(x) + 0.25\, I_{\{-1, 1\}}(x)$.

3.1.3 Some Special Expectations

Def'n: Let $X = (X_1, \ldots, X_k)'$ be a $k$-dimensional random vector. Then
a. the covariance between any pair of random variables $X_i$ and $X_j$, denoted by $Cov(X_i, X_j)$, is defined as
$Cov(X_i, X_j) = E\{[X_i - E(X_i)][X_j - E(X_j)]\}$,
provided the expectation exists, where $E(X_i)$ and $E(X_j)$ are the marginal expectations of $X_i$ and $X_j$, respectively;
b. the correlation coefficient between any pair of random variables $X_i$ and $X_j$, denoted by $\rho(X_i, X_j)$, is defined as
$\rho(X_i, X_j) = \dfrac{Cov(X_i, X_j)}{\sqrt{V(X_i)\, V(X_j)}}$,
provided the covariance exists, where $V(X_i) > 0$ and $V(X_j) > 0$ are the variances of $X_i$ and $X_j$, respectively.

Remarks:
1. A little algebra would yield the following computational formula:
$Cov(X_i, X_j) = E[X_i X_j] - E(X_i) E(X_j)$,
for any $i \neq j$, where $i, j = 1, 2, \ldots, k$. Note that if $i = j$, the covariance is simply the variance. Also, the existence of the covariance is assured by the existence of the second moments.
2. The covariance is a measure of how two random variables vary relative to each other (or how closely they move together). However, it possesses one main defect: it is sensitive to the units of measurement of the two random variables. For instance, a pair of random variables $W$ and $Z$ recorded in centimeters will give a larger covariance than the same pair of measurements $X$ and $Y$ expressed in inches, even though the underlying relationship is identical; the covariance of $W$ and $Z$ is larger only because centimeters are smaller units, so the recorded values are numerically larger.
3. To address the problem in Remark 2, the correlation coefficient, a rescaled version of the covariance, can be used instead. The correlation coefficient is a unit-free measure of the inter-relationship between two random variables. It takes values between $-1$ and $+1$. A zero correlation coefficient implies that the random variables are uncorrelated.
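Example 4 above also doubles as an illustration of the computational formula in Remark 1: with the PMF exactly as reconstructed above (so the probabilities 1/2 and 1/4 are part of that reconstruction), $Cov(X, X^2) = E(X^3) - E(X)E(X^2) = 0$ even though $Y = X^2$ is completely determined by $X$. A minimal check:

```python
from fractions import Fraction

# Example 4 sketch (PMF values as reconstructed above): X takes the value 0
# with probability 1/2 and the values -1, +1 with probability 1/4 each;
# Y = X**2.  E(XY) = E(X)E(Y) even though Y is a function of X.
pmf = {0: Fraction(1, 2), -1: Fraction(1, 4), 1: Fraction(1, 4)}
EX  = sum(x * p for x, p in pmf.items())            # 0
EY  = sum(x**2 * p for x, p in pmf.items())         # 1/2
EXY = sum(x * x**2 * p for x, p in pmf.items())     # 0
print(EX * EY == EXY)                               # True
```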

4. Zero covariance (or correlation) is a necessary (but not sufficient) condition for two random variables to be independent. That is, if $X$ and $Y$ are independent, then it necessarily follows that $Cov(X, Y) = 0$. However, $Cov(X, Y) = 0$ is not sufficient to show that $X$ and $Y$ are independent. That is, $X$ and $Y$ may not be independent even if $Cov(X, Y) = 0$.
5. The correlation coefficient may also be viewed as a measure of linear dependence. This is so because $|\rho(X, Y)| = 1$ if, and only if, one random variable is a linear function of the other with probability 1. The sign of $\rho(X, Y)$ is an indication of the direction of the linear dependence, while the magnitude of $\rho(X, Y)$ is a measure of the strength of the linear dependence.
6. The property of "uncorrelatedness" is not transitive. For instance, if the random variable $X$ is uncorrelated with the random variables $Y$ and $Z$, it does not follow that $Y$ and $Z$ are uncorrelated as well.

Examples:
1. Find the covariance and the correlation coefficient between $X$ and $Y$, if $X$ and $Y$ are jointly discrete random variables, with joint PMF given by the following table:

            y = 0    y = 1    y = 2    p_X(x)
   x = 0     3/28     6/28     1/28    10/28
   x = 1     9/28     6/28      0      15/28
   x = 2     3/28      0        0       3/28
   p_Y(y)   15/28    12/28     1/28      1

2. Obtain $Cov(X, Y)$ and $\rho(X, Y)$, if $X$ and $Y$ are jointly continuous random variables, with joint PDF given by $f_{X,Y}(x, y) = 10 x^2 y\, I_{[0,1]}(x)\, I_{[0,x]}(y)$.
3. Consider a bivariate random vector $(X, Y)'$ having a Trinomial distribution consisting of $n$ trials, with parameters $p_1$ and $p_2$, both between 0 and 1, and $0 < p_1 + p_2 < 1$. Find $Cov(X, Y)$ and $\rho(X, Y)$.
4. A fair coin is tossed two times. Define $X$ as the sum of the numbers of heads on the two tosses and $Y$ as their difference. Verify that the variables $X$ and $Y$ are uncorrelated but not independent.
5. Obtain $Cov(X, Y)$ and $\rho(X, Y)$, if $X$ and $Y$ are jointly continuous random variables, with joint PDF given by $f_{X,Y}(x, y) = 8xy\, I_{[0,1]}(x)\, I_{[0,x]}(y)$.
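A sketch for Example 1, computing the covariance and correlation directly from the joint PMF table above via the computational formula $Cov(X, Y) = E(XY) - E(X)E(Y)$. The cell probabilities in 28ths are as reconstructed in the table, so the printed values depend on that reading.

```python
from fractions import Fraction
from math import sqrt

# Example 1 sketch: covariance and correlation from the joint PMF table above.
counts = {(0, 0): 3, (0, 1): 6, (0, 2): 1,
          (1, 0): 9, (1, 1): 6, (2, 0): 3}
pmf = {xy: Fraction(c, 28) for xy, c in counts.items()}

EX  = sum(x * p for (x, _), p in pmf.items())         # 3/4
EY  = sum(y * p for (_, y), p in pmf.items())         # 1/2
EXY = sum(x * y * p for (x, y), p in pmf.items())     # 3/14
VX  = sum(x**2 * p for (x, _), p in pmf.items()) - EX**2
VY  = sum(y**2 * p for (_, y), p in pmf.items()) - EY**2
cov = EXY - EX * EY                                   # -9/56
rho = float(cov) / sqrt(float(VX * VY))               # about -0.447
print(cov, rho)
```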

3.1.4 Some General Results on Linear Combinations

Result #1: If $X_1$ and $X_2$ are two random variables, then for all $a_1, a_2 \in R$,
$V(a_1 X_1 + a_2 X_2) = a_1^2 V(X_1) + a_2^2 V(X_2) + 2 a_1 a_2 Cov(X_1, X_2)$.

Result #2: Let $X_1, \ldots, X_k$ be random variables. Then for all $a_1, \ldots, a_k \in R$,
$V\left(\sum_{i=1}^k a_i X_i\right) = \sum_{i=1}^k a_i^2 V(X_i) + \sum\sum_{i \neq j} a_i a_j Cov(X_i, X_j)$.

Result #3: Let $X_1, \ldots, X_k$ and $Y_1, \ldots, Y_m$ be sets of random variables, and let $a_1, \ldots, a_k \in R$ and $b_1, \ldots, b_m \in R$ be sets of real numbers. Then
$Cov\left(\sum_{i=1}^k a_i X_i,\; \sum_{j=1}^m b_j Y_j\right) = \sum_{i=1}^k \sum_{j=1}^m a_i b_j Cov(X_i, Y_j)$.

Some Particular Cases:
1. $V(X + Y) = V(X) + V(Y) + 2 Cov(X, Y)$
2. $V(X - Y) = V(X) + V(Y) - 2 Cov(X, Y)$
3. $V(aX + bY) = a^2 V(X) + b^2 V(Y) + 2ab\, Cov(X, Y)$
4. $V(X + Y) = V(X) + V(Y)$, if $X$ and $Y$ are independent
5. $V(X - Y) = V(X) + V(Y)$, if $X$ and $Y$ are independent
6. $V(aX + bY) = a^2 V(X) + b^2 V(Y)$, if $X$ and $Y$ are independent
7. If $X_1, \ldots, X_k$ are uncorrelated random variables, then Result #2 reduces to $V\left(\sum_{i=1}^k a_i X_i\right) = \sum_{i=1}^k a_i^2 V(X_i)$.

Examples:
1. Let $X_1, X_2, X_3$ be random variables, each having mean $\mu$ and variance $\sigma^2$, and with given pairwise covariances $Cov(X_1, X_2)$, $Cov(X_1, X_3)$ and $Cov(X_2, X_3)$. Define $U$ as a given linear combination of $X_1$, $X_2$ and $X_3$. Find the mean and the variance of $U$.
2. Let $X_1, \ldots, X_n$ be $n$ independent random variables, each having mean $\mu$ and variance $\sigma^2$. Let $\bar{X}$ denote the sample mean of $X_1, \ldots, X_n$. Find the mean and the variance of $\bar{X}$.
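Example 2 can also be confirmed by simulation. A minimal Monte Carlo sketch follows (the Exponential(1) population is an arbitrary assumption, used only to supply a distribution with known mean and variance):

```python
import numpy as np

# Example 2 sketch: the sample mean of n i.i.d. variables with mean mu and
# variance sigma^2 has mean mu and variance sigma^2 / n.
rng = np.random.default_rng(0)
n, mu, sigma2 = 10, 1.0, 1.0          # Exponential(1): mean 1, variance 1
xbar = rng.exponential(mu, size=(200_000, n)).mean(axis=1)
print(xbar.mean(), xbar.var(), sigma2 / n)   # close to 1.0, 0.1, and 0.1
```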

3.1.5 The Cauchy-Schwartz Inequality

Theorem: Let $X$ and $Y$ be two random variables with finite second (raw) moments. Then the following inequality holds:
$[E(XY)]^2 \leq E(X^2)\, E(Y^2)$,
with equality if, and only if, for some constants $a$ and $b$, $P(Y = a + bX) = 1$, i.e., $Y$ is a linear function of $X$ with probability 1.

Remarks:
1. The Cauchy-Schwartz Inequality can be used to establish bounds on the value of $\rho(X, Y)$. That is, using the theorem, we can show that $|\rho(X, Y)| \leq 1$.
2. Recall that $|\rho(X, Y)| = 1$ has a special significance. If $X$ and $Y$ are random variables such that $Y = a + bX$, where $a$ and $b$ ($b \neq 0$) are constants, then $|\rho(X, Y)| = 1$. More precisely, if $b > 0$, then $\rho(X, Y) = 1$, and if $b < 0$, then $\rho(X, Y) = -1$. What do these imply regarding the behavior of $X$ and $Y$ relative to each other?
3. The converse of Remark 2 also holds with probability 1. If $|\rho(X, Y)| = 1$, then one random variable is a linear function of the other with probability 1.
4. If, on the other hand, $|\rho(X, Y)| < 1$, then there is always a nonzero probability that not all the points $(X, Y)'$ will lie on a straight line, no matter what straight line we draw on the $X$-$Y$ plane ($R^2$). However, this does not preclude the possibility that $X$ and $Y$ may be related in a different fashion, perhaps nonlinearly. Hence, even if $\rho(X, Y) = 0$, it is still possible for $X$ and $Y$ to be dependent.

Examples:
1. Suppose $X$ is a Standard Normal random variable. Define $Y_1$ as a linear function of $X$ with positive slope and $Y_2$ as a linear function of $X$ with negative slope. Verify that $\rho(X, Y_1) = 1$ and $\rho(X, Y_2) = -1$.
2. Let $X$ and $Y$ be standardized random variables (i.e., random variables with zero mean and unit variance) with $Cov(X, Y) = \rho$. Obtain $\rho(X, Y)$, together with the means, variances, and covariance of the sum $X + Y$ and the difference $X - Y$.
3. Let $X_1, \ldots, X_r$, $Y_1, \ldots, Y_s$, $Z_1, \ldots, Z_t$ be uncorrelated random variables, each having unit variance. Derive $\rho(U, V)$, where $U$ is the sum of all the $X$'s and $Y$'s, and $V$ is the sum of all the $Y$'s and $Z$'s.
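A worked sketch for Example 3, using Result #3 of the preceding subsection together with the stated assumptions that all the listed variables are pairwise uncorrelated with unit variance:
$Cov(U, V) = Cov\left( \sum_{i=1}^r X_i + \sum_{j=1}^s Y_j,\; \sum_{j=1}^s Y_j + \sum_{l=1}^t Z_l \right) = \sum_{j=1}^s V(Y_j) = s$,
$V(U) = r + s$ and $V(V) = s + t$, and hence
$\rho(U, V) = \dfrac{s}{\sqrt{(r + s)(s + t)}}$.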

3.2 Conditional Expectation

3.2.1 Definition of Conditional Expectation

A conditional expectation is simply the expectation of a conditional distribution. Or, more descriptively, it is the expectation of a random variable/vector given the value of another. For simplicity, we consider only the bivariate case.

Def'n: Let $(X, Y)'$ be a bivariate discrete or continuous random vector and let $g(X, Y)$ be a function of $X$ and $Y$. The conditional expectation of $g(X, Y)$ given $X = x_0$, denoted as $E[g(X, Y) \mid X = x_0]$, is defined as
$E[g(X, Y) \mid X = x_0] = \sum_y g(x_0, y)\, p_{Y|X}(y \mid x_0)$, for the discrete case;
$E[g(X, Y) \mid X = x_0] = \int_R g(x_0, y)\, f_{Y|X}(y \mid x_0)\, dy$, for the continuous case.

Similarly, if $(X, Y)'$ is a bivariate discrete or continuous random vector and $g(X, Y)$ is a function of $X$ and $Y$, then the conditional expectation of $g(X, Y)$ given $Y = y_0$, denoted as $E[g(X, Y) \mid Y = y_0]$, is defined as
$E[g(X, Y) \mid Y = y_0] = \sum_x g(x, y_0)\, p_{X|Y}(x \mid y_0)$, for the discrete case;
$E[g(X, Y) \mid Y = y_0] = \int_R g(x, y_0)\, f_{X|Y}(x \mid y_0)\, dx$, for the continuous case.

Special Cases:
1. $g(X, Y) = Y$: $E[g(X, Y) \mid X = x_0] = E[Y \mid x_0]$
2. $g(X, Y) = X$: $E[g(X, Y) \mid Y = y_0] = E[X \mid y_0]$
3. $g(X, Y) = [Y - E(Y \mid x_0)]^2$: $E[g(X, Y) \mid x_0] = V[Y \mid x_0]$
4. $g(X, Y) = [X - E(X \mid y_0)]^2$: $E[g(X, Y) \mid y_0] = V[X \mid y_0]$

Remarks:
1. If $g(Y)$ is a function of $Y$ alone, then $E[g(Y) \mid x_0]$ will be a function of $x_0$ and will not involve the random variable $Y$. In the same context, if $g(X)$ is a function of $X$ alone, then $E[g(X) \mid y_0]$ will be a function of $y_0$ and will not involve the random variable $X$.
2. Given a random vector $X = (X_1, \ldots, X_k)'$, the conditional expectation of any sub-vector of $X$ given another sub-vector can be defined analogously. The conditional expectation of a function of the sub-vectors $\mathbf{X}_1$ (with dimension $m$) and $\mathbf{X}_2$, say

$g(\mathbf{X}_1, \mathbf{X}_2)$, given $\mathbf{X}_2 = \mathbf{x}_2$, is obtained by summing over all the mass points of $\mathbf{X}_1$, or integrating with respect to $\mathbf{X}_1$, the product of $g(\mathbf{X}_1, \mathbf{X}_2)$ and the conditional PMF/PDF of $\mathbf{X}_1$ given $\mathbf{X}_2 = \mathbf{x}_2$. That is,
$E[g(\mathbf{X}_1, \mathbf{X}_2) \mid \mathbf{X}_2 = \mathbf{x}_2] = \sum_{\mathbf{x}_1} g(\mathbf{x}_1, \mathbf{x}_2)\, p_{\mathbf{X}_1 | \mathbf{X}_2}(\mathbf{x}_1 \mid \mathbf{x}_2)$, for the discrete case;
$E[g(\mathbf{X}_1, \mathbf{X}_2) \mid \mathbf{X}_2 = \mathbf{x}_2] = \int_{R^m} g(\mathbf{x}_1, \mathbf{x}_2)\, f_{\mathbf{X}_1 | \mathbf{X}_2}(\mathbf{x}_1 \mid \mathbf{x}_2)\, d\mathbf{x}_1$, for the continuous case.

For instance, if $X = (X_1, \ldots, X_k)'$ is a random vector and $\mathbf{X}_1 = (X_1, X_3)'$ and $\mathbf{X}_2 = (X_4, X_6)'$ are sub-vectors of $X$, the conditional expectation of $g(\mathbf{X}_1, \mathbf{X}_2)$, given $\mathbf{X}_2 = \mathbf{x}_2$, is
$E[g(\mathbf{X}_1, \mathbf{X}_2) \mid \mathbf{x}_2] = \sum_{x_1} \sum_{x_3} g(x_1, x_3, x_4, x_6)\, p_{X_1, X_3 | X_4, X_6}(x_1, x_3 \mid x_4, x_6)$ in the discrete case, or
$E[g(\mathbf{X}_1, \mathbf{X}_2) \mid \mathbf{x}_2] = \int_R \int_R g(x_1, x_3, x_4, x_6)\, f_{X_1, X_3 | X_4, X_6}(x_1, x_3 \mid x_4, x_6)\, dx_1\, dx_3$ in the continuous case.

3. Just as conditional distributions satisfy all the properties of ordinary distributions, conditional expectations and conditional variances also satisfy all the properties of ordinary expectations and ordinary variances, respectively. For instance, if $(X, Y)'$ is a bivariate random vector and $a$ and $b$ are real numbers, we have the following results:
a. $E(aX + b \mid y_0) = a E(X \mid y_0) + b$
b. $E(aY + b \mid x_0) = a E(Y \mid x_0) + b$
c. $V(X \mid y_0) = E(X^2 \mid y_0) - [E(X \mid y_0)]^2$
d. $V(Y \mid x_0) = E(Y^2 \mid x_0) - [E(Y \mid x_0)]^2$
e. $V(aX + b \mid y_0) = a^2 V(X \mid y_0)$
f. $V(aY + b \mid x_0) = a^2 V(Y \mid x_0)$

Also, if $g_1(\cdot)$ and $g_2(\cdot)$ are real-valued functions of only one of the random variables, we have the following results:
a. $E[g_1(X) \mid x_0] = g_1(x_0)$
b. $E[g_1(Y) \mid y_0] = g_1(y_0)$
c. $E[g_1(Y) + g_2(Y) \mid x_0] = E[g_1(Y) \mid x_0] + E[g_2(Y) \mid x_0]$
d. $E[g_1(X) + g_2(X) \mid y_0] = E[g_1(X) \mid y_0] + E[g_2(X) \mid y_0]$
e. $E[g_1(X)\, g_2(Y) \mid x_0] = g_1(x_0)\, E[g_2(Y) \mid x_0]$
f. $E[g_1(Y)\, g_2(X) \mid y_0] = g_1(y_0)\, E[g_2(X) \mid y_0]$

Note that if $X$ and $Y$ are independent, then $E(X \mid y_0) = E(X)$ for all mass points $y_0$ of $Y$, and $E(Y \mid x_0) = E(Y)$ for all mass points $x_0$ of $X$. Thus, the conditional mean is equal to the unconditional mean with probability 1. But the converse is not true. That is,

even if the conditional mean is equal to the unconditional mean with probability 1, the variables may not be independent.

Examples:
1. Consider the experiment of tossing two tetrahedra, each with sides labeled 1 to 4. Let $X$ represent the number on the downturned face of the 1st tetrahedron and $Y$ the larger of the two downturned numbers. Find the conditional mean of $Y$ given $X = x_0$ and that of $(X + Y)$ given $X = x_0$, i.e., find $E(Y \mid x_0)$ and $E(X + Y \mid x_0)$.
2. Find the conditional expectation of $Y$ given $X = x_0$, i.e., find $E(Y \mid x_0)$, if $X$ and $Y$ are jointly continuous random variables with joint PDF given by $f_{X,Y}(x, y) = (x + y)\, I_{(0,1)}(x)\, I_{(0,1)}(y)$.
3. Find the conditional mean of $Y_1$ given $Y_2 = 1$, i.e., find $E(Y_1 \mid Y_2 = 1)$, if $Y_1$ and $Y_2$ are jointly continuous random variables with joint PDF given by $f_{Y_1,Y_2}(y_1, y_2) = (1/2)\, I_{(0, y_2)}(y_1)\, I_{(0, 2)}(y_2)$.
4. Let $X$ and $Y$ be jointly continuous random variables with joint PDF given by $f_{X,Y}(x, y) = 8xy\, I_{(0,y)}(x)\, I_{(0,1)}(y)$. Find the conditional mean of $Y$ given $X = x_0$, i.e., $E(Y \mid x_0)$; the conditional mean of $X$ given $Y = y_0$, i.e., $E(X \mid y_0)$; and the conditional mean of $XY$ given $X = x_0$, i.e., $E(XY \mid x_0)$.
5. Find the conditional mean and variance of $X$ given $Y = y_0$, i.e., find $E(X \mid y_0)$ and $V(X \mid y_0)$, if $X$ and $Y$ are jointly continuous random variables with joint PDF given by $f_{X,Y}(x, y) = y\, e^{-xy}\, I_{(0,\infty)}(x)\, I_{(0,1)}(y)$.
6. Show that the conditional mean of $Y$ given $X = x_0$ is equal to its unconditional mean with probability 1, if $(X, Y)'$ is Bivariate Uniform over the unit square.
7. Suppose $(X, Y)'$ has three mass points $(0, 0)$, $(0, 2)$ and $(1, 1)$, each with probability 1/3. Verify that, although $X$ and $Y$ are not independent, $E(Y \mid x_0) = E(Y)$ for every mass point $x_0$ of $X$.
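Example 1 above is small enough to enumerate. A Python sketch (illustrative only): given $X = x_0$, the second tetrahedron is still uniform on $\{1, 2, 3, 4\}$, so $E(Y \mid x_0)$ is the average of the four equally likely values of $\max(x_0, \cdot)$, and $E(X + Y \mid x_0) = x_0 + E(Y \mid x_0)$.

```python
from fractions import Fraction

# Example 1 sketch: two fair tetrahedra; X = number on the first die,
# Y = larger of the two numbers.  Conditional means by direct enumeration
# over the 4 equally likely values of the second die.
for x0 in range(1, 5):
    cond_mean = sum(Fraction(max(x0, t2), 4) for t2 in range(1, 5))
    print(x0, cond_mean, x0 + cond_mean)
    # e.g. E(Y | X = 2) = 11/4 and E(X + Y | X = 2) = 19/4
```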

3.2.2 Expectation by Conditioning

The conditional expectation $E(Y \mid X)$ is a function of the random variable $X$, and the specific value of $E(Y \mid X)$ at $X = x_0$ is equal to $E(Y \mid x_0)$. If $X$ is not fixed to take any specific value, then $E(Y \mid X)$ is itself a random variable.

Theorem: Let $(X, Y)'$ be a bivariate random vector and let $g(\cdot)$ be a real-valued function of $Y$ (or of $X$). Then
$E[g(Y)] = E_X\{E_{Y|X}[g(Y) \mid X]\}$ or $E[g(X)] = E_Y\{E_{X|Y}[g(X) \mid Y]\}$.
These results simplify to:
$E[g(Y)] = \sum_{x_0} E[g(Y) \mid x_0]\, p_X(x_0)$, for the discrete case;
$E[g(Y)] = \int E[g(Y) \mid x]\, f_X(x)\, dx$, for the continuous case;
$E[g(X)] = \sum_{y_0} E[g(X) \mid y_0]\, p_Y(y_0)$, for the discrete case;
$E[g(X)] = \int E[g(X) \mid y]\, f_Y(y)\, dy$, for the continuous case.

Special Cases:
1. $E(Y) = E_X[E(Y \mid X)]$
2. $E(X) = E_Y[E(X \mid Y)]$
3. $V(Y) = E_X[V(Y \mid X)] + V_X[E(Y \mid X)]$
4. $V(X) = E_Y[V(X \mid Y)] + V_Y[E(X \mid Y)]$
(Cases 3 and 4 are the iterated variance formulas.)

Remarks:
1. Special cases #1 and #2 imply that the mean of one random variable is the mean (or expectation) of its conditional means (given the value of another random variable), where the expectation is taken over all the values of the conditioning random variable.
2. Special cases #3 and #4 imply that the variance of one random variable can be expressed as the sum of:
a. the mean (or expectation) of its conditional variances (given the value of another random variable), where the expectation is taken over all the different values of the conditioning random variable, and
b. the variance of its conditional means across the different values of the conditioning random variable.
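A Monte Carlo sketch of Special Cases #1 and #3 for an assumed hierarchical model (the model itself, $X \sim$ Uniform(0, 1) with $Y \mid X = x \sim N(x, 1)$, is chosen purely for illustration): the simulated mean and variance of $Y$ should be close to $E[E(Y \mid X)] = 1/2$ and $E[V(Y \mid X)] + V[E(Y \mid X)] = 1 + 1/12$.

```python
import numpy as np

# Assumed hierarchical model: X ~ Uniform(0, 1), Y | X = x ~ Normal(x, 1).
# Then E(Y) = E[E(Y|X)] = 1/2 and V(Y) = E[V(Y|X)] + V[E(Y|X)] = 1 + 1/12.
rng = np.random.default_rng(1)
x = rng.uniform(size=1_000_000)
y = rng.normal(loc=x, scale=1.0)
print(y.mean(), y.var())        # close to 0.5 and 1.0833
```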

Examples:
1. Let $X$ be a discrete random variable with PMF $p_X(x) = (x/3)\, I_{\{1,2\}}(x)$. Suppose that the conditional PMF of another random variable $Y$, given $X = x$, is Binomial with parameters $n = x$ and $p = 1/2$. Find the joint PMF of $X$ and $Y$. Find the means and variances of $X$ and $Y$.
2. Find the conditional variance of $Y$ given $X = x_0$, i.e., find $V(Y \mid x_0)$, and the unconditional variance of $Y$, i.e., $V(Y)$, if $X$ and $Y$ are jointly continuous random variables with joint PDF given by $f_{X,Y}(x, y) = 8xy\, I_{(0,y)}(x)\, I_{(0,1)}(y)$.
3. Suppose $X_t$, representing the number of phone calls arriving at an exchange during a period of time of length $t$, is a Poisson random variable with parameter $\lambda t$. The probability that an operator will answer any given phone call is equal to $p$. Find the expected number of phone calls that will be answered in a period of time of length $t$.

3.3 Joint Moment Generating Functions and Moments

3.3.1 Definition of Joint Moment Generating Function and Joint Moments

The moment generating function (MGF) of a random variable provides a compact way of representing its moments. In addition, it can often be used to determine the distribution of a function of a random variable. The concept of the MGF can be extended to the multivariate case (i.e., to $k$ random variables) through the so-called joint moment generating function.

Def'n: Let $X = (X_1, \ldots, X_k)'$ be a $k$-dimensional random vector. The joint moment generating function (or joint MGF) of the random variables $X_1, \ldots, X_k$, denoted as $m_{X_1,\ldots,X_k}(t_1, \ldots, t_k)$, is defined as
$m_{X_1,\ldots,X_k}(t_1, \ldots, t_k) = E[\exp\{t_1 X_1 + \cdots + t_k X_k\}] = E\left[\exp\left\{\sum_{i=1}^k t_i X_i\right\}\right]$,
provided the expectation exists for all values $t_1, \ldots, t_k$ such that $-h < t_i < h$ for some $h > 0$, $i = 1, \ldots, k$.

The joint raw moments of the random variables $X_1, \ldots, X_k$ are denoted as
$E\left[X_1^{r_1} X_2^{r_2} \cdots X_k^{r_k}\right]$,
where each $r_i$ is either zero or a positive integer, $i = 1, \ldots, k$.

The joint central moments of the random variables $X_1, \ldots, X_k$ (about their respective means) are denoted as
$E\left[(X_1 - \mu_1)^{r_1} (X_2 - \mu_2)^{r_2} \cdots (X_k - \mu_k)^{r_k}\right]$,
where each $r_i$ is either zero or a positive integer, $i = 1, \ldots, k$.

Special Cases:
1. If $r_i = r_j = 1$ and all other $r_m$'s are equal to zero, then the joint central moment of the random variables $X_1, \ldots, X_k$ becomes the covariance between $X_i$ and $X_j$, i.e.,
$E\left[(X_1 - \mu_1)^0 \cdots (X_i - \mu_i)^1 \cdots (X_j - \mu_j)^1 \cdots (X_k - \mu_k)^0\right] = E[(X_i - \mu_i)(X_j - \mu_j)] = Cov(X_i, X_j)$.
2. If $r_i = 2$ and all other $r_m$'s are equal to zero, then the joint central moment of the random variables $X_1, \ldots, X_k$ becomes the marginal variance of $X_i$, i.e.,
$E\left[(X_1 - \mu_1)^0 \cdots (X_i - \mu_i)^2 \cdots (X_k - \mu_k)^0\right] = E[(X_i - \mu_i)^2] = Var(X_i) = V(X_i)$.

Def'n: For the bivariate case, if $(X, Y)'$ is a bivariate random vector, the joint moment generating function (or joint MGF) of the random variables $X$ and $Y$, denoted as $m_{X,Y}(t_1, t_2)$, is defined as
$m_{X,Y}(t_1, t_2) = E[\exp\{t_1 X + t_2 Y\}]$,
provided the expectation exists for all values $t_1$ and $t_2$ such that, for some real numbers $a > 0$ and $b > 0$, $-a < t_1 < a$ and $-b < t_2 < b$.

The joint raw moments of the random variables $X$ and $Y$ are denoted as $E[X^r Y^s]$, where $r$ and $s$ are either zeros or positive integers. The joint central moments of the random variables $X$ and $Y$ (about their respective means) are denoted as $E[(X - \mu_X)^r (Y - \mu_Y)^s]$, where $r$ and $s$ are either zeros or positive integers.

Special Cases:
1. If $r = s = 1$, then the joint central moment of the random variables $X$ and $Y$ becomes the covariance between $X$ and $Y$, i.e.,
$E[(X - \mu_X)^1 (Y - \mu_Y)^1] = E[(X - \mu_X)(Y - \mu_Y)] = Cov(X, Y)$.
2. If $r = 2$ and $s = 0$, then the joint central moment of the random variables $X$ and $Y$ becomes the marginal variance of $X$, i.e.,
$E[(X - \mu_X)^2 (Y - \mu_Y)^0] = E[(X - \mu_X)^2] = Var(X) = V(X)$.

Remarks:
1. The uniqueness of the MGF carries over to the bivariate and multivariate cases. That is, the joint MGF of $X_1, \ldots, X_k$ uniquely determines their joint distribution; and conversely, if the joint MGF of $X_1, \ldots, X_k$ exists, it is unique.
2. By Remark 1, the joint MGF of $X_1, \ldots, X_k$ uniquely determines the marginal MGF of any of the random variables $X_1, \ldots, X_k$, and thus the marginal distribution of any of the random variables $X_1, \ldots, X_k$. Why? Because
$m_{X_i}(t_i) = m_{X_1,\ldots,X_k}(0, 0, \ldots, t_i, \ldots, 0) = E[\exp\{t_i X_i\}]$.
3. A necessary and sufficient condition for $X_1, \ldots, X_k$ to be independent is
$m_{X_1,\ldots,X_k}(t_1, t_2, \ldots, t_k) = m_{X_1}(t_1)\, m_{X_2}(t_2) \cdots m_{X_k}(t_k)$.

3.3.2 Generation of Moments

Result #1 (Joint Raw Moments). Let $X = (X_1, \ldots, X_k)'$ be a $k$-dimensional random vector. The joint raw moments of $X_1, \ldots, X_k$, denoted $E(X_1^{r_1} X_2^{r_2} \cdots X_k^{r_k})$, can be obtained from the joint MGF of $X_1, \ldots, X_k$ by differentiating the joint MGF $r_1$ times with respect to $t_1$, $r_2$ times with respect to $t_2$, and so on, and $r_k$ times with respect to $t_k$. The limit of the resulting derivative is then taken as all the $t_i$'s go to zero. This may be extended to the case of the joint raw moments of any sub-vector of $X$.

Bivariate Case: Let $(X, Y)'$ be a bivariate random vector. The joint $(r, s)$th raw moment of $X$ and $Y$, denoted $E(X^r Y^s)$, can be obtained from the joint MGF of $X$ and $Y$ by differentiating the joint MGF $r$ times with respect to $t_1$ and $s$ times with respect to $t_2$, and then taking the limit of the resulting derivative as both $t_1$ and $t_2$ go to zero.

Result #2 (Marginal Raw Moments). Let $X = (X_1, \ldots, X_k)'$ be a $k$-dimensional random vector. The $r$th marginal raw moment of $X_i$, denoted $E(X_i^r)$, can be obtained from the joint MGF of $X_1, \ldots, X_k$ by differentiating the joint MGF $r$ times with respect to $t_i$. The limit of the resulting derivative is then taken as all the $t_i$'s go to zero.

Bivariate Case: Let $(X, Y)'$ be a bivariate random vector. The $r$th marginal raw moment of $X$, denoted $E(X^r)$, can be obtained from the joint MGF of $X$ and $Y$ by differentiating the joint MGF $r$ times with respect to $t_1$, and then taking the limit of the resulting derivative as both $t_1$ and $t_2$ go to zero.

Remarks:
1. The joint central moments may be obtained directly from the joint raw moments, since the former will always be a function of the latter. For instance, the joint $(1,1)$th central moment of $X$ and $Y$ about their respective means, denoted $E[(X - \mu_X)^1 (Y - \mu_Y)^1]$, is just $Cov(X, Y)$, which can be expressed as a function of the joint raw moment $E(XY)$ and the marginal raw moments $E(X)$ and $E(Y)$.
2. The marginal central moments may be obtained directly from the marginal raw moments, since the former will always be a function of the latter. For instance, the joint $(2,0)$th central moment of $X$ and $Y$ about their respective means, denoted $E[(X - \mu_X)^2 (Y - \mu_Y)^0] = E[(X - \mu_X)^2]$, is just $V(X)$, which can be expressed as a function of the marginal raw moments $E(X^2)$ and $E(X)$.

Examples:
1. Find the joint MGF of $X$ and $Y$, if $X$ and $Y$ are jointly continuous random variables with joint PDF given by $f_{X,Y}(x, y) = e^{-y}\, I_{(0,y)}(x)\, I_{(0,\infty)}(y)$. Find the marginal MGFs, the marginal means and variances of $X$ and $Y$. Find the covariance between $X$ and $Y$.
2. Find the joint MGF of $U$ and $V$, if $U$ and $V$ are jointly continuous random variables with joint PDF given by $f_{U,V}(u, v) = e^{-(u+v)}\, I_{(0,\infty)}(u)\, I_{(0,\infty)}(v)$. Find the covariance between $U$ and $V$. Are $U$ and $V$ independent?
3. Let $X$, $Y$, $Z$ be 3 random variables (either discrete or continuous), with joint MGF denoted $m_{X,Y,Z}(t_1, t_2, t_3)$. Show that $m_{X,Y,Z}(t, t, t)$ gives the MGF of $(X + Y + Z)$ and that $m_{X,Y,Z}(t, t, 0)$ gives the MGF of $(X + Y)$.
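For Example 1, the joint MGF works out to $m_{X,Y}(t_1, t_2) = \dfrac{1}{(1 - t_1 - t_2)(1 - t_2)}$ for $t_1 + t_2 < 1$ and $t_2 < 1$ (a worked value stated here without derivation). Taking that expression as given, a SymPy sketch recovers the moments by differentiation, as described in Results #1 and #2 above:

```python
import sympy as sp

# Example 1 sketch: moments obtained by differentiating the joint MGF
# m(t1, t2) = 1 / ((1 - t1 - t2) * (1 - t2)) and evaluating at t1 = t2 = 0.
t1, t2 = sp.symbols('t1 t2')
m = 1 / ((1 - t1 - t2) * (1 - t2))

EX  = sp.diff(m, t1).subs({t1: 0, t2: 0})      # E(X)   = 1
EY  = sp.diff(m, t2).subs({t1: 0, t2: 0})      # E(Y)   = 2
EXY = sp.diff(m, t1, t2).subs({t1: 0, t2: 0})  # E(XY)  = 3
EX2 = sp.diff(m, t1, 2).subs({t1: 0, t2: 0})   # E(X^2) = 2
print(EX, EY, EXY, EXY - EX * EY)              # covariance = 1
```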

4. Let $(X, Y)'$ be a bivariate random vector having a Trinomial distribution consisting of $n$ trials, with parameters $p_1$ and $p_2$, both between 0 and 1, and $0 < p_1 + p_2 < 1$. Find the joint MGF of $X$ and $Y$. Find the covariance between $X$ and $Y$. Using the joint MGF of $X$ and $Y$, show that the marginal distribution of $X$ is $B(n, p_1)$.

3.4 Bivariate Normal Distribution

3.4.1 Density Function

Def'n: A bivariate continuous random vector $(X, Y)'$ is said to have a Bivariate Normal Distribution if, and only if, the joint PDF of $X$ and $Y$ is given by
$f_{X,Y}(x, y) = \dfrac{1}{2\pi \sigma_X \sigma_Y \sqrt{1 - \rho^2}} \exp\left\{ -\dfrac{1}{2(1 - \rho^2)} \left[ \left(\dfrac{x - \mu_X}{\sigma_X}\right)^2 - 2\rho \left(\dfrac{x - \mu_X}{\sigma_X}\right)\left(\dfrac{y - \mu_Y}{\sigma_Y}\right) + \left(\dfrac{y - \mu_Y}{\sigma_Y}\right)^2 \right] \right\} I_R(x)\, I_R(y)$,
where $\mu_X$, $\mu_Y$, $\sigma_X$, $\sigma_Y$ and $\rho$ are constants such that $-\infty < \mu_X < \infty$, $-\infty < \mu_Y < \infty$, $\sigma_X > 0$, $\sigma_Y > 0$, and $-1 < \rho < 1$. We write $(X, Y)' \sim BVN(\mu_X, \mu_Y, \sigma_X^2, \sigma_Y^2, \rho)$.

3.4.2 (Joint) Moment Generating Functions and Moments

Def'n: Let $(X, Y)'$ be a bivariate continuous random vector having a Bivariate Normal distribution, i.e., $(X, Y)' \sim BVN(\mu_X, \mu_Y, \sigma_X^2, \sigma_Y^2, \rho)$. The joint moment generating function (or joint MGF) of the random variables $X$ and $Y$ is defined, for all $t_1 \in R$ and $t_2 \in R$, as
$m_{X,Y}(t_1, t_2) = \exp\left\{ \mu_X t_1 + \mu_Y t_2 + (1/2)\left(\sigma_X^2 t_1^2 + 2\rho \sigma_X \sigma_Y t_1 t_2 + \sigma_Y^2 t_2^2\right) \right\}$.

Remarks:
1. If $(X, Y)' \sim BVN(\mu_X, \mu_Y, \sigma_X^2, \sigma_Y^2, \rho)$ with $\rho = 0$, then $X$ and $Y$ are independent Normal random variables. This result implies that, for the Bivariate Normal case, uncorrelatedness implies independence. Thus, uncorrelatedness is both a necessary and sufficient condition for independence in the Bivariate Normal case.
2. However, if $X$ and $Y$ are two (univariate) Normal random variables that are not jointly (bivariate) Normal, then uncorrelatedness will not necessarily imply independence, as Example 3 of the next subsection illustrates.
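The moment-generation results of Section 3.3.2 can be applied symbolically to the Bivariate Normal MGF above. A SymPy sketch (illustrative only; the symbol names stand for the BVN parameters) recovers $E(X) = \mu_X$ and $Cov(X, Y) = \rho \sigma_X \sigma_Y$:

```python
import sympy as sp

# Differentiate the Bivariate Normal joint MGF to recover low-order moments.
t1, t2, mx, my, sx, sy, rho = sp.symbols('t1 t2 mu_X mu_Y sigma_X sigma_Y rho')
m = sp.exp(mx*t1 + my*t2
           + sp.Rational(1, 2)*(sx**2*t1**2 + 2*rho*sx*sy*t1*t2 + sy**2*t2**2))

EX  = sp.diff(m, t1).subs({t1: 0, t2: 0})        # mu_X
EY  = sp.diff(m, t2).subs({t1: 0, t2: 0})        # mu_Y
EXY = sp.diff(m, t1, t2).subs({t1: 0, t2: 0})    # mu_X*mu_Y + rho*sigma_X*sigma_Y
cov = sp.simplify(EXY - EX * EY)
print(cov)                                       # rho*sigma_X*sigma_Y
```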

3.4.3 Marginal and Conditional Densities

Theorem: Let $(X, Y)'$ be a bivariate continuous random vector having a Bivariate Normal distribution, i.e., $(X, Y)' \sim BVN(\mu_X, \mu_Y, \sigma_X^2, \sigma_Y^2, \rho)$. Then the marginal distributions of $X$ and $Y$ are each (univariate) Normal distributions, i.e., $X \sim N(\mu_X, \sigma_X^2)$ and $Y \sim N(\mu_Y, \sigma_Y^2)$.

Remarks:
1. The theorem states that the marginal distributions of a Bivariate Normal distribution are each univariate Normal. This result can also be generalized to the Multivariate Normal case.
2. It is important to note, however, that it is quite possible for a bivariate distribution which is NOT Bivariate Normal to have univariate Normal marginal distributions. That is, even if the marginal distributions of two random variables are each univariate Normal, their joint distribution will not necessarily be Bivariate Normal.

Theorem: Let $(X, Y)'$ be a bivariate continuous random vector having a Bivariate Normal distribution, i.e., $(X, Y)' \sim BVN(\mu_X, \mu_Y, \sigma_X^2, \sigma_Y^2, \rho)$. Then the conditional distribution of $Y$ given $X = x_0$ is univariate Normal, and similarly, the conditional distribution of $X$ given $Y = y_0$ is univariate Normal, i.e.,
$Y \mid x_0 \sim N\left( \mu_Y + \rho \dfrac{\sigma_Y}{\sigma_X}(x_0 - \mu_X),\; \sigma_Y^2 (1 - \rho^2) \right)$,
$X \mid y_0 \sim N\left( \mu_X + \rho \dfrac{\sigma_X}{\sigma_Y}(y_0 - \mu_Y),\; \sigma_X^2 (1 - \rho^2) \right)$.

Remarks:
1. The theorem states that, for a Bivariate Normal random vector, the conditional distribution of one random variable given a specific value of the other is univariate Normal.
2. From the theorem above, the conditional means and variances of $Y$ given $X = x_0$, or of $X$ given $Y = y_0$, can be easily obtained.

Examples:
1. Let $(X, Y)' \sim BVN(\mu_X, \mu_Y, \sigma_X^2, \sigma_Y^2, \rho)$. Find the marginal MGFs, the marginal means and marginal variances of $X$ and $Y$. Find the covariance and correlation between $X$ and $Y$. Using conditioning, find the marginal means and marginal variances of $X$ and $Y$.
2. Show that the marginal PDFs of $X$ and $Y$ are each Standard Normal, if $X$ and $Y$ are jointly continuous random variables with joint PDF given by:

$f_{X,Y}(x, y) = (1/\pi)\, e^{-(x^2 + y^2)/2} \left[ I_{(0,\infty)}(x)\, I_{(0,\infty)}(y) + I_{(-\infty,0)}(x)\, I_{(-\infty,0)}(y) \right]$.

3. Suppose $X \sim N(0, 1)$ and consider tossing a fair coin once. Define the random variable $Y = X$, if a head shows up, and $Y = -X$, otherwise. Note that $Y$ also has a Standard Normal distribution. Why? Verify that $X$ and $Y$ are uncorrelated but are not independent.
4. Let $(X, Y)' \sim BVN(5, 10, 1, 25, \rho)$. If $\rho > 0$, find $\rho$ such that $P(4 < Y < 16 \mid X = 5) = 0.954$.
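A numerical sketch for Example 4, taking the parameter values and the probability statement exactly as reconstructed above: given $X = 5 = \mu_X$, the conditional distribution is $Y \mid X = 5 \sim N(10,\; 25(1 - \rho^2))$, so the requirement $P(4 < Y < 16 \mid X = 5) = 0.954$ pins down $\rho$.

```python
from scipy.stats import norm
from scipy.optimize import brentq

# Example 4 sketch (values as reconstructed above): (X, Y)' ~ BVN(5, 10, 1, 25, rho).
# Given X = 5 = mu_X, Y | X = 5 ~ N(10, 25 * (1 - rho**2)); solve for rho > 0.
def prob(rho):
    sd = 5 * (1 - rho**2) ** 0.5
    return norm.cdf(16, loc=10, scale=sd) - norm.cdf(4, loc=10, scale=sd)

rho = brentq(lambda r: prob(r) - 0.954, 1e-6, 1 - 1e-6)
print(rho)        # approximately 0.8
```

Since $P(|Z| < 2) \approx 0.9545$ for a standard normal $Z$, the condition amounts to $6 / (5\sqrt{1 - \rho^2}) \approx 2$, i.e., $\rho \approx 0.8$.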
