MATH2715: Statistical Methods
Kathryn Ross
MATH2715: Statistical Methods

Exercises VI (based on lectures 11-12, work week 7, hand in lecture Mon 14 Nov)

ALL questions count towards the continuous assessment for this module.

Q1. The random variable X has a discrete uniform distribution, taking values -1 and +1, each with probability 0.5. Show that X has mean zero and variance unity. Show that the moment generating function of X is

    m_X(t) = E[e^{tX}] = (1/2)e^{-t} + (1/2)e^{t}.

Q2. Suppose that X_1, ..., X_n are mutually independent discrete uniform random variables taking values -1 and +1, each with probability 0.5. Let S_n = X_1 + ... + X_n. The moment generating function of X_i is m_X(t) = (1/2)e^{-t} + (1/2)e^{t}. Write down the moment generating function of S_n. Suppose that Y = S_n/√n has moment generating function m_Y(t). Show that

    m_Y(t) = ((1/2)e^{-t/√n} + (1/2)e^{t/√n})^n.

Obtain log m_Y(t) and deduce that, as n → ∞, log m_Y(t) → (1/2)t². What do you conclude about the distribution of Y for large n?

Hint: Recall that e^θ = 1 + θ + θ²/2! + θ³/3! + ....

Q3. The random variables X and Y are independent and each has an exponential distribution with parameter λ = 1. Let U = X - Y. Prove that the moment generating function of U is m_U(t) = 1/(1 - t²). Obtain the mean and variance of U.

Hint: If X ~ exponential(λ), then the moment generating function is m_X(t) = λ/(λ - t) for t < λ.

Q4. The random variables X and Y are independent and each has an exponential distribution with parameter λ = 1. Let U = X - Y. Prove that the characteristic function of U is φ_U(t) = 1/(1 + t²).

Hint: Recall that φ_U(t) = E[e^{itU}].

Q5. The random variables X and Y are independent and each has an exponential distribution with parameter λ = 1. Let U = X - Y. It can be shown that U has probability density function f_U(u) = (1/2)e^{-|u|} for -∞ < u < ∞. The Fourier inversion theorem relates a density function f_U(u) and the corresponding characteristic function φ_U(t) by

    f_U(u) = (1/2π) ∫_{-∞}^{∞} e^{-itu} φ_U(t) dt.

By considering what you get if you put z = -t, deduce that the characteristic function of the Cauchy distribution is e^{-|t|}.

Hint: You don't need to evaluate any integrals!
Background Notes: Central limit theorem

Another integration result (NOT necessary to learn):

    ∫_{-∞}^{∞} e^{itx}/(π(1 + x²)) dx = ∫_{-∞}^{∞} (cos(tx) + i sin(tx))/(π(1 + x²)) dx
        = ∫_{-∞}^{∞} cos(tx)/(π(1 + x²)) dx, as the imaginary part cancels out,
        = (2/π) ∫_0^{∞} cos(tx)/(1 + x²) dx
        = e^{-|t|}

on using Cauchy's residue theorem (see MATH206).

History of the central limit theorem (NOT necessary to learn):

For the case where X ~ Bin(n, θ = 1/2) and n is large, De Moivre (1667-1754) by 1730 had discovered the approximation

    pr{X = n/2} ≈ √(2/(nπ)).

The result follows because Bin(n, θ = 1/2) ≈ N(n/2, n/4). By 1733 he had extended the result to showing, in modern notation, that if X ~ Bin(n, θ = 1/2), then

    lim_{n→∞} pr{a < (X - n/2)/((1/2)√n) < b} = ∫_a^b (1/√(2π)) e^{-t²/2} dt.

In 1812 Laplace generalised this result to Bin(n, θ) random variables,

    lim_{n→∞} pr{a < (X - nθ)/√(nθ(1 - θ)) < b} = ∫_a^b (1/√(2π)) e^{-t²/2} dt.

This is sometimes referred to as the De Moivre-Laplace theorem.

Recall that X ~ Bin(n, θ) can be thought of as the sum, X = X_1 + X_2 + ... + X_n, of n independent Bernoulli trials, where

    X_i = 1 with probability θ, 0 with probability 1 - θ.

Thus the De Moivre-Laplace theorem can be written as

    lim_{n→∞} pr{a < (X_1 + X_2 + ... + X_n - nθ)/√(nθ(1 - θ)) < b} = ∫_a^b (1/√(2π)) e^{-t²/2} dt.

In 1887 Chebyshev generalised this result but his proof was improved by one of his students, Markov. In 1901 another of his students, Aleksandr Lyapunov (1857-1918), further generalised the De Moivre-Laplace theorem. Later workers removed some of the assumptions made in earlier proofs and in 1922 Jarl Lindeberg (1876-1932) proved what is now known as the central limit theorem: a sequence of n independent and identically distributed random variables X_1, X_2, ..., X_n, each with mean µ and finite variance σ², satisfies

    lim_{n→∞} pr{a < (X_1 + X_2 + ... + X_n - nµ)/√(nσ²) < b} = ∫_a^b (1/√(2π)) e^{-t²/2} dt.
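De Moivre's 1730 approximation is easy to check numerically. The sketch below is an illustration, not part of the original notes, and uses Python rather than the R that appears later in the notes; the function names are mine. It compares the exact central binomial probability C(n, n/2)/2^n with √(2/(nπ)).

```python
import math

def central_binom_prob(n):
    """Exact pr{X = n/2} for X ~ Bin(n, 1/2), n even."""
    return math.comb(n, n // 2) / 2**n

def de_moivre_approx(n):
    """De Moivre's 1730 approximation sqrt(2 / (n pi))."""
    return math.sqrt(2.0 / (n * math.pi))

for n in (10, 100, 1000):
    print(n, central_binom_prob(n), de_moivre_approx(n))
# the ratio of exact to approximate tends to 1 as n grows
```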
Further reading for lecture

Rice, J.A. (1995) Mathematical Statistics and Data Analysis (2nd edition), sections 3.6, 4.1, 4.5, 5.3.

Hogg, R.V., McKean, J.W. and Craig, A.T. (2005) Introduction to Mathematical Statistics (6th edition), sections 1.9, 4.4.

Larsen, R.J. and Marx, M.L. (2001) An Introduction to Mathematical Statistics and its Applications, section 4.3.

Miller, I. and Miller, M. (2004) John E. Freund's Mathematical Statistics with Applications, sections
MATH2715: Statistical Methods Worked Examples VI

Worked Example: The random variables X and Y are independent and have moment generating functions m_X(t) and m_Y(t) respectively. Obtain the moment generating function of U = aX + bY.

Answer:

    m_U(t) = E[e^{tU}] = E[e^{t(aX + bY)}] = E[e^{(at)X} e^{(bt)Y}] = E[e^{(at)X}] E[e^{(bt)Y}]

by independence of X and Y. Thus m_U(t) = m_X(at) m_Y(bt).

Worked Example: Let X_1, ..., X_n be independent and identically distributed exponential(λ) random variables. If X̄_n = (1/n) Σ_{i=1}^n X_i, show that, as n → ∞,

    Z_n = (X̄_n - E[X̄_n]) / Stdev[X̄_n] →d N(0, 1).

Answer: If X_i ~ exponential(λ), then E[X_i] = 1/λ, Var[X_i] = 1/λ² so E[X̄_n] = 1/λ, Var[X̄_n] = 1/(nλ²). The moment generating function of X_i is m_{X_i}(t) = λ/(λ - t) so that the moment generating function of S_n = X_1 + ... + X_n is m_{S_n}(t) = (λ/(λ - t))^n.

Now notice

    Z_n = (X̄_n - E[X̄_n]) / Stdev[X̄_n] = (S_n/n - 1/λ) / √(1/(nλ²)) = λS_n/√n - √n.

Recalling that U = aY + b has moment generating function

    m_U(t) = E[e^{tU}] = E[e^{t(aY + b)}] = E[e^{bt} e^{(at)Y}] = e^{bt} E[e^{(at)Y}] = e^{bt} m_Y(at)

shows that, with Y = S_n, U = Z_n, a = λ/√n, and b = -√n,

    m_{Z_n}(t) = e^{-√n t} (λ/(λ - λt/√n))^n = e^{-√n t} (1/(1 - t/√n))^n.

Hence

    log m_{Z_n}(t) = -√n t - n log(1 - t/√n).

Recall that, for small δ,

    log(1 - δ) = -δ - δ²/2 - δ³/3 - ...

so

    log m_{Z_n}(t) = -√n t + n(t/√n + t²/(2n) + t³/(3n^{1.5}) + ...) = t²/2 + t³/(3√n) + ....

As n → ∞, log m_{Z_n}(t) → t²/2 so m_{Z_n}(t) → e^{t²/2}, which is the moment generating function of a N(0,1) random variable.
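The convergence log m_{Z_n}(t) → t²/2 in the worked example above can be watched numerically. The following Python sketch (illustrative only, not part of the notes; the function name is mine) evaluates log m_{Z_n}(t) = -√n t - n log(1 - t/√n) for increasing n.

```python
import math

def log_mgf_zn(t, n):
    """log MGF of the standardised exponential(lam) sample mean:
    log m_{Z_n}(t) = -sqrt(n) t - n log(1 - t / sqrt(n)).
    (lam cancels out of the standardised quantity.)"""
    # valid while t < sqrt(n), so that 1 - t/sqrt(n) > 0
    return -math.sqrt(n) * t - n * math.log(1.0 - t / math.sqrt(n))

t = 0.5
for n in (10, 100, 10_000):
    print(n, log_mgf_zn(t, n))
# values approach t^2/2 = 0.125; the leading error term is t^3/(3 sqrt(n))
```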
Worked Example: Suppose X_1, ..., X_n are independent and identically distributed Poisson(µ) random variables. Show that, as n → ∞,

    Z_n = (X̄_n - E[X̄_n]) / Stdev[X̄_n] →d N(0, 1).

Answer: If X_i ~ Poisson(µ), then E[X_i] = µ and Var[X_i] = µ so E[X̄_n] = µ and Var[X̄_n] = µ/n. The moment generating function of X_i is m_{X_i}(t) = exp(µ(e^t - 1)) so that the moment generating function of S_n = X_1 + ... + X_n is

    m_{S_n}(t) = (exp(µ(e^t - 1)))^n = exp(nµ(e^t - 1)).

Now notice

    Z_n = (X̄_n - E[X̄_n]) / Stdev[X̄_n] = (S_n/n - µ) / √(µ/n) = (S_n - nµ)/√(nµ).

Recalling that U = aY + b has moment generating function

    m_U(t) = E[e^{tU}] = E[e^{t(aY + b)}] = E[e^{bt} e^{(at)Y}] = e^{bt} E[e^{(at)Y}] = e^{bt} m_Y(at)

shows that, with Y = S_n, U = Z_n, a = 1/√(nµ) and b = -√(nµ),

    m_{Z_n}(t) = e^{-√(nµ) t} exp(nµ(e^{t/√(nµ)} - 1)).

Hence

    log m_{Z_n}(t) = -√(nµ) t + nµ(e^{t/√(nµ)} - 1) = -√(nµ) t - nµ + nµ e^{t/√(nµ)}.

Recall that, for small δ, e^δ = 1 + δ + δ²/2! + δ³/3! + ... so

    log m_{Z_n}(t) = -√(nµ) t - nµ + nµ(1 + t/√(nµ) + t²/(2nµ) + t³/(6(nµ)^{1.5}) + ...)

giving

    log m_{Z_n}(t) = t²/2 + t³/(6 n^{0.5} µ^{0.5}) + ....

As n → ∞, log m_{Z_n}(t) → t²/2 so m_{Z_n}(t) → e^{t²/2}, which is the moment generating function of a N(0,1) random variable.

Worked Example: Random variables X and Y are independent and each has a uniform(0,1) distribution. Let U = X - Y. Obtain the characteristic function and probability density function f_U(u) of U. Use the Fourier inversion theorem to deduce the characteristic function of the random variable Z with probability density function f_Z(z) = (1 - cos z)/(πz²), where -∞ < z < +∞.

Hint: The Fourier inversion theorem relates a density function f_U(u) and the characteristic function φ_U(t) by

    f_U(u) = (1/2π) ∫_{-∞}^{∞} e^{-itu} φ_U(t) dt.

Answer:
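The Poisson limit derived above can be checked numerically by evaluating log m_{Z_n}(t) = -√(nµ) t + nµ(e^{t/√(nµ)} - 1) directly for increasing n. This is an illustrative Python sketch, not part of the notes; the function name is mine.

```python
import math

def log_mgf_zn_poisson(t, n, mu):
    """log MGF of the standardised Poisson(mu) sample mean:
    log m_{Z_n}(t) = -sqrt(n mu) t + n mu (exp(t / sqrt(n mu)) - 1)."""
    s = math.sqrt(n * mu)  # sqrt(n mu)
    return -s * t + n * mu * (math.exp(t / s) - 1.0)

t, mu = 1.0, 1.0
for n in (100, 10_000, 1_000_000):
    print(n, log_mgf_zn_poisson(t, n, mu))
# values approach t^2/2 = 0.5; the leading error term is t^3/(6 sqrt(n mu))
```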
Characteristic function of U: If X ~ uniform(0,1), then f_X(x) = 1 for 0 < x < 1. Hence X (and Y) has moment generating function

    m_X(t) = E[e^{tX}] = ∫_0^1 e^{tx} f_X(x) dx = [e^{tx}/t]_{x=0}^{1} = (e^t - 1)/t, -∞ < t < ∞.

The mgf of U is m_U(t) = E[e^{tU}] = E[e^{t(X - Y)}] = E[e^{tX} e^{-tY}] = E[e^{tX}] E[e^{(-t)Y}] = m_X(t) m_Y(-t) as X, Y are independent. Hence

    m_U(t) = ((e^t - 1)/t)((e^{-t} - 1)/(-t)) = (e^t + e^{-t} - 2)/t².

Thus the characteristic function of U is

    φ_U(t) = E[e^{itU}] = m_U(it) = (e^{it} + e^{-it} - 2)/(it)² = ((cos t + i sin t) + (cos t - i sin t) - 2)/(-t²) = 2(1 - cos t)/t².

Joint pdf of (U, V): As X and Y are independent, f_XY(x, y) = f_X(x) f_Y(y) = 1 for 0 < x, y < 1. Put u = x - y, v = y so x = u + v and y = v with Jacobian

    J = ∂(x, y)/∂(u, v) = det [ ∂x/∂u  ∂x/∂v ; ∂y/∂u  ∂y/∂v ] = det [ 1  1 ; 0  1 ] = 1.

Hence f_UV(u, v) = f_XY(x, y) |J| = 1.

Range of U and V: 0 < x < 1, 0 < y < 1 implies -1 < x - y < 1, so -1 < u < 1 while 0 < v < 1. But also 0 < x < 1 implies -y < x - y < 1 - y, so -v < u < 1 - v, giving -u < v < 1 - u. Clearly max(0, -u) < v < min(1, 1 - u).

Figure 30 (left) shows the (x, y) region. The elemental strip B (the line y = x - u for 0 < x < 1 with u > 0) is defined for y ∈ (0, 1 - u). The elemental strip A (the line y = x - u for 0 < x < 1 with u < 0) is defined for y ∈ (-u, 1). Figure 30 (right) shows the mapping in the (u, v) plane; the corresponding elemental strips A*, B* are now vertical strips. When u > 0, the elemental strip B* is defined for v ∈ (0, 1 - u). When u < 0, the elemental strip A* is defined for v ∈ (-u, 1).

Figure 30: mapping u = x - y, v = y for x, y ∈ (0, 1), showing (left) the x-y plane and (right) the u-v plane.
Probability density function of U:

    f_U(u) = ∫ f_UV(u, v) dv = ∫_{v = max(0, -u)}^{min(1, 1 - u)} dv = [v]_{v = max(0, -u)}^{min(1, 1 - u)}.

There are thus two cases to consider:

    [v]_{v = -u}^{1} = 1 + u if u < 0;  [v]_{v = 0}^{1 - u} = 1 - u if u ≥ 0.

Hence f_U(u) = 1 - |u| for -1 < u < 1.

Fourier inversion theorem: The Fourier inversion theorem gives

    f_U(u) = (1/2π) ∫_{t = -∞}^{∞} e^{-itu} φ_U(t) dt.

Thus

    1 - |u| = (1/2π) ∫_{t = -∞}^{∞} e^{-itu} 2(1 - cos t)/t² dt.

Putting t = -z and dt = -dz shows that

    1 - |u| = ∫_{z = -∞}^{∞} e^{izu} (1 - cos z)/(πz²) dz,

so that the characteristic function of a random variable Z having probability density function f_Z(z) = (1 - cos z)/(πz²) is φ_Z(u) = 1 - |u| for -1 < u < 1 (and zero for |u| ≥ 1, since f_U(u) = 0 there). Clearly f_Z(z) is a probability density function since f_Z(z) ≥ 0 for all z and φ_Z(0) = 1. It is shown in figure 31 (right).

Figure 31: (left) plot of f_U(u) = 1 - |u| for -1 < u < 1; (right) plot of the probability density function f_Z(z). [The probability density function f_Z(z) can be plotted using the following R command: curve((1-cos(x))/(pi*x*x),-30,30,xlab="z",ylab="pdf")]
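A quick seeded simulation can corroborate the result above: for U = X - Y with X, Y independent uniform(0,1), the real part of E[e^{itU}] should match φ_U(t) = 2(1 - cos t)/t², and E[U] should be 0. This Python sketch is illustrative only and not part of the notes (which use R); the tolerance and sample size are my choices.

```python
import math
import random

random.seed(42)  # fixed seed so the check is reproducible
n = 200_000
u = [random.random() - random.random() for _ in range(n)]  # U = X - Y

t = 2.0
empirical = sum(math.cos(t * ui) for ui in u) / n  # Re E[e^{itU}], since Im part averages to 0
theory = 2.0 * (1.0 - math.cos(t)) / (t * t)       # 2(1 - cos t)/t^2
print(empirical, theory)  # should agree to roughly Monte Carlo accuracy
```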
MATH2715: Solutions to exercises III

Q1. f_X(x) = 1 for 0 < x < 1. If u = -log x, then x = e^{-u} and dx/du = -e^{-u} so that

    f_U(u) = f_X(x) |dx/du| = 1 × e^{-u} = e^{-u}, u > 0.

This is the pdf of an exponential random variable with parameter λ = 1 and mean 1/λ = 1.

Q2. (a) f_X(x) = 1 for 0 < x < 1. The transformation u = x² gives x = √u so dx/du = 1/(2√u) so that

    f_U(u) = f_X(x) |dx/du| = 1/(2√u), 0 < u < 1.

The range for u follows because if 0 < x < 1, then 0 < u < 1.

(b) f_X(x) = 1/4 for -2 < x < 2. The transformation u = x² now gives x = ±√u so dx/du = ±1/(2√u). The transformation is no longer a 1-1 mapping: both x = +√u and x = -√u give the same u-value. Hence split the problem into the two cases x < 0 and x > 0, where the mapping is 1-1. This gives

    f_U(u) = f_X(x = -√u) |dx/du|_{x = -√u} + f_X(x = +√u) |dx/du|_{x = +√u}.

In both cases the absolute value of the Jacobian is the same. [69] Thus

    f_U(u) = (1/4)(1/(2√u)) + (1/4)(1/(2√u)) = 1/(4√u), 0 < u < 4.

The range of U follows because -2 < x < 2 implies 0 < u < 4.

Q3. u = log x so x = e^u and dx/du = e^u. Since x > 0, u = log x satisfies -∞ < u < +∞.

    f_U(u) = f_X(x) |dx/du| = e^{-x} e^u = exp(-e^u) e^u = exp(u - e^u), -∞ < u < ∞.

Sketching pdf: df_U(u)/du = (1 - e^u) exp(u - e^u) so df_U(u)/du = 0 at u = 0. As u → ±∞, f_U(u) → 0. The pdf can be evaluated for different u values and is shown in figure 32.

Q4. (a) f_X(x) = (1/2)e^{-x/2} for x > 0. The mapping u = √x is a 1-1 mapping with inverse transformation x = u² and dx/du = 2u.

    f_U(u) = f_X(x) |dx/du| = (1/2)e^{-u²/2} × 2u = u e^{-u²/2}, u > 0.

69. In some more complicated situations the gradient making up the Jacobian might be very different in the different parts of the mapping. For example, consider the mapping u = x² if x > 0 but u = -x if x < 0. The two values x = +√u and x = -u map to the same u-value but the gradient dx/du equals 1/(2√u) if x = +√u and equals -1 if x = -u.
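The Q1 transformation is the inverse-transform method in disguise: U = -log X turns a uniform(0,1) variable into an exponential(1) one. A seeded Python sketch (illustrative only, not part of the notes; sample size and tolerances are my choices) checks the mean and the distribution function at u = 1.

```python
import math
import random

random.seed(1)
n = 100_000
u = [-math.log(random.random()) for _ in range(n)]  # U = -log X, X ~ uniform(0,1)

mean_u = sum(u) / n                            # exponential(1) has mean 1
frac_below_1 = sum(ui <= 1.0 for ui in u) / n  # P(U <= 1) = 1 - e^{-1}, about 0.632
print(mean_u, frac_below_1)
```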
Figure 32: pdf f_U(u) = exp(u - e^u), -∞ < u < ∞.

U is said to have a Rayleigh distribution. [70]

(b) If Z ~ N(0,1), then E[Z²] = 1, so

    ∫_{z = -∞}^{∞} z² (1/√(2π)) e^{-z²/2} dz = 1 and ∫_{z = 0}^{∞} z² (1/√(2π)) e^{-z²/2} dz = 1/2.

Thus

    E[U] = ∫_0^{∞} u f_U(u) du = ∫_{u = 0}^{∞} u² e^{-u²/2} du = √(2π) ∫_{u = 0}^{∞} u² (1/√(2π)) e^{-u²/2} du = √(2π)/2 = √(π/2).

(c) The R code

    x=rexp(1000,0.5)
    u=sqrt(x)
    mean(u)

gave me 1.2603 while sqrt(pi/2) = 1.2533.

Q5. (a) Noting that X and Y are independent, their joint pdf is

    f_XY(x, y) = f_X(x) f_Y(y) = (1/√(2π)) e^{-x²/2} (1/√(2π)) e^{-y²/2} = (1/2π) e^{-(x² + y²)/2}, -∞ < x, y < ∞.

If u = x² + y², v = x/y, then

    ∂(u, v)/∂(x, y) = det [ ∂u/∂x  ∂u/∂y ; ∂v/∂x  ∂v/∂y ] = det [ 2x  2y ; 1/y  -x/y² ] = -2(x²/y² + 1).

The mapping is not 1-1 since (-x, -y) and (x, y) map to give the same value of (u, v). Hence consider the two cases {x < 0} and {x ≥ 0} separately. Since

    J = ∂(x, y)/∂(u, v) = 1/(∂(u, v)/∂(x, y)) = -1/(2(x²/y² + 1)) = -1/(2(v² + 1)),

then |J| = 1/(2(1 + v²)) and the joint pdf f_UV(u, v) satisfies

    f_UV(u, v) = f_XY(-x, -y) |J| + f_XY(x, y) |J| = 2 × (1/2π) e^{-(x² + y²)/2} × 1/(2(1 + v²)) = e^{-u/2}/(2π(1 + v²)).

70. Often used to model wind velocity and sea-wave heights. For example, if X and Y are independent orthogonal components of wind velocity having normal distributions, then the magnitude √(X² + Y²) has a Rayleigh distribution. Question 4 shows that √(X² + Y²) has a Rayleigh distribution and question 5 shows that X² + Y² ~ exponential(1/2).
The joint pdf is defined for u > 0, -∞ < v < ∞.

(b) Independence [71] of U and V follows as the joint pdf factorises to give f_UV(u, v) = f_U(u) f_V(v) where

    f_U(u) = (1/2)e^{-u/2}, u > 0,  f_V(v) = 1/(π(1 + v²)), -∞ < v < ∞.

Clearly these are recognisable [72] pdfs so that U ~ exponential(λ = 1/2) while V ~ Cauchy.

71. Contours of f_XY(x, y) are circles centred on the origin, as in a sketch of a point (x, y) at radius r = √(x² + y²) and angle θ = tan^{-1}(x/y). The question shows that U and V = tan θ (and hence R = √U and Θ) are independent.

72. If you do not recognise them as pdfs, then the pdfs of U and V could be obtained by integrating the joint pdf:

    f_U(u) = ∫_{-∞}^{∞} f_UV(u, v) dv = (1/2)e^{-u/2} ∫_{-∞}^{∞} dv/(π(1 + v²)) = (1/2)e^{-u/2} ∫_{v = 0}^{∞} 2 dv/(π(1 + v²)) = (1/2)e^{-u/2} (2/π)[tan^{-1} v]_0^{∞} = (1/2)e^{-u/2}, for u > 0.

    f_V(v) = ∫_0^{∞} f_UV(u, v) du = [-e^{-u/2}/(π(1 + v²))]_{u = 0}^{∞} = 1/(π(1 + v²)), for -∞ < v < ∞.
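Both conclusions can be corroborated by simulation: for independent N(0,1) pairs, U = X² + Y² should have the exponential(1/2) mean 1/λ = 2, and V = X/Y should have the standard Cauchy median 0 and upper quartile +1. A seeded Python sketch, illustrative only and not part of the notes (sample size and tolerances are my choices):

```python
import random
import statistics

random.seed(7)
n = 200_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
ys = [random.gauss(0.0, 1.0) for _ in range(n)]

u = [x * x + y * y for x, y in zip(xs, ys)]  # should be exponential(1/2)
v = [x / y for x, y in zip(xs, ys)]          # should be standard Cauchy

q = sorted(v)
print(statistics.fmean(u))   # exponential(1/2) has mean 2
print(statistics.median(v))  # Cauchy median is 0
print(q[3 * n // 4])         # upper quartile of a standard Cauchy is +1
```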
MATH2715: Statistical Methods Examples Class VI, week 7

The questions below will be looked at in Examples Class VI held in week 7.

Q1. Suppose that X and Y are independent Poisson random variables with means λ and µ respectively. Use moment generating functions to deduce the distribution of U = X + Y.

Q2. The random variables X_1 and X_2 are independent discrete random variables satisfying pr{X_i = 0} = pr{X_i = 1} = 1/2 for i = 1, 2. Obtain the moment generating function of X_i. Obtain the moment generating function of U = X_1 + X_2 and of V = X_1 - X_2. Deduce the probability functions of U and V.
More informationExercises for Part 1
MATH200 Complex Analysis. Exercises for Part Exercises for Part The following exercises are provided for you to revise complex numbers. Exercise. Write the following expressions in the form x + iy, x,y
More information18 Bivariate normal distribution I
8 Bivariate normal distribution I 8 Example Imagine firing arrows at a target Hopefully they will fall close to the target centre As we fire more arrows we find a high density near the centre and fewer
More information1.1 Review of Probability Theory
1.1 Review of Probability Theory Angela Peace Biomathemtics II MATH 5355 Spring 2017 Lecture notes follow: Allen, Linda JS. An introduction to stochastic processes with applications to biology. CRC Press,
More informationRandom Variables (Continuous Case)
Chapter 6 Random Variables (Continuous Case) Thus far, we have purposely limited our consideration to random variables whose ranges are countable, or discrete. The reason for that is that distributions
More informationBASICS OF PROBABILITY
October 10, 2018 BASICS OF PROBABILITY Randomness, sample space and probability Probability is concerned with random experiments. That is, an experiment, the outcome of which cannot be predicted with certainty,
More informationA generalized Alon-Boppana bound and weak Ramanujan graphs
A generalized Alon-Boppana bond and weak Ramanjan graphs Fan Chng Abstract A basic eigenvale bond de to Alon and Boppana holds only for reglar graphs. In this paper we give a generalized Alon-Boppana bond
More informationMoment Generating Function. STAT/MTHE 353: 5 Moment Generating Functions and Multivariate Normal Distribution
Moment Generating Function STAT/MTHE 353: 5 Moment Generating Functions and Multivariate Normal Distribution T. Linder Queen s University Winter 07 Definition Let X (X,...,X n ) T be a random vector and
More informationName of the Student: Problems on Discrete & Continuous R.Vs
Engineering Mathematics 05 SUBJECT NAME : Probability & Random Process SUBJECT CODE : MA6 MATERIAL NAME : University Questions MATERIAL CODE : JM08AM004 REGULATION : R008 UPDATED ON : Nov-Dec 04 (Scan
More informationconditional cdf, conditional pdf, total probability theorem?
6 Multiple Random Variables 6.0 INTRODUCTION scalar vs. random variable cdf, pdf transformation of a random variable conditional cdf, conditional pdf, total probability theorem expectation of a random
More informationProbability Background
Probability Background Namrata Vaswani, Iowa State University August 24, 2015 Probability recap 1: EE 322 notes Quick test of concepts: Given random variables X 1, X 2,... X n. Compute the PDF of the second
More informationSometimes can find power series expansion of M X and read off the moments of X from the coefficients of t k /k!.
Moment Generating Functions Defn: The moment generating function of a real valued X is M X (t) = E(e tx ) defined for those real t for which the expected value is finite. Defn: The moment generating function
More informationMath Spring Practice for the final Exam.
Math 4 - Spring 8 - Practice for the final Exam.. Let X, Y, Z be three independnet random variables uniformly distributed on [, ]. Let W := X + Y. Compute P(W t) for t. Honors: Compute the CDF function
More informationEEL 5544 Noise in Linear Systems Lecture 30. X (s) = E [ e sx] f X (x)e sx dx. Moments can be found from the Laplace transform as
L30-1 EEL 5544 Noise in Linear Systems Lecture 30 OTHER TRANSFORMS For a continuous, nonnegative RV X, the Laplace transform of X is X (s) = E [ e sx] = 0 f X (x)e sx dx. For a nonnegative RV, the Laplace
More informationChapter 4 Supervised learning:
Chapter 4 Spervised learning: Mltilayer Networks II Madaline Other Feedforward Networks Mltiple adalines of a sort as hidden nodes Weight change follows minimm distrbance principle Adaptive mlti-layer
More informationSOLUTIONS TO MATH68181 EXTREME VALUES AND FINANCIAL RISK EXAM
SOLUTIONS TO MATH68181 EXTREME VALUES AND FINANCIAL RISK EXAM Solutions to Question A1 a) The marginal cdfs of F X,Y (x, y) = [1 + exp( x) + exp( y) + (1 α) exp( x y)] 1 are F X (x) = F X,Y (x, ) = [1
More informationMath 151. Rumbos Spring Solutions to Review Problems for Exam 3
Math 151. Rumbos Spring 2014 1 Solutions to Review Problems for Exam 3 1. Suppose that a book with n pages contains on average λ misprints per page. What is the probability that there will be at least
More informationManual for SOA Exam MLC.
Chapter 10. Poisson processes. Section 10.5. Nonhomogenous Poisson processes Extract from: Arcones Fall 2009 Edition, available at http://www.actexmadriver.com/ 1/14 Nonhomogenous Poisson processes Definition
More informationECON3120/4120 Mathematics 2, spring 2009
University of Oslo Department of Economics Arne Strøm ECON3/4 Mathematics, spring 9 Problem soltions for Seminar 4, 6 Febrary 9 (For practical reasons some of the soltions may inclde problem parts that
More informationGeneralized Jinc functions and their application to focusing and diffraction of circular apertures
Qing Cao Vol. 20, No. 4/April 2003/J. Opt. Soc. Am. A 66 Generalized Jinc fnctions and their application to focsing and diffraction of circlar apertres Qing Cao Optische Nachrichtentechnik, FernUniversität
More informationThis exam contains 6 questions. The questions are of equal weight. Print your name at the top of this page in the upper right hand corner.
GROUND RULES: This exam contains 6 questions. The questions are of equal weight. Print your name at the top of this page in the upper right hand corner. This exam is closed book and closed notes. Show
More informationAPPM/MATH 4/5520 Solutions to Exam I Review Problems. f X 1,X 2. 2e x 1 x 2. = x 2
APPM/MATH 4/5520 Solutions to Exam I Review Problems. (a) f X (x ) f X,X 2 (x,x 2 )dx 2 x 2e x x 2 dx 2 2e 2x x was below x 2, but when marginalizing out x 2, we ran it over all values from 0 to and so
More informationProbability and Statistics Notes
Probability and Statistics Notes Chapter Five Jesse Crawford Department of Mathematics Tarleton State University Spring 2011 (Tarleton State University) Chapter Five Notes Spring 2011 1 / 37 Outline 1
More informationSystem Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models
System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models Fatih Cavdur fatihcavdur@uludag.edu.tr March 20, 2012 Introduction Introduction The world of the model-builder
More informationSTA 4321/5325 Solution to Homework 5 March 3, 2017
STA 4/55 Solution to Homework 5 March, 7. Suppose X is a RV with E(X and V (X 4. Find E(X +. By the formula, V (X E(X E (X E(X V (X + E (X. Therefore, in the current setting, E(X V (X + E (X 4 + 4 8. Therefore,
More informationChapter 4 Multiple Random Variables
Review for the previous lecture Theorems and Examples: How to obtain the pmf (pdf) of U = g ( X Y 1 ) and V = g ( X Y) Chapter 4 Multiple Random Variables Chapter 43 Bivariate Transformations Continuous
More information