UC Berkeley Department of Electrical Engineering and Computer Science. EE 126: Probability and Random Processes. Problem Set 8, Fall 2007


UC Berkeley Department of Electrical Engineering and Computer Science
EE 126: Probability and Random Processes
Problem Set 8, Fall 2007
Issued: Thursday, October 25, 2007. Due: Friday, November 2, 2007.
Reading: Bertsekas & Tsitsiklis.

Problem 8.1
Let $X_1, X_2, \ldots$ be independent and identically distributed random variables, where each $X_i$ is distributed according to the logarithmic PMF with parameter $p$, i.e.,

$$p_X(k) = \frac{(1-p)^k}{k \ln(1/p)}, \qquad k = 1, 2, 3, \ldots,$$

where $0 < p < 1$. The discrete random variable $N$ has the Poisson PMF with parameter $\lambda$, i.e.,

$$p_N(k) = e^{-\lambda} \frac{\lambda^k}{k!}, \qquad k = 0, 1, 2, \ldots,$$

where $\lambda > 0$.

(a) Determine the transform $M_X(s)$ associated with each random variable $X_i$. Hint: you may find the following identity useful: provided $|a| < 1$,

$$\ln(1+a) = a - \frac{a^2}{2} + \frac{a^3}{3} - \frac{a^4}{4} + \cdots$$

(b) Defining $Y = \sum_{i=1}^{N} X_i$, determine the transform $M_Y(s)$.

Solution:

1. The transform of $p_X(k) = \frac{(1-p)^k}{k \ln(1/p)}$ is

$$M_X(s) = \sum_{k=1}^{\infty} e^{sk} \frac{(1-p)^k}{k \ln(1/p)} = \frac{1}{\ln(1/p)} \sum_{k=1}^{\infty} \frac{\big(e^s(1-p)\big)^k}{k} = \frac{\ln\!\big(1 - e^s(1-p)\big)}{\ln p},$$

where we used $\ln(1-\alpha) = -\big(\alpha + \frac{\alpha^2}{2} + \cdots\big) = -\sum_{k=1}^{\infty} \frac{\alpha^k}{k}$ for $|\alpha| < 1$.

The transform of a Poisson random variable with PMF $p_N(k) = e^{-\lambda}\lambda^k/k!$ is

$$M_N(s) = E[e^{sN}] = e^{\lambda(e^s - 1)}.$$
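The closed form for $M_X(s)$ is easy to sanity-check by summing the defining series directly. A minimal Python sketch (the helper names and the values $p = 0.4$, $s = 0.1$ are illustrative, chosen so that $e^s(1-p) < 1$ and the series converges):

```python
import math

def mx_series(s, p, terms=1000):
    """M_X(s) by direct summation: sum_{k>=1} e^{sk} (1-p)^k / (k ln(1/p))."""
    c = 1.0 / math.log(1.0 / p)
    return c * sum((math.exp(s) * (1 - p)) ** k / k for k in range(1, terms + 1))

def mx_closed(s, p):
    """Closed form derived above: ln(1 - e^s (1-p)) / ln p."""
    return math.log(1.0 - math.exp(s) * (1 - p)) / math.log(p)

p, s = 0.4, 0.1  # e^s (1-p) ~ 0.66 < 1, so the series converges
print(mx_series(s, p), mx_closed(s, p))  # the two values should match closely
```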

2. Using the law of iterated expectations, the transform of $Y = \sum_{i=1}^{N} X_i$ is

$$M_Y(s) = E[e^{sY}] = E\big[E[e^{sY} \mid N]\big] = E\Big[E\big[e^{s\sum_{i=1}^{N} X_i} \mid N\big]\Big] = E\big[M_X(s)^N\big] = M_N(s)\Big|_{e^s = M_X(s)} = e^{\lambda(M_X(s) - 1)}.$$

Substituting the transform from part (a),

$$M_Y(s) = e^{\lambda\left(\frac{\ln(1 - e^s(1-p))}{\ln p} - 1\right)} = e^{\frac{\lambda}{\ln p}\left[\ln(1 - e^s(1-p)) - \ln p\right]} = \left[e^{\ln(1 - e^s(1-p))}\, e^{-\ln p}\right]^{\lambda/\ln p} = \left[\frac{1 - e^s(1-p)}{p}\right]^{\lambda/\ln p}.$$
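The compound transform can likewise be checked by Monte Carlo: simulate $Y = \sum_{i=1}^{N} X_i$ and compare the empirical $E[e^{sY}]$ against the closed form. A sketch (helper names and parameter values are illustrative; the logarithmic PMF is sampled by CDF inversion):

```python
import math, random

def sample_log(p):
    """Sample p_X(k) = (1-p)^k / (k ln(1/p)), k = 1, 2, ..., by inversion."""
    u, k, cdf = random.random(), 0, 0.0
    c = 1.0 / math.log(1.0 / p)
    while cdf < u:
        k += 1
        cdf += c * (1 - p) ** k / k
    return k

def sample_poisson(lam):
    """Knuth's multiplicative method (adequate for small means)."""
    L, k, prod = math.exp(-lam), 0, random.random()
    while prod > L:
        k += 1
        prod *= random.random()
    return k

lam, p, s, trials = 2.0, 0.4, 0.1, 100_000

def sample_Y():
    return sum(sample_log(p) for _ in range(sample_poisson(lam)))

mc = sum(math.exp(s * sample_Y()) for _ in range(trials)) / trials
closed = ((1 - math.exp(s) * (1 - p)) / p) ** (lam / math.log(p))
print(mc, closed)  # should agree to Monte Carlo accuracy
```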

Problem 8.2
$N$ male-female couples at a dinner party play the following game. Each member of each couple writes his/her name on a piece of paper (so there are a total of $2N$ pieces of paper). The men throw their papers into hat A, while the women throw theirs into hat B. Once all the papers are in, the host draws a piece of paper from each hat, and the two people chosen are dance partners for the rest of the night. The host continues in this fashion until all $2N$ people are paired up for the night. Let $M$ be the number of couples that are reunited by this game. Find $E[M]$ and $\mathrm{var}(M)$. Hint: set up indicator variables and use covariance.

Solution: We will use indicator variables to solve this problem. Define variables $\{X_i\}$ such that $X_i = 1$ if couple $i$ is reunited, and $X_i = 0$ otherwise. Clearly $M = \sum_{i=1}^{N} X_i$. We know that

$$E[M] = \sum_{i=1}^{N} E[X_i] = N E[X_i].$$

Now we easily see that $E[X_i] = 1/N$, and therefore $E[M] = 1$.

Finding the variance is a bit more tricky, since the $X_i$ are not independent. We have

$$\mathrm{var}(M) = \sum_{i=1}^{N} \mathrm{var}(X_i) + 2\sum_{i<j} \mathrm{cov}(X_i, X_j).$$

Now, $\mathrm{var}(X_i) = E[X_i^2] - E[X_i]^2 = E[X_i] - E[X_i]^2 = \frac{1}{N} - \frac{1}{N^2} = \frac{N-1}{N^2}$. Furthermore,

$$\mathrm{cov}(X_i, X_j) = E[X_i X_j] - E[X_i]E[X_j] = \frac{1}{N(N-1)} - \frac{1}{N^2} = \frac{1}{N^2(N-1)}.$$

Thus we find

$$\mathrm{var}(M) = N\,\frac{N-1}{N^2} + 2\binom{N}{2}\frac{1}{N^2(N-1)} = \frac{N-1}{N} + \frac{N(N-1)}{N^2(N-1)} = \frac{N-1}{N} + \frac{1}{N} = 1.$$
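A quick simulation of the game confirms that both the mean and the variance of $M$ equal 1 for any $N$; a sketch with an illustrative $N = 10$:

```python
import random

def reunited_couples(N):
    """One play of the game: a uniformly random pairing of men to women.
    Couple i is reunited when man i is paired with woman i."""
    men = list(range(N))
    random.shuffle(men)  # men[k] dances with woman k
    return sum(1 for k, m in enumerate(men) if m == k)

N, trials = 10, 100_000
samples = [reunited_couples(N) for _ in range(trials)]
mean = sum(samples) / trials
var = sum((m - mean) ** 2 for m in samples) / trials
print(mean, var)  # both should be close to 1, independent of N
```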

Problem 8.3
Suppose that the random variables $X_1, X_2$ have joint density function

$$f_{X_1,X_2}(x_1, x_2) = \gamma \exp\left[-\frac{1}{2}\big(x_1^2 - 6x_1x_2 + 10x_2^2\big)\right], \qquad -\infty < x_1, x_2 < \infty.$$

(a) Find $\mu_{x_1}$, $\mu_{x_2}$, $\mathrm{cov}(x_1, x_2)$ and the correlation coefficient $\rho$.
(b) Find $E[x_1 \mid x_2 = 3]$.
(c) For $y_1 = ax_1 + bx_2$ and $y_2 = cx_2$, find $a$, $b$, $c$ such that $\rho_{y_1 y_2} = 0$ and $\sigma_{y_1} = \sigma_{y_2} = 1$.

Solution:

1. To find the means, the covariance, and the correlation coefficient, we need to compare the given PDF to the generic bivariate normal PDF

$$\frac{1}{2\pi\sigma_{x_1}\sigma_{x_2}\sqrt{1-\rho^2}} \exp\left\{-\frac{1}{2(1-\rho^2)}\left[\frac{(x_1-\mu_{x_1})^2}{\sigma_{x_1}^2} - \frac{2\rho(x_1-\mu_{x_1})(x_2-\mu_{x_2})}{\sigma_{x_1}\sigma_{x_2}} + \frac{(x_2-\mu_{x_2})^2}{\sigma_{x_2}^2}\right]\right\},$$

and we find $\mu_{x_1} = \mu_{x_2} = 0$. (For more background on the bivariate normal random variable, please refer to Chapter 4 of the course notes.) Therefore

$$\frac{1}{1-\rho^2}\left[\frac{x_1^2}{\sigma_{x_1}^2} - \frac{2\rho\, x_1 x_2}{\sigma_{x_1}\sigma_{x_2}} + \frac{x_2^2}{\sigma_{x_2}^2}\right] = x_1^2 - 6x_1x_2 + 10x_2^2,$$

i.e.,

$$\frac{1}{(1-\rho^2)\sigma_{x_1}^2} = 1, \qquad \frac{2\rho}{(1-\rho^2)\sigma_{x_1}\sigma_{x_2}} = 6, \qquad \frac{1}{(1-\rho^2)\sigma_{x_2}^2} = 10,$$

which gives $\sigma_{x_1}^2 = 10$, $\sigma_{x_2}^2 = 1$, $\rho = \frac{3}{\sqrt{10}}$, and $\mathrm{cov}(x_1, x_2) = \sigma_{x_1}\sigma_{x_2}\rho = 3$.

2. To find the conditional mean of $x_1$ given $x_2 = 3$ we use the formula

$$E[x_1 \mid x_2] = \mu_{x_1} + \frac{\rho\,\sigma_{x_1}}{\sigma_{x_2}}(x_2 - \mu_{x_2}),$$

and we find $E[x_1 \mid x_2 = 3] = 3 \cdot 3 = 9$.

3. We want to find $a$, $b$, $c$ such that $\rho_{y_1y_2} = 0$ and $\sigma_{y_1} = \sigma_{y_2} = 1$. We can see that $E[y_1] = E[y_2] = 0$. Thus $\sigma_{y_2}^2 = E[y_2^2] = c^2$, and for this to equal 1 we need $c = \pm 1$. We also have

$$\sigma_{y_1}^2 = E[y_1^2] = E[a^2x_1^2 + 2abx_1x_2 + b^2x_2^2] = 10a^2 + 6ab + b^2 = 1,$$

and

$$\mathrm{cov}(y_1, y_2) = c\big(a\,E[x_1x_2] + b\,E[x_2^2]\big) = c(3a + b) = 0 \;\Rightarrow\; b = -3a,$$

since $c \neq 0$. Substituting into the variance constraint,

$$10a^2 - 18a^2 + 9a^2 = a^2 = 1 \;\Rightarrow\; a = \pm 1.$$

So the solutions are

$$(a, b, c) = (1, -3, 1),\; (1, -3, -1),\; (-1, 3, 1),\; (-1, 3, -1).$$
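These parameters can be checked numerically: the inverse of the covariance matrix found in part 1 should reproduce the coefficients of the quadratic form in the exponent, and conditioning samples on $x_2 \approx 3$ should give a conditional mean near 9. A sketch (Python with NumPy assumed available):

```python
import numpy as np

# Covariance matrix found in part 1: var(x1) = 10, var(x2) = 1, cov = 3.
Sigma = np.array([[10.0, 3.0],
                  [3.0, 1.0]])

# Its inverse should reproduce the quadratic-form coefficients of
# x1^2 - 6 x1 x2 + 10 x2^2, i.e. [[1, -3], [-3, 10]].
print(np.linalg.inv(Sigma))

# Empirical check of E[x1 | x2 = 3] = 9: sample, then condition on x2 near 3.
rng = np.random.default_rng(0)
x = rng.multivariate_normal([0.0, 0.0], Sigma, size=2_000_000)
near = np.abs(x[:, 1] - 3.0) < 0.05
print(x[near, 0].mean())  # should be close to 9
```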

Problem 8.4
The receiver in an optical communications system uses a photodetector that counts the number of photons that arrive during the communication session. (The duration of the communication session is 1 time unit.) The sender conveys information by either transmitting or not transmitting photons to the photodetector. The probability of transmitting is $p$. If she transmits, the number of photons $X$ that she transmits during the session has a Poisson PMF with mean $\lambda$ per time unit. If she does not transmit, she generates no photons. Unfortunately, regardless of whether or not she transmits, there may still be photons arriving at the photodetector because of a phenomenon called shot noise. The number $N$ of photons that arrive because of shot noise has a Poisson PMF with mean $\mu$; $N$ and $X$ are independent. The total number of photons counted by the photodetector is equal to the sum of the transmitted photons and the photons generated by the shot noise effect.

(a) What is the probability that the sender transmitted, given that the photodetector counted $k$ photons?
(b) Before you know anything else about a particular communication session, what is your least squares estimate of the number of photons transmitted?
(c) What is the least squares estimate of the number of photons transmitted by the sender if the photodetector counted $k$ photons?
(d) What is the best linear predictor for the number of photons transmitted by the sender as a function of $k$, the number of detected photons?

Solution:

1. Let $A$ be the event that the sender transmitted, and let $K$ be the number of photons counted by the photodetector. Using Bayes' rule,

$$P(A \mid K = k) = \frac{p_{K|A}(k)\,P(A)}{p_K(k)} = \frac{p_{X+N}(k)\, p}{p_N(k)(1-p) + p_{X+N}(k)\, p}.$$

The discrete random variables $X$ and $N$ have the PMFs

$$p_X(x) = \frac{\lambda^x e^{-\lambda}}{x!}, \quad x \ge 0, \qquad\qquad p_N(n) = \frac{\mu^n e^{-\mu}}{n!}, \quad n \ge 0.$$

The sum of two independent Poisson random variables is also Poisson, with mean equal to the sum of the means of each of the random variables. This fact can be derived by looking at the product of the transforms of $X$ and $N$. Therefore

$$p_{X+N}(k) = \frac{(\lambda + \mu)^k e^{-(\lambda+\mu)}}{k!}, \qquad k \ge 0.$$

Thus

$$P(A \mid K = k) = \frac{p\,\frac{(\lambda+\mu)^k e^{-(\lambda+\mu)}}{k!}}{p\,\frac{(\lambda+\mu)^k e^{-(\lambda+\mu)}}{k!} + (1-p)\,\frac{\mu^k e^{-\mu}}{k!}} = \frac{1}{1 + \frac{1-p}{p}\left(\frac{\mu}{\lambda+\mu}\right)^k e^{\lambda}}.$$

2. Let $S$ be the number of photons transmitted by the sender. Then $S = X$ with probability $p$, and $S = 0$ with probability $1-p$. The least squares estimate of the number of photons transmitted by the sender is simply the mean, in the absence of any additional information:

$$\hat{S}_1 = E[S] = p\lambda + (1-p)\cdot 0 = p\lambda.$$
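The posterior formula can be verified by simulating many sessions and tabulating the fraction that transmitted for each observed count $k$; a sketch with illustrative parameters $p = 0.3$, $\lambda = 5$, $\mu = 2$ (helper names are mine):

```python
import math, random

def posterior(k, p, lam, mu):
    """Closed form for P(A | K = k) derived above."""
    return 1.0 / (1.0 + (1 - p) / p * (mu / (lam + mu)) ** k * math.exp(lam))

def poisson(lam):
    """Knuth's multiplicative method (adequate for small means)."""
    L, k, prod = math.exp(-lam), 0, random.random()
    while prod > L:
        k += 1
        prod *= random.random()
    return k

p, lam, mu, trials = 0.3, 5.0, 2.0, 500_000
counts = {}  # k -> (sessions with K = k, how many of them transmitted)
for _ in range(trials):
    sent = random.random() < p
    k = (poisson(lam) if sent else 0) + poisson(mu)
    tot, hit = counts.get(k, (0, 0))
    counts[k] = (tot + 1, hit + sent)

for k in sorted(counts)[:8]:
    tot, hit = counts[k]
    print(k, hit / tot, posterior(k, p, lam, mu))  # empirical vs. closed form
```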

3. The least squares estimator has the form $\hat{S}_2(k) = E[S \mid K = k]$. Using Bayes' rule,

$$\hat{S}_2(k) = \sum_{s=0}^{k} s\, p_{S|K}(s \mid k) = \sum_{s=0}^{k} s\,\frac{p_{K|S}(k \mid s)\, p_S(s)}{p_K(k)} = \frac{1}{p_K(k)} \sum_{s=0}^{k} s\, p_{K|S}(k \mid s)\, p_S(s).$$

From the definitions of $S$ and $K$, the following are true:

$$p_S(s) = \begin{cases} (1-p) + p e^{-\lambda}, & s = 0 \\ p\,\frac{\lambda^s e^{-\lambda}}{s!}, & s = 1, 2, \ldots \end{cases}$$

$$p_{K|S}(k \mid s) = p_N(k-s) = \frac{\mu^{k-s} e^{-\mu}}{(k-s)!},$$

$$p_K(k) = p\,\frac{(\lambda+\mu)^k e^{-(\lambda+\mu)}}{k!} + (1-p)\,\frac{\mu^k e^{-\mu}}{k!}.$$

In order to obtain the last expression, we observe that $K = X + N$ with probability $p$, and $K = N$ with probability $1-p$. Substituting into the formula above,

$$\hat{S}_2(k) = \frac{1}{p_K(k)}\left[0\cdot\big((1-p) + pe^{-\lambda}\big)\frac{\mu^{k} e^{-\mu}}{k!} + \sum_{s=1}^{k} s\, p\,\frac{\lambda^s e^{-\lambda}}{s!}\,\frac{\mu^{k-s} e^{-\mu}}{(k-s)!}\right]$$

$$= \frac{1}{p_K(k)}\, p\, e^{-(\lambda+\mu)} \sum_{s=0}^{k} s\,\frac{\lambda^s \mu^{k-s}}{s!\,(k-s)!} = \frac{1}{p_K(k)}\, p\, e^{-(\lambda+\mu)}\,\frac{(\lambda+\mu)^k}{k!} \sum_{s=0}^{k} s\binom{k}{s}\left(\frac{\lambda}{\lambda+\mu}\right)^{s}\left(\frac{\mu}{\lambda+\mu}\right)^{k-s}$$

$$= \frac{1}{p_K(k)}\, p\, e^{-(\lambda+\mu)}\,\frac{(\lambda+\mu)^k}{k!}\cdot\frac{k\lambda}{\lambda+\mu},$$

where the last sum is the mean of a binomial $\big(k, \frac{\lambda}{\lambda+\mu}\big)$ distribution. Thus

$$\hat{S}_2(k) = \frac{k\lambda}{\lambda+\mu}\cdot\frac{1}{1 + \frac{1-p}{p}\left(\frac{\mu}{\lambda+\mu}\right)^k e^{\lambda}}.$$

Note that as $k$ increases, the estimator can be approximated by $\hat{S}_2(k) \approx \frac{k\lambda}{\lambda+\mu}$.

4. The linear least squares predictor has the form

$$\hat{S}_3(k) = E[S] + \frac{\mathrm{cov}(S, K)}{\sigma_K^2}\,(k - E[K]). \qquad (1)$$

Note that since $X$ and $N$ are independent, $S$ and $N$ are also independent. We have

$$E[S] = p\lambda, \qquad E[S^2] = p\,E[X^2] + (1-p)\cdot 0 = p(\lambda^2 + \lambda),$$

$$\sigma_S^2 = E[S^2] - (E[S])^2 = p(\lambda^2 + \lambda) - (p\lambda)^2 = p(1-p)\lambda^2 + p\lambda,$$

$$E[K] = E[S] + E[N] = p\lambda + \mu, \qquad \sigma_K^2 = \sigma_S^2 + \sigma_N^2 = p(1-p)\lambda^2 + p\lambda + \mu.$$

Finally, we need to find $\mathrm{cov}(S, K)$:

$$\mathrm{cov}(S, K) = E[(S - E[S])(K - E[K])] = E[(S - E[S])(S - E[S] + N - E[N])]$$
$$= \sigma_S^2 + E[(S - E[S])(N - E[N])] = \sigma_S^2 = p(1-p)\lambda^2 + p\lambda,$$

where we have used the fact that $(S - E[S])$ and $(N - E[N])$ are independent, and $E[S - E[S]] = 0 = E[N - E[N]]$.

Therefore, substituting all the quantities into equation (1), we get the linear predictor

$$\hat{S}_3(k) = p\lambda + \frac{p(1-p)\lambda^2 + p\lambda}{p(1-p)\lambda^2 + p\lambda + \mu}\,(k - p\lambda - \mu).$$
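It is instructive to compare the three estimators empirically: by the theory, the conditional mean $\hat{S}_2$ should attain the smallest mean squared error, with the linear predictor $\hat{S}_3$ between $\hat{S}_2$ and the constant $\hat{S}_1$. A sketch (same illustrative parameters as in the previous check):

```python
import math, random

p, lam, mu, trials = 0.3, 5.0, 2.0, 200_000

def poisson(l):
    L, k, prod = math.exp(-l), 0, random.random()
    while prod > L:
        k += 1
        prod *= random.random()
    return k

def s_hat2(k):  # conditional-mean estimator from part 3
    return (k * lam / (lam + mu)) / (1 + (1 - p) / p * (mu / (lam + mu)) ** k * math.exp(lam))

var_s = p * (1 - p) * lam ** 2 + p * lam
def s_hat3(k):  # linear predictor from part 4 (sigma_K^2 = var_s + mu)
    return p * lam + var_s / (var_s + mu) * (k - p * lam - mu)

sq_errs = [0.0, 0.0, 0.0]
for _ in range(trials):
    s = poisson(lam) if random.random() < p else 0
    k = s + poisson(mu)
    for i, est in enumerate((p * lam, s_hat2(k), s_hat3(k))):
        sq_errs[i] += (s - est) ** 2
print([e / trials for e in sq_errs])  # expect MSE(S2) <= MSE(S3) <= MSE(S1)
```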

Problem 8.5
Let $X$ and $Y$ be two random variables with positive variances, associated with the same experiment.

(a) Show that $\rho$, the correlation coefficient of $X$ and $Y$, equals $-1$ if and only if there exist a constant $b$ and a negative constant $a$ such that $Y = aX + b$.
(b) Let $\hat{X}_L$ be the linear least mean squares estimator of $X$ based on $Y$. Show that $E[(X - \hat{X}_L)Y] = 0$. Use this property to show that the correlation of the estimation error $X - \hat{X}_L$ with $Y$ is zero.
(c) Let $\hat{X} = E[X \mid Y]$ be the least mean squares estimator of $X$ given $Y$. Show that $E[(X - \hat{X})h(Y)] = 0$ for any function $h$.
(d) Is it true that the estimation error $X - E[X \mid Y]$ is independent of $Y$?

Solution:

(a) Define the normalized variables $U = \frac{X - E[X]}{\sigma_X}$ and $V = \frac{Y - E[Y]}{\sigma_Y}$, so that $\rho = E[UV]$.

First suppose there exist a constant $b$ and a negative constant $a$ such that $Y = aX + b$. Then $E[Y] = aE[X] + b$ and $\mathrm{var}(Y) = a^2\,\mathrm{var}(X)$, so $\sigma_Y = -a\sigma_X$ (since $a < 0$). We obtain

$$\rho = E\left[\frac{X - E[X]}{\sigma_X}\cdot\frac{aX + b - aE[X] - b}{-a\sigma_X}\right] = E\left[\frac{X - E[X]}{\sigma_X}\cdot\left(-\frac{X - E[X]}{\sigma_X}\right)\right] = -E[U^2] = -1.$$

Now we do the proof in the other direction. Suppose that $\rho = E[UV] = -1$. First we will show that $E[(U - E[UV]V)^2] = 0$. We observe that $E[U] = E[V] = 0$ and $E[U^2] = E[V^2] = 1$. Then

$$E[(U - E[UV]V)^2] = E[U^2] - 2(E[UV])^2 + (E[UV])^2 E[V^2] = 1 - (E[UV])^2 = 0.$$

However, by the definition of expectation, we have

$$E[(U - E[UV]V)^2] = \iint (u - E[UV]v)^2 f_{U,V}(u, v)\, du\, dv.$$

The integrand is $\ge 0$ because it is a squared term, so it must be zero in order for the integral to be zero. Therefore $U = E[UV]V = -V$. Thus

$$\frac{X - E[X]}{\sigma_X} = -\frac{Y - E[Y]}{\sigma_Y},$$

which means $Y = aX + b$, where $a = -\frac{\sigma_Y}{\sigma_X} < 0$ and $b = E[Y] + \frac{\sigma_Y}{\sigma_X}E[X]$. This completes the proof.

(b) $\hat{X}_L = E[X] + \frac{\mathrm{cov}(X,Y)}{\sigma_Y^2}(Y - E[Y])$, so we have

$$E[(X - \hat{X}_L)Y] = E\left[XY - \left(E[X] + \frac{\mathrm{cov}(X,Y)}{\sigma_Y^2}(Y - E[Y])\right)Y\right]$$
$$= E[XY] - E[X]E[Y] - \frac{\mathrm{cov}(X,Y)}{\sigma_Y^2}\big(E[Y^2] - (E[Y])^2\big) = \mathrm{cov}(X,Y) - \mathrm{cov}(X,Y) = 0.$$

Now, let us examine the correlation of the estimation error $X - \hat{X}_L$ with $Y$:

$$\rho = \frac{\mathrm{cov}(X - \hat{X}_L, Y)}{\sigma_{X - \hat{X}_L}\,\sigma_Y},$$

so we must show that $\mathrm{cov}(X - \hat{X}_L, Y) = 0$. Now

$$\mathrm{cov}(X - \hat{X}_L, Y) = E[(X - \hat{X}_L)Y] - E[X - \hat{X}_L]E[Y] = -E[X - \hat{X}_L]E[Y],$$

and

$$E[X - \hat{X}_L] = E\left[X - E[X] - \frac{\mathrm{cov}(X,Y)}{\sigma_Y^2}(Y - E[Y])\right] = E[X] - E[X] - \frac{\mathrm{cov}(X,Y)}{\sigma_Y^2}\,E[Y - E[Y]] = 0.$$

Hence, we have proven the desired property.
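Since the orthogonality property in part (b) holds for any joint distribution, a numeric spot-check with an arbitrary (here nonlinear) dependence is straightforward; a sketch using NumPy, with the dependence chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
y = rng.standard_normal(n)
x = y ** 2 + 0.5 * y + rng.standard_normal(n)  # an arbitrary dependent pair

# Linear least mean squares estimator of X based on Y.
x_hat = x.mean() + np.cov(x, y)[0, 1] / y.var() * (y - y.mean())

err = x - x_hat
print(np.mean(err * y))           # ~ 0: E[(X - X_hat_L) Y] = 0
print(np.corrcoef(err, y)[0, 1])  # ~ 0: error uncorrelated with Y
```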

(c) $E[(X - \hat{X})h(Y)] = E[(X - E[X \mid Y])h(Y)]$, by definition of $\hat{X}$. Now applying the linearity of expectation, we get

$$E[(X - E[X \mid Y])h(Y)] = E[Xh(Y)] - E\big[E[X \mid Y]\,h(Y)\big] = E[Xh(Y)] - E\big[E[Xh(Y) \mid Y]\big] = E[Xh(Y)] - E[Xh(Y)] = 0,$$

where the first equality is obtained by noticing that $E[X \mid Y]$ is taken with respect to $X$, thus allowing $h(Y)$ to be pulled into the conditional expectation. The second equality results from the law of iterated expectations.

(d) No. Consider the following counterexample. Let $X$ and $Y$ be discrete random variables with the joint PMF

$$p_{X,Y}(x, y) = \begin{cases} \frac{1}{4}, & (x, y) \in \{(1, 0), (0, 1), (-1, 0), (0, -1)\} \\ 0, & \text{otherwise.} \end{cases}$$

Notice that $E[X \mid Y = y] = 0$ for all feasible $y$. More precisely, $E[X \mid Y]$ is a random variable that equals zero with probability one. So we have $X - E[X \mid Y] = X$ (with probability one). $X$ and $Y$ are not independent (for instance, $P(X = 1, Y = 1) = 0 \neq P(X = 1)P(Y = 1)$), so $X - E[X \mid Y]$ is not independent of $Y$.
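The counterexample is small enough to verify exhaustively; a short sketch:

```python
from itertools import product

pmf = {(1, 0): 0.25, (0, 1): 0.25, (-1, 0): 0.25, (0, -1): 0.25}
xs = sorted({x for x, _ in pmf})
ys = sorted({y for _, y in pmf})

# E[X | Y = y] = 0 for every feasible y, so the error X - E[X|Y] equals X.
for y in ys:
    py = sum(pmf.get((x, y), 0.0) for x in xs)
    if py > 0:
        print(y, sum(x * pmf.get((x, y), 0.0) for x in xs) / py)  # prints 0

# The error (= X) is not independent of Y: the joint differs from the product.
for x, y in product(xs, ys):
    px = sum(pmf.get((x, v), 0.0) for v in ys)
    py = sum(pmf.get((u, y), 0.0) for u in xs)
    if abs(pmf.get((x, y), 0.0) - px * py) > 1e-12:
        print("dependence at", (x, y))
        break
```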

Problem 8.6
Let random variables $X$ and $Y$ have the bivariate normal PDF

$$f_{X,Y}(x, y) = \frac{1}{2\pi\sqrt{1-\rho^2}}\exp\left\{-\frac{x^2 - 2\rho xy + y^2}{2(1-\rho^2)}\right\}, \qquad -\infty < x, y < \infty,$$

where $\rho$ denotes the correlation coefficient between $X$ and $Y$.

(a) Determine the numerical values of $E[X]$, $\mathrm{var}(X)$, $E[Y]$ and $\mathrm{var}(Y)$.
(b) Show that $X$ and $Z = (Y - \rho X)/\sqrt{1-\rho^2}$ are independent normal random variables, and determine the numerical values of $E[Z]$ and $\mathrm{var}(Z)$.
(c) Deduce that

$$P\big(\{X > 0\} \cap \{Y > 0\}\big) = \frac{1}{4} + \frac{1}{2\pi}\sin^{-1}\rho.$$

Solution:

1. Two random variables $X$ and $Y$ are said to be jointly normal if they can be expressed in the form

$$X = aU + bV, \qquad Y = cU + dV,$$

where $U$ and $V$ are independent normal random variables (Section 4.7 in the text). The bivariate normal PDF of $X$ and $Y$ has the form

$$f_{X,Y}(x, y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}}\exp\left\{-\frac{1}{2(1-\rho^2)}\left(\frac{(x - E[X])^2}{\sigma_X^2} - \frac{2\rho(x - E[X])(y - E[Y])}{\sigma_X\sigma_Y} + \frac{(y - E[Y])^2}{\sigma_Y^2}\right)\right\},$$

for $-\infty < x, y < \infty$, where $E[X]$, $E[Y]$, $\sigma_X^2$, $\sigma_Y^2$ are the means and variances of $X$ and $Y$. By inspection, $E[X] = E[Y] = 0$ and $\mathrm{var}(X) = \mathrm{var}(Y) = 1$ for the given PDF.

2. From part (a), $X$ is normal since it is a linear combination of two independent normal random variables. Since $Z$ is a linear combination of $X$ and $Y$, it can also be expressed as a linear combination of the independent random variables $U$ and $V$. Therefore, $Z$ is also normal. To show that $X$ and $Z$ are independent, we need to show that $\mathrm{cov}(X, Z) = 0$. (Independence implies zero covariance, but the converse is not true except for jointly normal random variables.)

$$E[Z] = \frac{E[Y] - \rho E[X]}{\sqrt{1-\rho^2}} = 0,$$

$$\sigma_Z^2 = \mathrm{var}\!\left(\frac{Y - \rho X}{\sqrt{1-\rho^2}}\right) = \frac{\sigma_Y^2 - 2\rho\,\mathrm{cov}(X, Y) + \rho^2\sigma_X^2}{1-\rho^2} = \frac{1 - 2\rho^2 + \rho^2}{1-\rho^2} = \frac{1-\rho^2}{1-\rho^2} = 1.$$

Solving for the covariance of $X$ and $Z$, we have

$$\mathrm{cov}(X, Z) = E[(X - E[X])(Z - E[Z])] = E[XZ] = E\left[X\,\frac{Y - \rho X}{\sqrt{1-\rho^2}}\right] = \frac{E[XY] - \rho E[X^2]}{\sqrt{1-\rho^2}} = \frac{\rho - \rho}{\sqrt{1-\rho^2}} = 0.$$

Therefore, $X$ and $Z$ are independent.

3. Since $Z = \frac{Y - \rho X}{\sqrt{1-\rho^2}}$, if $Y > 0$ then $Z > \frac{-\rho X}{\sqrt{1-\rho^2}}$. Therefore we have

$$P(X > 0, Y > 0) = P\left(X > 0,\; Z > \frac{-\rho X}{\sqrt{1-\rho^2}}\right) = \int_{x > 0}\int_{z > -\rho x/\sqrt{1-\rho^2}} \frac{1}{2\pi}\exp\left(-\frac{x^2 + z^2}{2}\right) dz\, dx.$$

Changing to polar coordinates yields

$$P(X > 0, Y > 0) = \int_{\theta = -\alpha}^{\pi/2}\int_{r=0}^{\infty} \frac{1}{2\pi}\, r\exp\left(-\frac{r^2}{2}\right) dr\, d\theta = \frac{1}{2\pi}\left(\frac{\pi}{2} + \alpha\right) = \frac{1}{4} + \frac{1}{2\pi}\sin^{-1}\rho,$$

where $\alpha = \tan^{-1}\!\left(\frac{\rho}{\sqrt{1-\rho^2}}\right) = \sin^{-1}\rho$.
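The orthant probability formula is easy to confirm by simulation, building $(X, Y)$ from independent normals exactly as in the jointly-normal representation of part 1; a sketch with an illustrative $\rho = 0.6$:

```python
import numpy as np

rho, n = 0.6, 2_000_000  # illustrative correlation
rng = np.random.default_rng(2)

# Jointly normal construction: X = U, Y = rho*U + sqrt(1 - rho^2)*V,
# with U, V independent standard normals.
u, v = rng.standard_normal(n), rng.standard_normal(n)
x, y = u, rho * u + np.sqrt(1 - rho ** 2) * v

print(np.mean((x > 0) & (y > 0)))           # empirical P(X > 0, Y > 0)
print(0.25 + np.arcsin(rho) / (2 * np.pi))  # 1/4 + (1/(2*pi)) arcsin(rho)
```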
