Solutions to Homework Set #3 (Prepared by Yu Xiang)


1. Time until the n-th arrival. Let the random variable N(t) be the number of packets arriving during time (0,t]. Suppose N(t) is Poisson with pmf

    p_N(n) = (λt)^n e^{−λt} / n!   for n = 0, 1, 2, ....

Let the random variable Y be the time to get the n-th packet. Find the pdf of Y.

Solution: To find the pdf f_Y(t) of the random variable Y, note that the event {Y ≤ t} occurs iff the time of the n-th packet is in [0,t], that is, iff the number N(t) of packets arriving in [0,t] is at least n. Alternatively, {Y > t} occurs iff {N(t) < n}. Hence, the cdf F_Y(t) of Y is given by

    F_Y(t) = P{Y ≤ t} = P{N(t) ≥ n} = Σ_{k=n}^∞ e^{−λt} (λt)^k / k!.

Differentiating F_Y(t) with respect to t, we get the pdf f_Y(t) as

    f_Y(t) = Σ_{k=n}^∞ [ λ e^{−λt} (λt)^{k−1}/(k−1)! − λ e^{−λt} (λt)^k/k! ]
           = λ e^{−λt} (λt)^{n−1}/(n−1)!   for t > 0,

since the sum telescopes, leaving only its first term.

Or we can use another way. Since we know that the time interval T between packet arrivals is an exponential random variable with pdf

    f_T(t) = λ e^{−λt},  t ≥ 0 (and 0 otherwise),

let T_1, T_2, ... denote the i.i.d. exponential interarrival times; then Y = T_1 + T_2 + ··· + T_n. By convolving f_T(t) with itself n − 1 times, which can also be computed via its Fourier transform (characteristic function), we can show that the pdf of Y is given by

    f_Y(t) = λ e^{−λt} (λt)^{n−1}/(n−1)!,  t ≥ 0 (and 0 otherwise).
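
As a numerical sanity check (not part of the original solution), the Erlang answer can be compared against simulated sums of exponentials. This is a minimal Python sketch; the parameter values n = 3, λ = 2 and the test point t = 1.5 are arbitrary illustrative choices.

    import numpy as np
    from math import exp, factorial

    rng = np.random.default_rng(0)
    n, lam = 3, 2.0                          # illustrative parameter choices
    # Y is the sum of n i.i.d. Exp(lam) interarrival times.
    y = rng.exponential(scale=1/lam, size=(200_000, n)).sum(axis=1)

    # Derived cdf: F_Y(t) = 1 - sum_{k=0}^{n-1} e^{-lam t} (lam t)^k / k!
    t = 1.5
    F_exact = 1 - sum(exp(-lam*t) * (lam*t)**k / factorial(k) for k in range(n))
    print((y <= t).mean(), F_exact)          # empirical vs. exact, should agree closely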

2. Diamond distribution. Consider the random variables X and Y with the joint pdf

    f_{X,Y}(x,y) = c if |x| + |y| ≤ 1/√2, and 0 otherwise,

where c is a constant.

(a) Find c.
(b) Find f_X(x) and f_{X|Y}(x|y).
(c) Are X and Y independent random variables? Justify your answer.

Solution:

(a) The integral of the pdf f_{X,Y}(x,y) over −∞ < x < ∞, −∞ < y < ∞ is c times the area of the diamond {|x| + |y| ≤ 1/√2}, which is 2(1/√2)² = 1. Therefore, by the definition of a joint density, c = 1.

(b) The marginal pdf is obtained by integrating the joint pdf with respect to y. For 0 ≤ x ≤ 1/√2,

    f_X(x) = ∫_{−(1/√2 − x)}^{1/√2 − x} c dy = 2(1/√2 − x) = √2 − 2x,

and for −1/√2 ≤ x ≤ 0,

    f_X(x) = ∫_{−(1/√2 + x)}^{1/√2 + x} c dy = √2 + 2x.

So the marginal pdf may be written as

    f_X(x) = √2 − 2|x| for |x| ≤ 1/√2, and 0 otherwise.

Now since f_{X,Y}(x,y) is symmetrical, f_Y(y) = f_X(y). Thus

    f_{X|Y}(x|y) = f_{X,Y}(x,y)/f_Y(y) = 1/(√2 − 2|y|) for |x| ≤ 1/√2 − |y|, and 0 otherwise.

(c) X and Y are not independent since

    f_{X,Y}(x,y) ≠ f_X(x) f_Y(y).

Alternatively, X and Y are not independent since f_{X|Y}(x|y) depends on the value of y.
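
A short simulation sketch (my own addition, with illustrative sample size and bin width) that checks c = 1 and the marginal f_X(x) = √2 − 2|x| by rejection sampling:

    import numpy as np

    rng = np.random.default_rng(1)
    d = 1/np.sqrt(2)                              # half-diagonal of the diamond
    # Rejection sampling: uniform points on the square, kept if inside the diamond.
    pts = rng.uniform(-d, d, size=(1_000_000, 2))
    pts = pts[np.abs(pts[:, 0]) + np.abs(pts[:, 1]) <= d]

    print("diamond area:", 2*d**2)                # equals 1, hence c = 1

    # Histogram estimate of f_X near x0, compared with sqrt(2) - 2|x0|.
    x0, h = 0.3, 0.02
    est = (np.abs(pts[:, 0] - x0) < h/2).mean() / h
    print(est, np.sqrt(2) - 2*abs(x0))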

3. First available teller. Consider a bank with two tellers. The service times for the tellers are independent exponentially distributed random variables X_1 ~ Exp(λ_1) and X_2 ~ Exp(λ_2), respectively. You arrive at the bank and find that both tellers are busy but that nobody else is waiting to be served. You are served by the first available teller once he/she is free. What is the probability that you are served by the first teller?

Solution: From the memoryless property of the exponential distribution, the remaining service times for the tellers are also independent exponentially distributed random variables with parameters λ_1 and λ_2, respectively. The probability that you will be served by the first teller is the probability that the first teller finishes service before the second teller does. Thus,

    P{X_1 < X_2} = ∫∫_{x_2 > x_1} f_{X_1,X_2}(x_1,x_2) dx_2 dx_1
                 = ∫_{x_1=0}^∞ ∫_{x_2=x_1}^∞ λ_1 e^{−λ_1 x_1} λ_2 e^{−λ_2 x_2} dx_2 dx_1
                 = ∫_{x_1=0}^∞ λ_1 e^{−(λ_1+λ_2)x_1} dx_1
                 = λ_1/(λ_1 + λ_2).

4. Coin with random bias. You are given a coin but are not told what its bias (probability of heads) is. You are told instead that the bias is the outcome of a random variable P ~ Unif[0,1]. To get more information about the coin bias, you flip it independently 10 times. Let X be the number of heads you get. Thus X ~ B(10, P). Assuming that X = 9, find and sketch the a posteriori probability of P, i.e., f_{P|X}(p|9).

Solution: In order to find the conditional pdf of P, apply Bayes rule for mixed random variables to get

    f_{P|X}(p|x) = [p_{X|P}(x|p) / p_X(x)] f_P(p) = p_{X|P}(x|p) f_P(p) / ∫_0^1 p_{X|P}(x|p) f_P(p) dp.

Now it is given that X = 9; thus for 0 ≤ p ≤ 1,

    f_{P|X}(p|9) = p^9 (1−p) / ∫_0^1 p^9 (1−p) dp = p^9 (1−p) / (1/110) = 110 p^9 (1−p).

Figure 1 compares the unconditional and the conditional pdfs for P. It may be seen that, given the information that 10 independent tosses resulted in 9 heads, the pdf is shifted towards the value p = 0.9.
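
The normalizing constant and the shape of the posterior can be double-checked numerically. The sketch below (not part of the original solution; sample size and seed are arbitrary) verifies that the constant is 110 and that conditioning simulated (P, X) pairs on X = 9 reproduces the Beta(10, 2) posterior:

    import numpy as np
    from fractions import Fraction

    # Normalizing constant: integral of p^9 (1-p) over [0,1] is 1/10 - 1/11 = 1/110.
    Z = Fraction(1, 10) - Fraction(1, 11)
    print(1 / Z)                                  # 110, as in f_{P|X}(p|9) = 110 p^9 (1-p)

    # Monte Carlo check: condition simulated (P, X) pairs on X = 9.
    rng = np.random.default_rng(2)
    p = rng.uniform(0, 1, 2_000_000)
    x = rng.binomial(10, p)
    post = p[x == 9]                              # approximate draws from f_{P|X}(p|9)
    print(post.mean(), 10/12)                     # posterior is Beta(10,2), mean 10/12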

[Figure 1: comparison of the a priori pdf f_P(p) and the a posteriori pdf f_{P|X}(p|9).]

5. Optical communication channel. Let the signal input to an optical channel be

    X = λ_0 with probability 1/2, and λ_1 with probability 1/2.

The conditional pmf of the output of the channel is {Y | X = λ_0} ~ Poisson(λ_0), i.e., Poisson with intensity λ_0, and {Y | X = λ_1} ~ Poisson(λ_1). Show that the MAP rule reduces to

    D(y) = λ_0 if y < y*, and λ_1 otherwise.

Find y* and the corresponding probability of error.

Solution: The MAP rule

    D(y) = λ_0 if p_{X|Y}(λ_0|y) > p_{X|Y}(λ_1|y), and λ_1 otherwise,

minimizes the probability of decoding error. Since the a priori probabilities for the two X values are equal, the MAP rule is equivalent to the ML rule

    D(y) = λ_0 if p_{Y|X}(y|λ_0) > p_{Y|X}(y|λ_1), and λ_1 otherwise.

Now,

    p_{Y|X}(y|λ_0) / p_{Y|X}(y|λ_1) = (e^{−λ_0} λ_0^y / y!) / (e^{−λ_1} λ_1^y / y!) = e^{λ_1 − λ_0 − y ln(λ_1/λ_0)}.
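
The threshold and the first error probability below are easy to reproduce numerically. A small sketch (my own addition), using only the Python standard library:

    from math import exp, factorial, log

    lam0, lam1 = 1, 2
    ystar = (lam1 - lam0) / (log(lam1) - log(lam0))        # 1/ln 2 ~ 1.4427
    poisson = lambda lam, y: exp(-lam) * lam**y / factorial(y)

    # P_e = (1/2) P{Y >= 2 | lam0} + (1/2) P{Y <= 1 | lam1}
    pe = 0.5*(1 - sum(poisson(lam0, y) for y in range(2))) \
       + 0.5*sum(poisson(lam1, y) for y in range(2))
    print(ystar, pe)                                       # ~1.44 and ~0.335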

This ratio is greater than 1 iff

    y < (λ_1 − λ_0) / (ln λ_1 − ln λ_0).

Therefore, when λ_0 = 1 and λ_1 = 2, we have

    D(y) = λ_0 if y < 1/ln 2, and λ_1 otherwise,

and y* = 1/ln 2 ≈ 1.44. The probability of error is

    P_e = P{D(Y) ≠ X}
        = P{Y > y* | X = λ_0} P{X = λ_0} + P{Y < y* | X = λ_1} P{X = λ_1}
        = (1/2) Σ_{y=2}^∞ e^{−1}/y! + (1/2) Σ_{y=0}^1 e^{−2} 2^y / y!
        = (1/2)(1 − 2e^{−1}) + (3/2) e^{−2} ≈ 0.335.

Repeating this for λ_0 = 1 and λ_1 = 100, we have

    D(y) = λ_0 if y < 99/ln(100), and λ_1 otherwise,

and y* = 99/ln(100) ≈ 21.497. The probability of error is

    P_e = P{D(Y) ≠ X}
        = (1/2) Σ_{y=22}^∞ e^{−1}/y! + (1/2) Σ_{y=0}^{21} e^{−100} (100)^y / y!
        ≤ (1/2) ( P{Y > 21 | X = λ_0} + 22 e^{−100} (100)^{21}/21! )
        ≤ (1/2) ( 1/21 + 22 e^{−100} (100)^{21}/21! )
        ≈ 0.024,

where the inequality is obtained by using the Markov inequality as well as the shape of the pmf of a Poisson random variable (increasing before k = λ). This can be further tightened up using Chebyshev's inequality:

    P_e ≤ (1/2) ( P{|Y − 1| > 20 | X = λ_0} + 22 e^{−100} (100)^{21}/21! )
        ≤ (1/2) ( 1/20² + 22 e^{−100} (100)^{21}/21! )
        ≈ 0.0013.

6. Iocane or Sennari. An absent-minded chemistry professor forgets to label two identical-looking bottles. One contains a chemical named Iocane and the other contains a chemical named Sennari. It is well known that the radioactivity level of Iocane has the Unif[0,1] distribution, while the radioactivity level of Sennari has the Exp(1) distribution.

(a) Let X be the radioactivity level measured from one of the bottles. What is the optimal decision rule (based on the measurement X) that maximizes the chance of correctly identifying the content of the bottle?

(b) What is the associated probability of error?

Solution: Let Θ = 0 denote the case in which the content of the bottle is Iocane and let Θ = 1 denote the case in which the content of the bottle is Sennari. Implicit in the problem statement is that P(Θ = 0) = P(Θ = 1) = 1/2.

(a) The optimal MAP rule is equivalent to the ML rule

    D(x) = 0 if f_{X|Θ}(x|0) > f_{X|Θ}(x|1), and 1 otherwise.

Since the Unif[0,1] pdf f_{X|Θ}(x|0) = 1 is larger than the Exp(1) pdf f_{X|Θ}(x|1) = e^{−x} for 0 < x < 1, we have

    D(x) = 0 if 0 < x < 1, and 1 otherwise.

(b) The probability of error is given by

    P(Θ ≠ D(X)) = (1/2) P(Θ ≠ D(X) | Θ = 0) + (1/2) P(Θ ≠ D(X) | Θ = 1)
                = (1/2) P(X > 1 | Θ = 0) + (1/2) P(0 < X < 1 | Θ = 1)
                = 0 + (1/2)(1 − e^{−1}) = (1 − e^{−1})/2.
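
A quick simulation sketch (not part of the original solution; the sample size and seed are arbitrary) confirming the error probability (1 − e^{−1})/2 ≈ 0.316:

    import numpy as np

    rng = np.random.default_rng(3)
    n = 1_000_000
    theta = rng.integers(0, 2, n)                 # 0: Iocane, 1: Sennari, equally likely
    x = np.where(theta == 0, rng.uniform(0, 1, n), rng.exponential(1.0, n))
    d = (x >= 1).astype(int)                      # decide Sennari iff X >= 1
    print((d != theta).mean(), (1 - np.exp(-1))/2)   # both ~0.316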

7. Two independent uniform random variables. Let X and Y be independently and uniformly drawn from the interval [0,1].

(a) Find the pdf of U = max(X,Y).
(b) Find the pdf of V = min(X,Y).
(c) Find the pdf of W = U − V.
(d) Find the probability P{|X − Y| ≤ 1/2}.

Solution:

(a) We have

    F_U(u) = P{U ≤ u} = P{max(X,Y) ≤ u} = P{X ≤ u, Y ≤ u} = P{X ≤ u} P{Y ≤ u} = u²

for 0 ≤ u ≤ 1. Hence

    f_U(u) = 2u for 0 ≤ u ≤ 1, and 0 otherwise.

(b) Similarly,

    1 − F_V(v) = P{V > v} = P{min(X,Y) > v} = P{X > v, Y > v} = P{X > v} P{Y > v} = (1 − v)²,

or equivalently, F_V(v) = 1 − (1 − v)² for 0 ≤ v ≤ 1. Hence

    f_V(v) = 2(1 − v) for 0 ≤ v ≤ 1, and 0 otherwise.

(c) First note that W = U − V = |X − Y|. (Why?) Hence

    P{W ≤ w} = P{|X − Y| ≤ w} = P(−w ≤ X − Y ≤ w).

Since X and Y are uniformly distributed over [0,1], the above probability is equal to the area of the shaded region in the following figure:

[Figure: the unit square with the band {|x − y| ≤ w} between the lines y = x − w and y = x + w shaded.]

The area can be easily calculated as 1 − (1 − w)² for 0 ≤ w ≤ 1. Hence F_W(w) = 1 − (1 − w)² and

    f_W(w) = 2(1 − w) for 0 ≤ w ≤ 1, and 0 otherwise.

(d) From the figure above,

    P{|X − Y| ≤ 1/2} = P{W ≤ 1/2} = 1 − (1/2)² = 3/4.
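
All four parts can be checked at once by simulation. A short sketch (my own addition, with arbitrary sample size) comparing empirical values against E[U] = 2/3, E[V] = 1/3, and P{W ≤ 1/2} = 3/4:

    import numpy as np

    rng = np.random.default_rng(4)
    x, y = rng.uniform(size=(2, 1_000_000))
    u, v = np.maximum(x, y), np.minimum(x, y)
    w = u - v                                     # equals |x - y|
    # f_U(u) = 2u gives E[U] = 2/3; f_V(v) = 2(1-v) gives E[V] = 1/3;
    # F_W(1/2) = 1 - (1/2)^2 = 3/4.
    print(u.mean(), v.mean(), (w <= 0.5).mean())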

8. Waiting time at the bank. Consider a bank with two tellers. The service times for the tellers are independent exponentially distributed random variables X_1 ~ Exp(λ_1) and X_2 ~ Exp(λ_2), respectively. You arrive at the bank and find that both tellers are busy but that nobody else is waiting to be served. You are served by the first available teller once he/she becomes free. Let the random variable Y denote your waiting time. Find the pdf of Y.

Solution: First observe that Y = min(X_1, X_2). Since

    P{Y > y} = P{X_1 > y, X_2 > y} = P{X_1 > y} P{X_2 > y} = e^{−λ_1 y} e^{−λ_2 y} = e^{−(λ_1+λ_2)y}

for y ≥ 0, Y is an exponential random variable with pdf

    f_Y(y) = (λ_1 + λ_2) e^{−(λ_1+λ_2)y} for y ≥ 0, and 0 otherwise.
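
A simulation sketch (not part of the original solution; the rates λ_1 = 1, λ_2 = 3 are illustrative choices) that checks both this result and the answer λ_1/(λ_1 + λ_2) from problem 3:

    import numpy as np

    rng = np.random.default_rng(5)
    lam1, lam2, n = 1.0, 3.0, 1_000_000           # illustrative rates
    x1 = rng.exponential(1/lam1, n)
    x2 = rng.exponential(1/lam2, n)
    y = np.minimum(x1, x2)
    # Y ~ Exp(lam1 + lam2), so E[Y] = 1/(lam1 + lam2); the same simulation also
    # confirms P{X1 < X2} = lam1/(lam1 + lam2) from problem 3.
    print(y.mean(), 1/(lam1 + lam2))
    print((x1 < x2).mean(), lam1/(lam1 + lam2))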

Additional Exercises

Do not turn in solutions to these problems.

1. Independence. Let X ∈ 𝒳 and Y ∈ 𝒴 be two independent discrete random variables.

(a) Show that any two events {X ∈ A} and {Y ∈ B}, where A ⊆ 𝒳 and B ⊆ 𝒴, are independent.

(b) Show that any two functions of X and Y separately are independent; that is, if U = g(X) and V = h(Y), then U and V are independent.

Solution:

(a) Recall that the probability of the event {X ∈ A} is given by P{X ∈ A} = Σ_{x∈A} p_X(x). Because of the independence of X and Y, we have

    P{X ∈ A, Y ∈ B} = Σ_{x∈A} Σ_{y∈B} p_{X,Y}(x,y)
                    = Σ_{x∈A} Σ_{y∈B} p_X(x) p_Y(y)
                    = Σ_{x∈A} p_X(x) Σ_{y∈B} p_Y(y)
                    = P{X ∈ A} P{Y ∈ B}.

Therefore {X ∈ A} and {Y ∈ B} are independent.

(b) Let A_u = {x : g(x) ≤ u} and B_v = {y : h(y) ≤ v}. Then the joint cdf of U and V is

    F_{U,V}(u,v) = P{U ≤ u, V ≤ v} = P{g(X) ≤ u, h(Y) ≤ v} = P{X ∈ A_u, Y ∈ B_v}.

However, because of the independence of X and Y (and part (a)),

    F_{U,V}(u,v) = P{X ∈ A_u} P{Y ∈ B_v} = P{g(X) ≤ u} P{h(Y) ≤ v} = P{U ≤ u} P{V ≤ v} = F_U(u) F_V(v),

so that U and V are independent.

2. Family planning. Alice and Bob choose a number X at random from the set {2, 3, 4} (so the outcomes are equally probable). If the outcome is X = x, they decide to have children until they have a girl or x children, whichever comes first. Assume that each child is a girl with probability 1/2 (independent of the number of children and the gender of other children). Let Y be the number of children they will have.

(a) Find the conditional pmf p_{Y|X}(y|x) for all possible values of x and y.

(b) Find the pmf of Y.

Solution:

(a) Note that Y ∈ {1, 2, 3, 4}. The conditional pmf is as follows:

    p_{Y|X}(1|2) = 1/2,  p_{Y|X}(2|2) = 1/2,
    p_{Y|X}(1|3) = 1/2,  p_{Y|X}(2|3) = 1/4,  p_{Y|X}(3|3) = 1/4,
    p_{Y|X}(1|4) = 1/2,  p_{Y|X}(2|4) = 1/4,  p_{Y|X}(3|4) = 1/8,  p_{Y|X}(4|4) = 1/8.

(b) The pmf of Y is

    p_Y(1) = Σ_{x=2}^4 p_X(x) p_{Y|X}(1|x) = 1/2,
    p_Y(2) = Σ_{x=2}^4 p_X(x) p_{Y|X}(2|x) = 1/3,
    p_Y(3) = Σ_{x=3}^4 p_X(x) p_{Y|X}(3|x) = 1/8,
    p_Y(4) = p_X(4) p_{Y|X}(4|4) = 1/24.
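
The family-planning pmf is easy to verify with exact rational arithmetic. A short sketch (my own addition) that recomputes p_Y from the conditional pmf:

    from fractions import Fraction as F

    def p_cond(y, x):
        # P{Y = y | X = x}: a girl first appears on child y (y < x), or the cap
        # y = x is reached (girl on the last child or x boys in a row: 1/2^(x-1)).
        if y < x:
            return F(1, 2**y)
        if y == x:
            return F(1, 2**(x - 1))
        return F(0)

    pX = {x: F(1, 3) for x in (2, 3, 4)}
    pY = {y: sum(pX[x]*p_cond(y, x) for x in pX) for y in (1, 2, 3, 4)}
    print(pY)    # {1: 1/2, 2: 1/3, 3: 1/8, 4: 1/24}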

3. Joint cdf or not. Consider the function

    G(x,y) = 1 if x + y ≥ 0, and 0 otherwise.

Can G be a joint cdf for a pair of random variables? Justify your answer.

Solution: No. Note that for every x,

    lim_{y→∞} G(x,y) = 1,

so the marginal cdf of X would have to satisfy F_X(x) = 1 for every x. But for any genuine marginal cdf,

    lim_{x→−∞} F_X(x) = 0.

Therefore G(x,y) is not a cdf.

Alternatively, assume that G(x,y) is a joint cdf for X and Y. Then

    P{−1 < X ≤ 1, −1 < Y ≤ 1} = G(1,1) − G(−1,1) − G(1,−1) + G(−1,−1) = 1 − 1 − 1 + 0 = −1.

But this violates the property that the probability of any event must be nonnegative.
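
The contradiction can be made concrete in two lines (my own illustration):

    # If G were a joint cdf, P{-1 < X <= 1, -1 < Y <= 1} would equal the
    # rectangle formula G(1,1) - G(-1,1) - G(1,-1) + G(-1,-1).
    G = lambda x, y: 1 if x + y >= 0 else 0
    print(G(1, 1) - G(-1, 1) - G(1, -1) + G(-1, -1))   # prints -1: impossible for a cdf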

[Figure 2: the conditional pdfs f_{Y|S}(y|−1), f_{Y|S}(y|0), and f_{Y|S}(y|+1) for λ = 1.]

4. Ternary signaling. Let the signal S be a random variable defined as follows:

    S = −1 with probability 1/3, 0 with probability 1/3, +1 with probability 1/3.

The signal is sent over a channel with additive Laplacian noise Z, i.e., Z is a Laplacian random variable with pdf

    f_Z(z) = (λ/2) e^{−λ|z|},  −∞ < z < ∞.

The signal S and the noise Z are assumed to be independent, and the channel output is their sum Y = S + Z.

(a) Find f_{Y|S}(y|s) for s = −1, 0, +1. Sketch the conditional pdfs on the same graph.
(b) Find the optimal decoding rule D(y) for deciding whether S is −1, 0, or +1. Give your answer in terms of ranges of values of Y.
(c) Find the probability of decoding error for D(y) in terms of λ.

Solution:

(a) We use a trick here that is used several times in the lecture notes. Since Y = S + Z and Z and S are independent, the conditional pdf is

    f_{Y|S}(y|s) = f_Z(y − s) = (λ/2) e^{−λ|y − s|}.

The plots are shown for λ = 1 in Figure 2.

(b) The optimal decoding rule is the MAP rule: D(y) = s, where s maximizes

    p(s|y) = f(y|s) p(s) / f(y).

Since p_S(s) is the same for s = −1, 0, +1, the MAP rule becomes the maximum-likelihood decoding rule

    D(y) = argmax_s f(y|s).

The conditional pdfs are plotted in Figure 2. By inspection, the ML rule reduces to

    D(y) = −1 if y < −1/2, 0 if −1/2 < y < 1/2, +1 if y > 1/2.

(c) Inspection of Figure 2 shows how to calculate the probability of error:

    P_e = Σ_i P{error | i sent} P{i sent}
        = (1/3)(1 − P{−1/2 < S + Z < 1/2 | S = 0}) + (1/3) P{S + Z > −1/2 | S = −1} + (1/3) P{S + Z < 1/2 | S = +1}
        = (1/3)(1 − P{−1/2 < Z < 1/2}) + (1/3) P{Z > 1/2} + (1/3) P{Z < −1/2}
        = (2/3)( P{Z < −1/2} + P{Z > 1/2} )
        = (4/3) P{Z > 1/2}   (by symmetry)
        = (4/3) ∫_{1/2}^∞ (λ/2) e^{−λz} dz
        = (2/3) e^{−λ/2}.
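
A simulation sketch (my own addition; λ = 1 and the sample size are illustrative) confirming P_e = (2/3)e^{−λ/2}:

    import numpy as np

    rng = np.random.default_rng(7)
    lam, n = 1.0, 1_000_000
    s = rng.choice([-1, 0, 1], n)
    z = rng.laplace(0, 1/lam, n)                  # numpy's scale b gives pdf e^{-|z|/b}/(2b)
    y = s + z
    shat = np.where(y < -0.5, -1, np.where(y > 0.5, 1, 0))   # ML thresholds at +/- 1/2
    print((shat != s).mean(), (2/3)*np.exp(-lam/2))          # both ~0.404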

5. Signal or no signal. Consider a communication system that is operated only from time to time. When the communication system is in the normal mode (denoted by M = 1), it transmits a random signal S = X with

    X = +1 with probability 1/2, −1 with probability 1/2.

When the system is in the idle mode (denoted by M = 0), it does not transmit any signal (S = 0). Both normal and idle modes occur with equal probability. Thus

    S = X with probability 1/2, 0 with probability 1/2.

The receiver observes Y = S + Z, where the ambient noise Z ~ Unif[−1,1] is independent of S.

(a) Find and sketch the conditional pdf f_{Y|M}(y|1) of the receiver observation Y given that the system is in the normal mode.
(b) Find and sketch the conditional pdf f_{Y|M}(y|0) of the receiver observation Y given that the system is in the idle mode.
(c) Find the optimal decoder D(y) for deciding whether the system is normal or idle. Provide the answer in terms of intervals of y.
(d) Find the associated probability of error.

Solution:

(a) If M = 1,

    Y = 1 + Z with probability 1/2, −1 + Z with probability 1/2.

Hence, we have

    f_{Y|M}(y|1) = (1/2) f_Z(y − 1) + (1/2) f_Z(y + 1) = 1/4 for |y| ≤ 2, and 0 otherwise.

(b) If M = 0, then Y = Z, so

    f_{Y|M}(y|0) = f_Z(y) = 1/2 for |y| ≤ 1, and 0 otherwise.

(c) Since both modes are equally likely, the optimal MAP decoding rule reduces to the ML rule

    d(y) = 0 if f_{Y|M}(y|0) > f_{Y|M}(y|1), and 1 otherwise
         = 0 if −1 < y < 1, and 1 otherwise.

(d) The probability of error is given by

    P{M ≠ d(Y)} = P{M = 1, −1 < Y < 1}
                = P{M = 1} P{−1 < Y < 1 | M = 1}
                = (1/2)(1/2) = 1/4.
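
A final simulation sketch (not part of the original solution; sample size and seed are arbitrary) confirming P_e = 1/4:

    import numpy as np

    rng = np.random.default_rng(8)
    n = 1_000_000
    m = rng.integers(0, 2, n)                     # mode: 1 normal, 0 idle
    x = rng.choice([-1, 1], n)
    s = np.where(m == 1, x, 0)
    y = s + rng.uniform(-1, 1, n)
    mhat = (np.abs(y) >= 1).astype(int)           # decide "normal" iff |Y| >= 1
    print((mhat != m).mean())                     # ~0.25, matching P_e = 1/4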
