Math 4 - Spring 8 - Practice for the Final Exam

1. Let X, Y, Z be three independent random variables uniformly distributed on [0, 1]. Let W := X² + Y². Compute P(W ≤ t) for t ≤ 1. Honors: Compute the CDF of W.

2. Assume that you are throwing a dart at a round target of radius 1, and that every point of the target is equally likely to be hit. Let 0 be the center of the target and let (X, Y) be the point where the dart hits the target. Let W := |X| be the distance of the first coordinate from the center. Compute E[W].

3. (Honors) Use the Chernoff bound to obtain the following bound for the standard normal CDF Φ: 1 − Φ(x) ≤ e^{−x²/2}.

4. Let X, Y be independent exponential random variables with expectation 1. Compute the pdf of W := X − Y. Compute the following probabilities: P(|W| ≥ t + s | |W| ≥ s) and P(|W| ≥ t). What can you say about |W|?

5. Let X, Y be two random variables with joint density function f_{X,Y}(x, y) = (1/(2x)) e^{−x/2} if 0 ≤ y ≤ x < ∞, and f_{X,Y} = 0 otherwise. Compute Cov(X, Y).

6. Let X, Y, Z be independent and uniformly distributed on (0, 1).
a. Compute the probability P(X ≥ Y + Z).
b. Compute the expectation of the random variable W = e^X Y Z.

7. Let X be a random variable with
M_X(t) = 1/10 + (1/5) e^{2t} + (2/5) e^{5t} + (3/10) e^{11t}.
Compute E[X] and P(X ≤ 2). Let X_1, X_2, X_3 be three independent copies of X. Compute the moment generating function of W := X_1 + X_2 + X_3. Then compute the probability P(W = 0).

8. Let X, Y, Z be the number of units produced by three different factories. Assume that E[X] = 2, E[Y] = 3, E[Z] = 5. Let W := X + Y + Z be the total production.
1. Compute P(W ≥ 15).
2. Assume that X, Y are independent and that var(X) = var(Y) = var(Z) = 1 and Cov(Z, X + Y) = 2. Compute again the probability P(W ≥ 15).
3. Assume that X, Y, Z are independent Poisson with expectations 2, 3, 5. Compute P(W = 15).

9. Each day there are 4 different jobs that may take some of your time to be completed. Each day, each job appears with probability 1/2, independently of the others. Each job requires time X_i to be completed, where the X_i are independent and have the exponential distribution with expectation 1. Let Y be the total time that you may spend in a day to complete all (if any) of the four jobs. Compute E[Y] and var(Y). (Honors) Compute also the moment generating function of Y. Use the Chernoff bound to bound the probability P(Y ≥ 7).

10. A system consists of 100 parts that work independently, each of which has probability 1/2 of failing during a given period of time. The system stays functional as long as at least 70 of the parts are working. Use the central limit theorem to compute the probability that the system will not fail.

11. The number N of people entering a store follows the Poisson distribution with expectation 10. Let X_i be the amount spent by the i-th customer, and assume that each X_i follows the exponential distribution with expectation 1. Let W be the total amount spent in the store. Compute the expectation, the variance, and the moment generating function of W. Assume that the X_i and N are independent.

12. Let N be a discrete random variable uniformly distributed on the numbers 1, 2, 3. Let Z_i, i = 1, 2, 3, be independent standard normal random variables (independent of N). Let Y := Z_1 + ... + Z_N. Is it true that Y is also normal? Explain your answer.

13. (Honors) Let X_1, X_2, ... be a sequence of random variables converging to 0 with probability 1. Show that X_i converges to 0 in probability.

Bring a blue book with you. Additional office hours: May 1st, 10:30-12:30 and 4:00-5:00. The 3rd exam will be on Chapters 3 (only 3.4, 3.5 and 3.6), 4, and 5. The final exam schedule: sections 4- and 4-5 on May 4th, 8:00 till 10:00; section 4-53 on May 4th, 10:30 till 12:30; and section 4-5 on May 7th, 8:00 till 10:00 (in class).

Hints-Solutions

1. Let B be the unit disk in R². When t ≤ 1, the event {W ≤ t} is the quarter of the disk √t B that lies inside [0, 1]², so P(W ≤ t) = (1/4) vol(√t B) = (π/4) t.

2. Let (X, Y) be the random vector with density f_{X,Y}(x, y) = 1/π if x² + y² ≤ 1 and 0 otherwise. We have that
f_X(t) = ∫_{−√(1−t²)}^{√(1−t²)} (1/π) dy = (2/π) √(1 − t²), |t| ≤ 1.
We have that
F_{|X|}(s) = P(|X| ≤ s) = P(−s ≤ X ≤ s) = (2/π) ∫_{−s}^{s} √(1 − t²) dt = (4/π) ∫_0^s √(1 − t²) dt.
Taking derivatives we get that f_{|X|}(x) = (4/π) √(1 − x²) for 0 ≤ x ≤ 1, and, substituting z = 1 − x²,
E[|X|] = (4/π) ∫_0^1 x √(1 − x²) dx = (2/π) ∫_0^1 √z dz = 4/(3π).

3. 1 − Φ(t) = P(Z ≥ t) ≤ e^{−st} M_Z(s) = e^{−(st − s²/2)} for all s > 0, by Chernoff's bound. Since max_{s>0} (st − s²/2) = t²/2, attained at s = t, we conclude that 1 − Φ(t) ≤ e^{−t²/2}.

4. We have that F_{−Y}(t) = P(−Y ≤ t) = P(Y ≥ −t) = e^t for t ≤ 0, so f_{−Y}(t) = e^t, t ≤ 0. We have that for s ≥ 0,
f_W(s) = ∫ f_X(t) f_{−Y}(s − t) dt = ∫_s^∞ e^{−t} e^{s−t} dt = e^s ∫_s^∞ e^{−2t} dt = (1/2) e^{−s}.
Working in the same way for s ≤ 0 we conclude that f_W(s) = (1/2) e^{−|s|} for all s. Moreover,
F_{|W|}(t) = P(|W| ≤ t) = P(−t ≤ W ≤ t) = ∫_{−t}^{t} (1/2) e^{−|s|} ds = 1 − e^{−t} = F_Q(t),
where Q is an exponential random variable with expectation 1. So |W| is exponential and has the memoryless property: P(|W| ≥ t + s | |W| ≥ s) = P(|W| ≥ t) = e^{−t}.
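Answers like these are easy to sanity-check numerically. For Problem 4, a minimal Monte Carlo sketch in Python (assuming NumPy; the seed and sample size are arbitrary choices) estimates the tail and the conditional probability:

import numpy as np

# Problem 4 check: with X, Y independent Exp(1), |X - Y| should be Exp(1),
# so P(|W| >= t) should be exp(-t) and the memoryless identity should hold.
rng = np.random.default_rng(0)
n = 10**6
x = rng.exponential(scale=1.0, size=n)
y = rng.exponential(scale=1.0, size=n)
w = np.abs(x - y)

t, s = 0.7, 1.3
print(np.mean(w >= t), np.exp(-t))            # both ~ 0.4966
cond = np.mean(w >= t + s) / np.mean(w >= s)  # P(|W| >= t+s | |W| >= s)
print(cond)                                   # ~ exp(-t) again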

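In the same spirit, Problem 2's answer E[|X|] = 4/(3π) ≈ 0.4244 can be checked by rejection-sampling uniform points in the disk (again only a sketch, with arbitrary sample size):

import numpy as np

# Problem 2 check: (X, Y) uniform on the unit disk via rejection sampling;
# E[|X|] should be 4/(3*pi).
rng = np.random.default_rng(1)
pts = rng.uniform(-1.0, 1.0, size=(4 * 10**6, 2))
inside = pts[(pts**2).sum(axis=1) <= 1.0]   # keep the points inside the disk
print(np.abs(inside[:, 0]).mean(), 4 / (3 * np.pi))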
5. Cov(X, Y) = E[XY] − E[X]E[Y]. We have
E[XY] = ∫_0^∞ ∫_0^x xy (1/(2x)) e^{−x/2} dy dx = ∫_0^∞ (x²/4) e^{−x/2} dx = 4.
You work in the same way to compute E[X] and E[Y].

6. a. P(X ≥ Y + Z) = ∫_0^1 ∫_0^{1−z} ∫_{y+z}^1 dx dy dz = 1/6.
b. E[e^X Y Z] = E[e^X] E[Y] E[Z] = (∫_0^1 e^x dx)(∫_0^1 y dy)(∫_0^1 z dz) = (e − 1)/4.

7. Set Y to be the discrete random variable with PMF p_Y(11) = 3/10, p_Y(5) = 2/5, p_Y(2) = 1/5 and p_Y(0) = 1/10. We have that M_X(t) = M_Y(t), so X has the same distribution as Y. So P(X ≤ 2) = P(Y = 0) + P(Y = 2) = 3/10. Also E[X] = dM_X(t)/dt at t = 0, which equals 2(1/5) + 5(2/5) + 11(3/10) = 57/10. Also
M_{X_1+X_2+X_3}(t) = (1/10 + (1/5) e^{2t} + (2/5) e^{5t} + (3/10) e^{11t})³ = 1/1000 + terms with exponentials.
Working as before we conclude that P(X_1 + X_2 + X_3 = 0) = 1/1000.

8. 1. By Markov's inequality we have that P(W ≥ 15) ≤ E[W]/15 = 10/15 = 2/3.
2. We also have that var(X + Y + Z) = var(X + Y) + var(Z) + 2 Cov(X + Y, Z) = 1 + 1 + 1 + 2·2 = 7. By Chebyshev's inequality we get that P(W ≥ 15) ≤ P(|W − 10| ≥ 5) ≤ 7/25.
3. Note that X + Y + Z is Poisson with parameter 2 + 3 + 5 = 10, so P(W = 15) = e^{−10} 10^{15}/15!.

9. Let N be the number of jobs appearing during the day. Then N is binomial with parameters n = 4 and p = 1/2, so E[N] = 2 and var(N) = 1. Then Y = X_1 + ... + X_N, and we have that
E[Y] = E[N] E[X_1] = 2, var(Y) = E[N] var(X_1) + (E[X_1])² var(N) = 2 + 1 = 3.

10. Let X_i be independent Bernoulli random variables taking the value 1 if the i-th part works. Let X = X_1 + ... + X_100 be the total number of working parts. Using the De Moivre-Laplace theorem we get that
P(X ≥ 70) ≈ 1 − Φ((70 − 50)/√(100 (1/2)(1/2))) = 1 − Φ(4).
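For Problem 5, the density above factors as X exponential with expectation 2 and Y | X uniform on [0, X], which gives E[X] = 2, E[Y] = 1 and hence Cov(X, Y) = 4 − 2·1 = 2. A minimal simulation sketch of this (assuming the density as stated above; sample size arbitrary):

import numpy as np

# Problem 5 check: f(x, y) = e^{-x/2}/(2x) on 0 <= y <= x means
# X ~ Exp(mean 2) and Y | X ~ Uniform(0, X). Expect E[XY] ~ 4, Cov ~ 2.
rng = np.random.default_rng(2)
n = 10**6
x = rng.exponential(scale=2.0, size=n)
y = rng.uniform(0.0, x)        # uniform on [0, x], elementwise
print((x * y).mean())          # ~ 4
print(np.cov(x, y)[0, 1])      # ~ 2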

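Problem 7's reading of the PMF off the MGF can also be checked by sampling Y directly; a sketch, with the values and weights as in the solution above:

import numpy as np

# Problem 7 check: sample from the PMF identified from the MGF.
# Expect E[X] ~ 5.7, P(X <= 2) ~ 0.3, and P(X1+X2+X3 = 0) ~ 1/1000.
rng = np.random.default_rng(3)
vals = np.array([0, 2, 5, 11])
probs = np.array([0.1, 0.2, 0.4, 0.3])
n = 10**6
x = rng.choice(vals, size=(n, 3), p=probs)
print(x[:, 0].mean(), np.mean(x[:, 0] <= 2))  # ~ 5.7, ~ 0.3
print(np.mean(x.sum(axis=1) == 0), 1 / 1000)  # ~ 0.001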
11. Let W := X_1 + ... + X_N. We have that E[N] = 10, var(N) = 10, M_N(t) = e^{10(e^t − 1)}, and E[X_1] = 1, var(X_1) = 1, M_{X_1}(t) = 1/(1 − t) for t < 1. So we get that
E[W] = E[N] E[X_1] = 10,
var(W) = E[N] var(X_1) + (E[X_1])² var(N) = 10 + 10 = 20,
M_W(t) = e^{10(M_{X_1}(t) − 1)} = e^{10t/(1−t)}, t < 1.

12. Assume first that N is binomial, say with parameters (2, 1/2). Then M_N(t) = (1/4)(1 + e^t)². So if W := Z_1 + ... + Z_N, then M_W(t) = E[(M_Z(t))^N] = (1/4)(1 + e^{t²/2})². But this is not the moment generating function of a normal random variable, so W is not normal. The same argument applies to N uniform on {1, 2, 3}: there M_Y(t) = (1/3)(e^{t²/2} + e^{t²} + e^{3t²/2}), which again is not the MGF of a normal, so Y is not normal.

13. See book page 9. (Hint: for any ε > 0, the indicators 1{|X_n| > ε} converge to 0 with probability 1 and are bounded by 1, so P(|X_n| > ε) → 0 by dominated convergence.)
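Finally, Problem 11's compound-Poisson answers E[W] = 10 and var(W) = 20 can be checked the same way; a sketch, using N ~ Poisson(10) and X_i exponential with expectation 1 as in the solution:

import numpy as np

# Problem 11 check: W = X_1 + ... + X_N, N ~ Poisson(10), X_i ~ Exp(1).
# Expect E[W] ~ 10 and var(W) ~ 20.
rng = np.random.default_rng(4)
n = 10**6
N = rng.poisson(lam=10, size=n)
# Sum N[i] exponentials per trial without a Python loop: a Gamma(k, 1)
# variable is the sum of k Exp(1) variables; mask out the N = 0 trials.
W = rng.gamma(shape=np.maximum(N, 1), scale=1.0) * (N > 0)
print(W.mean(), W.var())   # ~ 10, ~ 20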