Tutorial 1 : Probabilities


Lund University ETSN01 Advanced Telecommunication
Tutorial 1: Probabilities
Authors: Antonio Franco, Emma Fitzgerald
Tutor: Farnaz Moradi
January 11, 2016

Contents

Part I: Before you start
Part II: Exercises
1 Probabilities (exercises 1.1 - 1.15)
Part III: Solutions
2 Probabilities (solutions 2.1 - 2.15)

Part I
Before you start

This tutorial is given to prepare you for the exam. Since time is limited, it is highly advised that you first try to solve the exercises (Part II) at home, then have a look at the solutions (Part III), and, finally, ask questions during the exercise sessions.

Part II
Exercises

1 Probabilities

1.1 What is the difference between discrete and continuous random variables?

1.2 Explain what is meant by a stochastic process and give the two ways of describing stochastic processes.

1.3 Give an example of a stochastic process that is
1. discrete-time, discrete-value
2. discrete-time, continuous-value
3. continuous-time, discrete-value
4. continuous-time, continuous-value

1.4 Explain what is meant by a confidence interval and why they are necessary when reporting simulation or experimental results.

1.5 Consider a random variable X describing the outcome of rolling an ordinary six-sided die, and events A defined as X ∈ {1, 2, 3}, B defined as X ∈ {4, 5, 6} and C defined as X ∈ {2, 4, 6}.

1. What is Ω (the set of possible outcomes)?
2. What is the probability of each event?
3. Which events are mutually exclusive?
4. Which events are independent?
5. Draw a Venn diagram with Ω, A, B, and C.
6. What are the probabilities of A ∪ B, A ∪ C, B ∪ C and A ∪ B ∪ C?
7. What are the probabilities of A ∩ B, A ∩ C, B ∩ C and A ∩ B ∩ C?
8. What are P(A | B) and P(A | C)?

1.6 The mean and variance of X are 50 and 4, respectively. Evaluate:
a) the mean of X²
b) the variance and standard deviation of 2X + 3
c) the variance and standard deviation of −X

1.7 Consider a random variable X with the following distribution:

Pr[X = −1] = 0.25
Pr[X = 0] = 0.5
Pr[X = 1] = 0.25

Let Y = X².
a) Are X and Y independent random variables? Justify your answer.
b) Calculate the covariance Cov(X, Y).
c) Are X and Y uncorrelated? Justify your answer.

1.8 We have a transmitter T and a receiver R that communicate over a noisy channel; they can only exchange two symbols {0, 1}, i.e. it is a binary channel. You know from previous measurements that a symbol is accurately detected 87% of the time (i.e. if you transmit a 1, it will be correctly detected as a 1 87% of the time, and the same for a 0); you also know that only 30% of the messages are transmitted as 1. Given that a 1 was received, what is the probability that a 1 was actually transmitted, i.e. that the received symbol is correct?

1.9 A patient has a test for some disease that comes back positive (indicating he has the disease). You are told that:
- the accuracy of the test is 87% (i.e., if a patient has the disease, the test yields the correct result 87% of the time, and if the patient does not have the disease, the test also yields the correct result 87% of the time);
- the incidence of the disease in the population is 1%.
Given that the test is positive, how probable is it that the patient really has the disease?

1.10 A taxicab was involved in a fatal hit-and-run accident at night. Two cab companies, the Green and the Blue, operate in the city. You are told that:
- 85% of the cabs in the city are Green and 15% are Blue;
- a witness identified the cab as Blue;
- the court tested the reliability of the witness under the same circumstances that existed on the night of the accident and concluded that the witness was correct in identifying the color of the cab 80% of the time.
What is the probability that the cab involved in the incident was Blue rather than Green?

1.11 A pair of fair dice (the probability of each outcome is 1/6) is thrown. Let X be the maximum of the two numbers that come up.
a) Find the distribution of X.
b) Find the expectation E[X], the variance Var[X], and the standard deviation σ_X.

1.12 Let X be a continuous, positive random variable with cumulative distribution function F_X(t) = 1 − e^(−µt).
1. Calculate the mean of X.
2. Calculate the variance of X.

1.13 A source generates customers according to a Poisson process with a mean interarrival time 1/λ = 5 seconds; the generated customers arrive at a facility, and the facility wants to know the expected number of customers in the interval (0, t) seconds, where t is 2 minutes.
a) Derive a general expression for the expected number of customers arriving at the facility, with a generation rate of λ customers per second, in the interval (0, t);
b) find the value asked by the facility.

1.14 A programmer wants to generate some events according to different CDFs:
a) Exponential: F(x) = 1 − e^(−λx);
b) Pareto: F(x) = 1 − (x_m / x)^α for x ≥ x_m;
c) Triangular:
F(x) = 0 for x < a,
F(x) = (x − a)² / ((b − a)(c − a)) for a ≤ x ≤ c,
F(x) = 1 − (b − x)² / ((b − a)(b − c)) for c < x ≤ b,
F(x) = 1 for b < x.
The programming API, however, does not provide any random generation apart from a uniform random number generator. Name the method (s)he can use to generate the wanted events and describe the steps (including the resulting formula) to generate the events.

1.15 Given two random variables X and Y, independent and identically distributed (iid) according to a uniform distribution between 0 and 1, find the probability density function of Z = X + Y. Remember that when a random variable is uniformly distributed between 0 and 1, its probability density function (PDF) is:
f_X(x) = 1 for x ∈ [0, 1], and 0 otherwise.

Part III
Solutions

2 Probabilities

2.1 A discrete random variable can only take one of a set of distinct (i.e. discrete) values, whereas a continuous random variable can take any real number within a specified range. If it's possible to give a list of all the values the variable can take, then it is discrete. Note that either of these can be infinite, e.g. a discrete random variable that can take any integer value.

2.2 A stochastic process is a random variable that changes over time. It can be described as either a random variable at each instant of time, or a function of time for each possible outcome.

2.3
1. The winner of each round in a boxing match.
2. The average temperature each day in a given place.
3. The number of chocolates in a chocolate box.
4. The amount of jam in a jar.

2.4 A confidence interval is a range (specified by two values) within which the population, or true, mean falls with a given probability, e.g. 95%. For example, if we have a 95% confidence interval [2, 5], we are saying that the population mean lies between 2 and 5 with 95% probability. The smaller the confidence interval, the more confident we are about our results: we know with high probability that the real mean does not deviate too far from the measured mean. Confidence intervals are required when reporting experimental results because the results we actually measure (whether through simulations, experiments with real hardware, etc.) are stochastic, that is, they are random variables. Thus we need to average across multiple samples in order to determine the true result. However, since this is a random process, it's possible that the results we measure are outliers. A confidence interval quantifies the variance in our results and the likelihood that the measured result is substantially incorrect.
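The role of a confidence interval described in 2.4 can be made concrete with a short computation. The sketch below is illustrative, not part of the tutorial: the simulated Gaussian measurements and the normal-approximation critical value z = 1.96 for a 95% interval are assumptions.

```python
import math
import random

random.seed(1)

# Hypothetical measurements: 30 replications of a simulated experiment
# whose true mean is 10.0 (illustrative values, not from the tutorial).
samples = [random.gauss(10.0, 2.0) for _ in range(30)]

n = len(samples)
mean = sum(samples) / n
# Sample standard deviation (n - 1 in the denominator).
s = math.sqrt(sum((x - mean) ** 2 for x in samples) / (n - 1))

# 95% confidence interval via the normal approximation (z = 1.96):
# the true mean lies in this range with (roughly) 95% probability.
half_width = 1.96 * s / math.sqrt(n)
ci = (mean - half_width, mean + half_width)
print(ci)
```

The narrower this interval, the less the measured mean is likely to deviate from the true one, which is exactly the point made in 2.4.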

2.5
1. Ω = {1, 2, 3, 4, 5, 6}
2. P(A) = P(B) = P(C) = 0.5
3. A and B
4. None of the events are independent.
5. (Venn diagram not reproduced here.)
6. P(A ∪ B) = 1, P(A ∪ C) = 1 − P(X = 5) = 5/6, P(B ∪ C) = 1 − P(X ∈ {1, 3}) = 2/3, P(A ∪ B ∪ C) = 1
7. P(A ∩ B) = 0, P(A ∩ C) = P(X = 2) = 1/6, P(B ∩ C) = P(X ∈ {4, 6}) = 1/3
8. P(A | B) = 0, P(A | C) = 1/3

2.6 We have:
E[X] = 50
Var[X] = E[X²] − (E[X])² = 4
So:
a) E[X²] = (E[X])² + Var[X] = 50² + 4 = 2504
b) Var[2X + 3] = 2² Var[X] = 16; σ_(2X+3) = 4
c) Var[−X] = Var[X] = 4; σ_(−X) = 2
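The event probabilities in 2.5 follow directly from counting outcomes, so they can be double-checked by enumerating the sample space; a minimal sketch using exact fractions:

```python
from fractions import Fraction

omega = set(range(1, 7))
A, B, C = {1, 2, 3}, {4, 5, 6}, {2, 4, 6}

def p(event):
    # Each outcome of a fair die has probability 1/6.
    return Fraction(len(event & omega), len(omega))

assert p(A) == p(B) == p(C) == Fraction(1, 2)
assert p(A | B) == 1 and p(A & B) == 0          # A and B are mutually exclusive
assert p(A | C) == Fraction(5, 6)
assert p(B | C) == Fraction(2, 3)
assert p(A | B | C) == 1
assert p(A & C) == Fraction(1, 6)
assert p(B & C) == Fraction(1, 3)
# Conditional probabilities P(A|B) and P(A|C).
assert p(A & B) / p(B) == 0
assert p(A & C) / p(C) == Fraction(1, 3)
print("all event probabilities from 2.5 check out")
```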

2.7
a) Since Y = h(X), they are not, obviously, independent. For example, if they were independent we would have Pr{X = 0, Y = 1} = Pr{X = 0} Pr{Y = 1} = 0.5 · 0.5 = 0.25, but, since X = 0 implies Y = 0² = 0, we have Pr{X = 0, Y = 1} = 0.
b) We have:
Cov[X, Y] = E[XY] − E[X] E[Y]
E[X] = Σ_i i Pr[X = i] = −1 · 0.25 + 0 · 0.5 + 1 · 0.25 = 0
E[Y] = Σ_i i Pr[Y = i] = 1 · 0.5 + 0 · 0.5 = 0.5
E[XY] = −1 · 0.25 + 0 · 0.5 + 1 · 0.25 = 0
Cov[X, Y] = 0 − 0 · 0.5 = 0
c) Since Cov[X, Y] = 0, they are, by definition, uncorrelated.

2.8 (The channel can be drawn as a tree: from T_1, the branch to R_1 has probability 0.87 and the branch to R_0 has probability 0.13; from T_0, the branch to R_0 has probability 0.87 and the branch to R_1 has probability 0.13.)

We define the following events:
T_1 - a 1 is transmitted
T_0 - a 0 is transmitted
R_1 - a 1 is received
R_0 - a 0 is received

We know:
Pr{T_1} = 0.3
Pr{T_0} = 1 − Pr{T_1} = 0.7
Pr{R_i | T_j} = 0.87 if i = j, and 1 − 0.87 = 0.13 if i ≠ j
We use Bayes' theorem:
Pr{T_1 | R_1} = Pr{T_1, R_1} / Pr{R_1} = Pr{T_1} Pr{R_1 | T_1} / Pr{R_1}
The numerator is 0.3 · 0.87 = 0.261. For the denominator:
Pr{R_1} = Pr{(T_0 ∩ R_1) ∪ (T_1 ∩ R_1)} = Pr{T_0, R_1} + Pr{T_1, R_1}
= Pr{T_0} Pr{R_1 | T_0} + Pr{T_1} Pr{R_1 | T_1} = 0.7 · 0.13 + 0.3 · 0.87 = 0.352
So Pr{T_1 | R_1} = 0.261 / 0.352 ≈ 0.74: having received a 1, there is a 74% chance that a 1 was actually sent, i.e. that the symbol was correctly decoded at the receiver side!

2.9 We define the following events:
D - patient has the disease
T_ok - test is positive.
We can immediately write¹:
Pr{D | T_ok} = Pr{D ∩ T_ok} / Pr{T_ok} = (0.01 · 0.87) / (0.01 · 0.87 + 0.99 · 0.13) ≈ 0.0633;
the probability is still low (just over 6%) even though the test was positive, because the incidence of the disease is so low that the false positives among the 99% healthy population far outnumber the true positives.

¹ Remember that Pr{T_ok} = Pr{(D ∩ T_ok) ∪ (¬D ∩ T_ok)} = Pr{D} Pr{T_ok | D} + (1 − Pr{D}) Pr{T_ok | ¬D}, with Pr{T_ok | ¬D} = 1 − 0.87 = 0.13; see Tutorial 2, exercise 1.3.
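Solutions 2.8 and 2.9 are instances of the same Bayes computation: a prior and a symmetric accuracy for both kinds of error. A small sketch (the helper name `posterior` is ours, not the tutorial's):

```python
def posterior(prior, accuracy):
    # P(hypothesis | positive evidence) via Bayes' theorem, assuming
    # the same accuracy for positives and negatives (as in 2.8 and 2.9).
    true_pos = prior * accuracy
    false_pos = (1 - prior) * (1 - accuracy)
    return true_pos / (true_pos + false_pos)

# Binary channel (2.8): P(T1) = 0.3, detection accuracy 87%.
p_channel = posterior(0.3, 0.87)
# Disease test (2.9): incidence 1%, test accuracy 87%.
p_disease = posterior(0.01, 0.87)
print(round(p_channel, 2), round(p_disease, 4))  # 0.74 0.0633
```

The only difference between the two answers is the prior: 30% versus 1%, which is why the disease posterior stays so low.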

2.10 We define the following events:
B - taxi was Blue
W_B - witness said Blue.
We can immediately write:
Pr{B | W_B} = Pr{B ∩ W_B} / Pr{W_B} = (0.15 · 0.8) / (0.15 · 0.8 + 0.85 · 0.2) ≈ 0.4138

2.11 The maximum will be:
1 in 1 case out of 36 (both dice come up 1);
2 in 3 cases out of 36 (1,2; 2,1; 2,2);
3 in 5 cases out of 36 (1,3; 2,3; 3,3; 3,1; 3,2);
4 in 7 cases out of 36 (1,4; 2,4; 3,4; 4,4; 4,1; 4,2; 4,3);
5 in 9 cases out of 36 (reasoning as above); and
6 in 11 cases out of 36 (reasoning as above).
a) Probability function: P_1 = 1/36, P_2 = 3/36, P_3 = 5/36, P_4 = 7/36, P_5 = 9/36, P_6 = 11/36
b) E[X] = (1 · 1/36) + (2 · 3/36) + (3 · 5/36) + (4 · 7/36) + (5 · 9/36) + (6 · 11/36) = 161/36 ≈ 4.472
E[X²] = (1² · 1/36) + (2² · 3/36) + (3² · 5/36) + (4² · 7/36) + (5² · 9/36) + (6² · 11/36) = 791/36 ≈ 21.972
Var[X] = E[X²] − (E[X])² ≈ 21.972 − 20.001 ≈ 1.971
σ_X = √Var[X] ≈ 1.404
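The distribution of the maximum in 2.11 can be recovered by brute-force enumeration of all 36 equally likely rolls; a quick check with exact fractions:

```python
from fractions import Fraction
from itertools import product

# Enumerate all 36 equally likely rolls of a pair of fair dice.
pmf = {}
for d1, d2 in product(range(1, 7), repeat=2):
    m = max(d1, d2)
    pmf[m] = pmf.get(m, Fraction(0)) + Fraction(1, 36)

# The pattern from 2.11: P(X = k) = (2k - 1)/36.
assert [pmf[k] for k in range(1, 7)] == [Fraction(2 * k - 1, 36) for k in range(1, 7)]

ex = sum(k * pmf[k] for k in pmf)        # E[X]
ex2 = sum(k ** 2 * pmf[k] for k in pmf)  # E[X^2]
var = ex2 - ex ** 2
print(ex, var)  # 161/36 2555/1296
```

Note the exact variance is 2555/1296 ≈ 1.9715, matching the decimal value in the solution.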

2.12 First we calculate the density function:
f_X(t) = d/dt F_X(t) = µ e^(−µt)
E(X) = ∫₀^∞ t f_X(t) dt = ∫₀^∞ t µ e^(−µt) dt = 1/µ
First the second moment is calculated:
E(X²) = ∫₀^∞ t² µ e^(−µt) dt = 2/µ²
After that we get:
V(X) = E(X²) − E²(X) = 2/µ² − 1/µ² = 1/µ²

2.13
a) The probability P_k(t) of k customers arriving in the interval (0, t), this being a Poisson process, is
P_k(t) = e^(−λt) (λt)^k / k!,
so the expected number of customers is:
E[K] = Σ_(k=0)^∞ k P_k(t) = e^(−λt) Σ_(k=0)^∞ k (λt)^k / k!
Since the k = 0 term vanishes:
E[K] = e^(−λt) Σ_(k=1)^∞ (λt)^k / (k − 1)! = e^(−λt) (λt) Σ_(k=0)^∞ (λt)^k / k! = e^(−λt) (λt) e^(λt) = λt;
so, we will have, on average, λt customers arriving in the interval (0, t) seconds.
b) E[K] = λt = (1/5) · 120 = 24 customers
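The result E[K] = λt in 2.13 can be checked by simulating the Poisson process directly, using the fact (from 2.12) that interarrival times are exponential; the run count and seed below are arbitrary choices:

```python
import random

random.seed(7)

lam = 1 / 5   # one customer every 5 seconds on average (from 1.13)
t = 120       # observation window: 2 minutes, in seconds

def poisson_count(rate, horizon):
    # Count arrivals in (0, horizon] by summing exponential
    # interarrival times until the horizon is exceeded.
    n, clock = 0, 0.0
    while True:
        clock += random.expovariate(rate)
        if clock > horizon:
            return n
        n += 1

runs = 20000
avg = sum(poisson_count(lam, t) for _ in range(runs)) / runs
print(avg)  # close to lam * t = 24
```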

2.14 We can use the inverse transform sampling theorem, which states that, given a uniformly distributed random variable U between 0 and 1, we can generate samples of a variable X with CDF F(x) by simply writing X = F⁻¹(U).
a) F(x) = 1 − e^(−λx) gives F⁻¹(u) = −ln(1 − u)/λ; since U is distributed uniformly between 0 and 1, so is 1 − U, and we can write X = −ln(U)/λ
b) Similarly, F⁻¹(u) = x_m (1 − u)^(−1/α), so X = x_m U^(−1/α)
c) Applying the same rule branch by branch:
X = a + √(U (b − a)(c − a)) for 0 < U < (c − a)/(b − a)
X = b − √((1 − U)(b − a)(b − c)) for (c − a)/(b − a) ≤ U < 1

2.15 Since we are going from a two-dimensional space [X, Y] to a one-dimensional space Z, we first need to extend the transformation to a two-dimensional one:
Z = X + Y, Y = Y, i.e. X = Z − Y, Y = Y;
we calculate the determinant of the Jacobian matrix:
J = det [∂X/∂Z, ∂X/∂Y; ∂Y/∂Z, ∂Y/∂Y] = det [1, −1; 0, 1] = 1
then we use the standard transformation formula, noting that the two random variables are independent:
f_YZ(y, z) = |J| f_XY(z − y, y) = f_X(z − y) f_Y(y)
now we integrate over y in order to find f_Z(z):
f_Z(z) = ∫_(−∞)^(+∞) f_X(z − y) f_Y(y) dy
The integrand equals 1 exactly when both y ∈ [0, 1] and z − y ∈ [0, 1], so:
f_Z(z) = ∫₀^z dy = z for z ∈ [0, 1]
f_Z(z) = ∫_(z−1)^1 dy = 2 − z for z ∈ (1, 2]
f_Z(z) = 0 otherwise
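The three inverse-transform formulas in 2.14 translate directly into code; a sketch with illustrative parameter values (λ = 2, and a triangular distribution on [0, 3] with mode 1 - these numbers are our choices, not the tutorial's):

```python
import math
import random

random.seed(42)

# Inverse-transform sampling: X = F^{-1}(U), U uniform on (0, 1).

def sample_exponential(lam, u):
    # F(x) = 1 - e^{-lam x}  =>  F^{-1}(u) = -ln(1 - u) / lam
    return -math.log(1 - u) / lam

def sample_pareto(xm, alpha, u):
    # F(x) = 1 - (xm / x)^alpha  =>  F^{-1}(u) = xm (1 - u)^{-1/alpha}
    return xm * (1 - u) ** (-1 / alpha)

def sample_triangular(a, b, c, u):
    # Invert each branch of the triangular CDF with mode c.
    if u < (c - a) / (b - a):
        return a + math.sqrt(u * (b - a) * (c - a))
    return b - math.sqrt((1 - u) * (b - a) * (b - c))

n = 200000
exp_mean = sum(sample_exponential(2.0, random.random()) for _ in range(n)) / n
tri_mean = sum(sample_triangular(0, 3, 1, random.random()) for _ in range(n)) / n
print(round(exp_mean, 2), round(tri_mean, 2))  # near 1/2 and (0 + 3 + 1)/3
```

The empirical means land near the theoretical ones (1/λ for the exponential, (a + b + c)/3 for the triangular), which is a quick way to validate a generator built this way.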
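The triangular density derived in 2.15 can be verified numerically with a Monte Carlo estimate; the test interval [0.9, 1.1] and the sample size below are arbitrary choices:

```python
import random

random.seed(0)

def f_z(z):
    # Triangular density of Z = X + Y derived in 2.15.
    if 0 <= z <= 1:
        return z
    if 1 < z <= 2:
        return 2 - z
    return 0.0

# Exact probability that Z falls in [0.9, 1.1]:
# integral of z over [0.9, 1] plus integral of (2 - z) over [1, 1.1],
# i.e. 0.095 + 0.095 = 0.19.
exact = 0.19

# Monte Carlo estimate of the same probability.
n = 200000
hits = sum(1 for _ in range(n)
           if 0.9 <= random.random() + random.random() <= 1.1)
empirical = hits / n
print(empirical, exact)
```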