

Analysis of Engineering and Scientific Data, Semester 1 2019. Sabrina Streipert (s.streipert@uq.edu.au)

Example: Draw a random number from the interval of real numbers [1, 3] and let X represent the number. Each number is equally possible, so X has the uniform distribution on [1, 3]. What is the cdf F of X? Recall, the pdf of this uniform distribution is

f(x) = 1/2 for 1 ≤ x ≤ 3, and f(x) = 0 else.

Therefore

F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt =
  0 for x < 1,
  (x − 1)/2 for 1 ≤ x ≤ 3,
  1 for x > 3.
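The piecewise cdf worked out above can be written as a small function; this is an illustrative sketch (the name uniform_cdf and the default endpoints are my own choices, not from the notes):

```python
# Sketch: the cdf of the Uniform[1, 3] example as a plain Python function.

def uniform_cdf(x, a=1.0, b=3.0):
    """cdf F(x) = P(X <= x) of the uniform distribution on [a, b]."""
    if x < a:
        return 0.0
    if x > b:
        return 1.0
    return (x - a) / (b - a)

print(uniform_cdf(2.0))  # midpoint of [1, 3] -> 0.5
```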

Is any of these F a cdf of a continuous R.V.?

a) F(x) = 0 for x < −1, 0.3 for −1 ≤ x < 1, 1 for x ≥ 1,

b) F(x) = 0 for x < 0, x/2 for 0 ≤ x ≤ 1, 1 for x > 1.

Check the properties:
1. 0 ≤ F(x) ≤ 1,
2. F is increasing,
3. lim_{x→∞} F(x) = 1 and lim_{x→−∞} F(x) = 0,
4. lim_{h→0⁺} F(x + h) = F(x) (right-continuity).

a) F(x) = 0 for x < −1, 0.3 for −1 ≤ x < 1, 1 for x ≥ 1.
1. 0 ≤ F(x) ≤ 1 ✓, 2. F is increasing ✓, 3. lim_{x→∞} F(x) = 1 and lim_{x→−∞} F(x) = 0 ✓, 4. right-continuity:
at x = −1: lim_{h→0⁺} F(−1 + h) = 0.3 = F(−1),
at x = 1: lim_{h→0⁺} F(1 + h) = 1 = F(1).
All conditions are satisfied, so F is in fact a cdf (though, having jumps, it is the cdf of a discrete R.V., not of a continuous one).

b) F(x) = 0 for x < 0, x/2 for 0 ≤ x ≤ 1, 1 for x > 1.
1. 0 ≤ F(x) ≤ 1 ✓, 2. F is increasing ✓, 3. lim_{x→∞} F(x) = 1 and lim_{x→−∞} F(x) = 0 ✓, 4. right-continuity:
at x = 0: lim_{h→0⁺} F(h) = lim_{h→0⁺} h/2 = 0 = F(0),
at x = 1: lim_{h→0⁺} F(1 + h) = 1 ≠ F(1) = 1/2.
The last condition is not satisfied, so F is not a cdf.
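The right-continuity check that separates a) from b) can be probed numerically; this is an illustration only (the helper names and the step size h are my own choices):

```python
def F_a(x):
    """Candidate a): step function with jumps at -1 and 1."""
    if x < -1:
        return 0.0
    if x < 1:
        return 0.3
    return 1.0

def F_b(x):
    """Candidate b): x/2 on [0, 1], then jumps to 1."""
    if x < 0:
        return 0.0
    if x <= 1:
        return x / 2
    return 1.0

def right_continuous_at(F, x, h=1e-9, tol=1e-6):
    """Crude numerical probe of lim_{h -> 0+} F(x + h) = F(x)."""
    return abs(F(x + h) - F(x)) < tol
```

At x = 1 the probe fails for F_b (F jumps from 1/2 to 1) but passes for F_a, matching the analysis above.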

The cdf and/or pmf/pdf specify the distribution of a R.V. X. Crucial values for the distribution of X:
E[X] = expectation of the random variable: the weighted average of the values of X, weighted by their probabilities.
Var(X) = variance of the random variable: the spread/dispersion of the distribution around E[X].

Expectation of a R.V. Definition (Expectation) Let X be a R.V. with pmf P / pdf f. E[X] = the expectation (or expected value / mean value) of X, calculated as:
If X is a discrete R.V.: E[X] = μ_X = Σ_{x_i ∈ Ω} x_i P(X = x_i).
If X is a continuous R.V.: E[X] = μ_X = ∫_Ω x f(x) dx.

Expectation - Example What is the expected value of a toss of a fair die? Note, x_1 = 1, x_2 = 2, ..., x_6 = 6 and we have P(X = 1) = · · · = P(X = 6) = 1/6, so
E[X] = Σ_{x_i ∈ Ω} x_i P(X = x_i) = Σ_{i=1}^{6} i · (1/6) = 7/2.
Note: E[X] is not necessarily a possible outcome of X.
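The die computation above can be reproduced exactly (a sketch using exact fractions so the answer is 7/2 rather than a rounded float):

```python
from fractions import Fraction

# Exact E[X] for a fair die: sum of x * P(X = x) over the six faces.
pmf = {i: Fraction(1, 6) for i in range(1, 7)}
EX = sum(x * p for x, p in pmf.items())
print(EX)  # -> 7/2
```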

Expectation - Example What is the expected value of a uniformly distributed R.V. X on (0, 50]? Recall, the pdf of X is:
f(x) = 1/50 for x ∈ (0, 50], and f(x) = 0 for x ∈ (−∞, 0] ∪ (50, ∞).
Therefore,
E[X] = ∫ x f(x) dx = ∫_0^50 x · (1/50) dx = (1/50) · (50²/2 − 0) = 25.
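The integral above can be cross-checked with a midpoint Riemann sum; this is a numerical illustration only (the grid size n is an arbitrary choice):

```python
# Approximate E[X] = integral over (0, 50] of x * (1/50) dx numerically.
n = 10_000
width = 50 / n
# Midpoint rule: evaluate x * f(x) at the centre of each subinterval.
EX = sum(((i + 0.5) * width) * (1 / 50) * width for i in range(n))
```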

Expectation - Interpretation I Interpretation of E[X] as expected profit: Suppose we play a game where you throw two dice. You are getting paid the sum of the dice in $. To enter the game you must pay $z. What would be a fair amount for z?
z = E[X] = Σ_{i=2}^{12} i · P(X = i) = 2 · P(X = 2) + · · · + 12 · P(X = 12) = 2 · (1/36) + 3 · (2/36) + · · · + 12 · (1/36) = 7.
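The fair price can be confirmed by brute force (a sketch: enumerate all 36 equally likely outcomes of the two dice and average the payoff exactly):

```python
from fractions import Fraction
from itertools import product

# Each ordered pair (d1, d2) has probability 1/36; the payoff is d1 + d2.
z = sum(Fraction(d1 + d2, 36) for d1, d2 in product(range(1, 7), repeat=2))
print(z)  # -> 7
```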

Expectation - Interpretation II Interpretation of E[X] as centre of mass: Imagine point masses with weights p_1, p_2, ..., p_n are placed at positions x_1, x_2, ..., x_n. The centre of mass is the place where we can balance the weights:
centre of mass = x_1 p_1 + · · · + x_n p_n = Σ_{x_i} x_i p_i = Σ_{x_i} x_i P(X = x_i) = E[X].

Expectation of Transformations Definition (Expectation of a function of X) For any g : R → R:
If X is a discrete R.V., then E[g(X)] = Σ_{x_i ∈ Ω} g(x_i) P(X = x_i), where P is the pmf of X.
If X is a continuous R.V., then E[g(X)] = ∫ g(x) f(x) dx, where f is the pdf of X.

Example: Let X be the outcome of the toss of a fair die. Find E[X²].
E[X²] = Σ_{i=1}^{6} x_i² P(X = x_i) = 1² · (1/6) + 2² · (1/6) + · · · + 6² · (1/6) = 91/6.
Note: 91/6 = E[X²] ≠ (E[X])² = (7/2)² = 49/4.
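The second moment of the die, and the fact that it differs from the squared mean, can be checked exactly (an illustrative sketch with fractions):

```python
from fractions import Fraction

# E[X^2] for a fair die, computed exactly; compare against (E[X])^2.
pmf = {i: Fraction(1, 6) for i in range(1, 7)}
EX = sum(x * p for x, p in pmf.items())
EX2 = sum(x**2 * p for x, p in pmf.items())
print(EX2)  # -> 91/6
```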

Example: Let X be uniformly distributed in (0, 2π], and let Y = g(X) = a·cos(ωt + X) be the value of a sinusoid signal at time t (i.e., X is like a uniform random phase). Find the expected value E[Y].
E[Y] = E[a·cos(ωt + X)] = ∫_0^{2π} a·cos(ωt + x) · (1/2π) dx = (a/2π) · [sin(ωt + x)]_0^{2π} = 0.
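A numerical sanity check of the zero mean (a sketch; the values a = 2 and ωt = 0.7 are arbitrary choices of mine, not from the notes):

```python
import math

# Average a*cos(omega*t + x) over a full period of x with the midpoint rule;
# the average of a cosine over a full period is 0, so E[Y] should be ~0.
a, phase = 2.0, 0.7
n = 4096
dx = 2 * math.pi / n
EY = sum(a * math.cos(phase + (i + 0.5) * dx) * (1 / (2 * math.pi)) * dx
         for i in range(n))
```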

Properties of E
1. E[aX + b] = a·E[X] + b.
2. E[g(X) + h(X)] = E[g(X)] + E[h(X)].
Proof.
1. W.l.o.g. X continuous with pdf f (discrete case similar):
E[aX + b] = ∫_Ω (ax + b) f(x) dx = a ∫_Ω x f(x) dx + b ∫_Ω f(x) dx = a·E[X] + b · 1 = a·E[X] + b.
2. E[g(X) + h(X)] = ∫_Ω (g(x) + h(x)) f(x) dx = ∫_Ω g(x) f(x) dx + ∫_Ω h(x) f(x) dx = E[g(X)] + E[h(X)].
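Both properties can be verified exactly on a discrete example (a sketch on a fair die; the choices a = 3, b = 2 and the functions g(x) = x², h(x) = x are arbitrary):

```python
from fractions import Fraction

# Exact expectation operator for a fair die.
pmf = {i: Fraction(1, 6) for i in range(1, 7)}

def E(g):
    return sum(g(x) * p for x, p in pmf.items())

a, b = 3, 2
lhs1 = E(lambda x: a * x + b)            # E[aX + b]
rhs1 = a * E(lambda x: x) + b            # a E[X] + b
lhs2 = E(lambda x: x**2 + x)             # E[g(X) + h(X)]
rhs2 = E(lambda x: x**2) + E(lambda x: x)
```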

Variance of a R.V. Definition (Variance) The variance of a random variable X, denoted by Var(X), is defined by
Var(X) = σ²_X = E[(X − E[X])²].
It may be regarded as a measure of the consistency of outcomes: a smaller value of Var(X) implies that X is more often near E[X]. The square root of the variance is called the standard deviation.


Properties of variance
1. Var(X) = E[X²] − (E[X])².
2. Var(aX + b) = a²·Var(X).
Proof.
1. Var(X) = E[(X − μ_X)²] = E[X² − 2μ_X·X + μ_X²] = E[X²] − 2μ_X·E[X] + μ_X² = E[X²] − μ_X².
2. Var(aX + b) = E[(aX + b − (aμ_X + b))²] = E[a²(X − μ_X)²] = a²·Var(X).
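Both variance properties can be checked exactly on the fair die (an illustrative sketch; a = 4, b = −3 are arbitrary):

```python
from fractions import Fraction

# Exact expectation operator for a fair die.
pmf = {i: Fraction(1, 6) for i in range(1, 7)}

def E(g):
    return sum(g(x) * p for x, p in pmf.items())

mu = E(lambda x: x)                          # 7/2
var_def = E(lambda x: (x - mu) ** 2)         # definition of Var(X)
var_alt = E(lambda x: x**2) - mu**2          # property 1
a, b = 4, -3
var_ab = E(lambda x: (a * x + b - (a * mu + b)) ** 2)  # Var(aX + b)
```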

Moments of a random variable
E[X] = E[X¹] (the first moment),
E[X²] = Var(X) + (E[X¹])²,
E[X^r] = r-th moment of X:
If X is discrete: E[X^r] = Σ_{i=1}^{n} x_i^r p_i.
If X is continuous: E[X^r] = ∫_Ω x^r f(x) dx.
Remark: However, note that the expectation, or any moment, of a random variable need not always exist (it could be ±∞).

Important Distributions:
For a discrete R.V. X (discrete distributions):
1. Bernoulli Distribution
2. Binomial Distribution
3. Geometric Distribution
For a continuous R.V. X (continuous distributions):
1. Uniform Distribution
2. Exponential Distribution
3. Normal Distribution

Discrete Distribution I Bernoulli Distribution

Discrete Distribution I - Bernoulli A R.V. X has a Bernoulli distribution with success probability p, denoted by X ~ Ber(p), if X ∈ {0, 1} and
P(X = 1) = p, P(X = 0) = 1 − P(X = 1) = 1 − p.
It models, for example: a single coin toss experiment, the success or failure of message passing, the success of a certain drug, or randomly selecting a person from a large population and asking whether she votes for a certain political party.

Bernoulli Distribution - Properties
The expected value is E[X] = 1 · p + 0 · (1 − p) = p.
The variance is Var(X) = E[X²] − (E[X])² = 1² · p + 0² · (1 − p) − p² = p(1 − p).
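A direct check of these two formulas (a sketch; p = 3/10 is an arbitrary choice):

```python
from fractions import Fraction

# Bernoulli pmf: P(X = 1) = p, P(X = 0) = 1 - p.
p = Fraction(3, 10)
pmf = {1: p, 0: 1 - p}
EX = sum(x * q for x, q in pmf.items())
VarX = sum(x**2 * q for x, q in pmf.items()) - EX**2
```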

Discrete Distribution II Binomial Distribution

Discrete Distribution II - Binomial The binomial distribution arises from a sequence of n independent Bernoulli trials; we write X ~ Bin(n, p). Here Ω = {0, 1}^n, X = number of successes in the n trials, and p = probability of success (for each Bernoulli trial). Pmf of X:
P(X = x) = (n choose x) p^x (1 − p)^{n−x}, x = 0, 1, 2, ..., n.

Binomial Distribution - Properties
E[X] = Σ_{x=0}^{n} x · (n choose x) p^x (1 − p)^{n−x} = np.
Alternatively: X = X_1 + X_2 + · · · + X_n, where the X_i ~ Ber(p) are independent, so
E[X] = E[X_1 + X_2 + · · · + X_n] = Σ_{i=1}^{n} E[X_i] = np,
Var(X) = Var(X_1 + X_2 + · · · + X_n) = Σ_{i=1}^{n} Var(X_i) = np(1 − p) (using independence for the variance).
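Summing the pmf directly reproduces both formulas exactly (a sketch; n = 10, p = 3/10 are arbitrary choices):

```python
from fractions import Fraction
from math import comb

# Binomial pmf P(X = x) = C(n, x) p^x (1-p)^(n-x), computed with exact fractions.
n, p = 10, Fraction(3, 10)
pmf = {x: comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)}
EX = sum(x * q for x, q in pmf.items())
VarX = sum(x**2 * q for x, q in pmf.items()) - EX**2
```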

Example (Binomial Distribution): In a country, 51% favour party A and 49% favour party B. What is the probability that, out of 200 randomly selected individuals, more people vote for B than for A? Let a vote for A be a success. Selecting the 200 people is equivalent to performing a sequence of 200 independent Bernoulli trials with success probability 0.51. We are looking for the probability of fewer than 100 successes, which is
Σ_{i=0}^{99} (200 choose i) 0.51^i 0.49^{200−i} ≈ 0.36.
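The sum in the example can be evaluated directly (a sketch reproducing the ≈ 0.36 figure):

```python
from math import comb

# P(fewer than 100 of 200 sampled people vote for A), X ~ Bin(200, 0.51).
p = 0.51
prob_B_wins = sum(comb(200, i) * p**i * (1 - p)**(200 - i) for i in range(100))
```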

Binomial Distribution

Discrete Distribution III Geometric Distribution

Discrete Distribution III - Geometric Geometric Distribution: X is the number of (independent) Bernoulli trials until the first success, denoted by X ~ G(p). The pmf is:
P(X = x) = (1 − p)^{x−1} p, x = 1, 2, 3, ...
E[X] = 1/p, Var(X) = (1 − p)/p².
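The mean formula E[X] = 1/p can be checked against a truncated pmf sum (an illustration; p = 1/4 and the cutoff k = 2000 are arbitrary, and the neglected tail is astronomically small):

```python
# Truncated series for E[X] = sum over k >= 1 of k * (1-p)^(k-1) * p.
p = 0.25
EX_approx = sum(k * (1 - p) ** (k - 1) * p for k in range(1, 2001))
```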

Example (Geometric Distribution): Suppose you are looking for a job but the market is not very good, so the probability of being accepted after a job interview is only p = 0.05. Let X be the number of interviews until you eventually find a job.
What is the probability that you need k > 50 interviews?
P(X = k) = (1 − p)^{k−1} p = (1 − p)^{50} (1 − p)^{k−51} p < (1 − p)^{50} = 0.95^{50} ≈ 0.077.
What is the probability that you need at least 15 but fewer than 30 interviews?
P(15 ≤ X < 30) = Σ_{i=15}^{29} 0.05 · 0.95^{i−1} = 0.05 · Σ_{i=14}^{28} 0.95^i = 0.05 · [(1 − 0.95^{29})/(1 − 0.95) − (1 − 0.95^{14})/(1 − 0.95)] = 0.05 · [15.4813 − 10.2465] ≈ 0.2617.
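Both numbers in the example can be recomputed directly from the geometric pmf (a sketch):

```python
# Job-interview example with X ~ G(p), p = 0.05.
p = 0.05
bound = (1 - p) ** 50  # the (1-p)^50 bound used for k > 50
# P(15 <= X < 30), summed term by term from P(X = k) = (1-p)^(k-1) * p.
p_15_to_29 = sum((1 - p) ** (k - 1) * p for k in range(15, 30))
```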