ACM 116: Lecture 2

1 ACM 116: Lecture 2

Agenda
- Independence
- Bayes rule
- Discrete random variables: the Bernoulli distribution, the binomial distribution
- Continuous random variables: the normal distribution
- Expected value of a random variable

2 Bayes rule

Let $B_1, \dots, B_n$ be mutually exclusive with $\bigcup_{i=1}^n B_i = \Omega$. Then for any event $A$,

$$P(B_j \mid A) = \frac{P(A \mid B_j)\,P(B_j)}{\sum_{i=1}^n P(A \mid B_i)\,P(B_i)}$$

because $P(A \mid B_j)P(B_j) = P(A \cap B_j)$ and $\sum_{i=1}^n P(A \mid B_i)P(B_i) = P(A)$.

3 Example: Polygraph tests (lie-detector tests)

Event $+/-$: the polygraph reading is positive/negative
Event $T$: the subject is telling the truth
Event $L$: the subject is lying

Studies of polygraph reliability give

$$P(+ \mid L) = .88, \quad P(- \mid L) = .12, \quad P(- \mid T) = .86, \quad P(+ \mid T) = .14$$

On a particular question, a vast majority of the subjects have no reason to lie:

$$P(T) = .99, \quad P(L) = .01$$

4 Example: Polygraph tests

Probability that a person is telling the truth when the polygraph is $+$:

$$P(T \mid +) = \frac{P(+ \mid T)\,P(T)}{P(+ \mid T)\,P(T) + P(+ \mid L)\,P(L)} = \frac{.14 \times .99}{.14 \times .99 + .88 \times .01} \approx .94$$

Conclusion: screening this population of largely innocent people, about 94% of positive readings will be in error. This example illustrates the dangers of using screening procedures on large populations.
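
As a quick check, the same posterior can be computed in a few lines of Python (a minimal sketch; the probabilities are the ones quoted above):

```python
# Bayes rule applied to the polygraph example (numbers from the slide).
p_pos_given_T = 0.14   # P(+ | T): positive reading on a truthful subject
p_pos_given_L = 0.88   # P(+ | L): positive reading on a lying subject
p_T, p_L = 0.99, 0.01  # prior: almost everyone is telling the truth

# P(T | +) = P(+ | T) P(T) / [P(+ | T) P(T) + P(+ | L) P(L)]
posterior = p_pos_given_T * p_T / (p_pos_given_T * p_T + p_pos_given_L * p_L)
print(posterior)  # ~0.94: most positive readings come from truthful subjects
```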

5 Independence

Two events $A$ and $B$ are independent if the chance of getting $B$ is the same whether or not $A$ occurred:

$$P(B \mid A) = P(B \mid A^c) = P(B), \quad \text{i.e.} \quad P(A \cap B) = P(A)P(B)$$

Definition: two events are independent if $P(A \cap B) = P(A)P(B)$.

6 Independence

Example: sex distribution for 3 children. The outcomes $\{b,b,b\}, \{b,b,g\}, \dots, \{g,g,g\}$ each have probability $1/8$.

Event $A$: there is at most one girl
Event $B$: the family has children of both sexes

$$P(A) = 4/8, \quad P(B) = 6/8, \quad P(A \cap B) = 3/8 = P(A)P(B)$$

because $A \cap B = \{\{b,b,g\}, \{b,g,b\}, \{g,b,b\}\}$. Hence $A$ and $B$ are independent.
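
Independence here can also be verified by brute-force enumeration of the eight equally likely outcomes; a small Python sketch whose event definitions mirror the slide:

```python
from itertools import product
from fractions import Fraction

outcomes = list(product("bg", repeat=3))            # 8 equally likely families
p = Fraction(1, 8)

A = [w for w in outcomes if w.count("g") <= 1]      # at most one girl
B = [w for w in outcomes if "b" in w and "g" in w]  # children of both sexes
AB = [w for w in outcomes if w in A and w in B]

pA, pB, pAB = p * len(A), p * len(B), p * len(AB)
print(pA, pB, pAB, pA * pB)  # 1/2 3/4 3/8 3/8  ->  P(A ∩ B) = P(A) P(B)
```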

7 Independence: generalization

The events $A_1, A_2, \dots, A_n$ are said to be mutually independent if

$$P(A_{i_1} \cap \dots \cap A_{i_m}) = P(A_{i_1}) \cdots P(A_{i_m})$$

for any subcollection $A_{i_1}, \dots, A_{i_m}$.

Remark: pairwise independence is not enough!

Example: two coin tosses.
$A$: first toss is H
$B$: second toss is H
$C$: exactly one H

$A$, $B$, $C$ are pairwise independent, but it is clear that they are not mutually independent (see the sketch below).
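
A short Python sketch checking the claim by enumerating the four equally likely outcomes: the pairwise products match, the triple product does not.

```python
from itertools import product

outcomes = list(product("HT", repeat=2))        # HH, HT, TH, TT, each with prob 1/4

def prob(event):
    """Probability of an event over the uniform sample space of two tosses."""
    return sum(1 for w in outcomes if event(w)) / len(outcomes)

A = lambda w: w[0] == "H"                       # first toss is heads
B = lambda w: w[1] == "H"                       # second toss is heads
C = lambda w: (w[0] == "H") != (w[1] == "H")    # exactly one head

print(prob(lambda w: A(w) and B(w)), prob(A) * prob(B))    # 0.25  0.25
print(prob(lambda w: A(w) and C(w)), prob(A) * prob(C))    # 0.25  0.25
print(prob(lambda w: B(w) and C(w)), prob(B) * prob(C))    # 0.25  0.25
print(prob(lambda w: A(w) and B(w) and C(w)),              # 0.0
      prob(A) * prob(B) * prob(C))                         # 0.125
```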

8 Independence: generalization

Example: a system made of several components.

1. In series: the system fails if any component fails.
   $$P(\text{works}) = (1-p)^n$$
   e.g. $p = .05$, $n = 10$: $P(\text{works}) \approx .6$

2. In parallel: the system fails only if all components fail.
   $$P(\text{fails}) = p^n$$
   e.g. $P(\text{fails}) = (.05)^{10} \approx 10^{-13}$
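
Both reliabilities are direct to evaluate; a Python sketch with the numbers above (assuming $p = .05$, $n = 10$):

```python
p, n = 0.05, 10                 # per-component failure probability, number of components

p_series_works = (1 - p) ** n   # series: works only if every component works
p_parallel_fails = p ** n       # parallel: fails only if every component fails

print(p_series_works)           # ~0.599
print(p_parallel_fails)         # ~9.8e-14, i.e. about 1e-13
```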

9 Random variables

Sometimes you are not interested in all the details of the outcome of an experiment, but only in the numerical value of some quantity determined by that outcome.

Example: survey, random sample of size $N = 100$.
R.v. = proportion of people who favor candidate A. We are not interested in individual responses, but in the number of people supporting candidate A.

Example: physics.
R.v. = number of particles hitting a detector. We are not interested in individual particles.

Definition: a r.v. is a function from the sample space to the real numbers.

10 Discrete random variables

A discrete r.v. can take on only a discrete set of values (finite or countable).

Example: an experiment that consists of sampling people from a large population until you find someone with a given disease; $X$ is the number of people you sampled.

Probability distribution: $X$ can take on values $x_1, x_2, \dots$ with

$$p(x_i) = P(X = x_i), \qquad \sum_i p(x_i) = 1$$

Cumulative distribution function (cdf):

$$F(x) = P(X \le x), \qquad -\infty < x < \infty$$

11 Examples

- The Bernoulli distribution
- The binomial distribution
- The Poisson distribution (later)

12 The Bernoulli distribution

A Bernoulli r.v. takes on only two values, 0 or 1:

$$X = \begin{cases} 0 & \text{w.p. } 1-p \\ 1 & \text{w.p. } p \end{cases}$$

E.g. we are interested in whether some event $A$ occurs or not:

$$I_A = \begin{cases} 0 & \text{if } A \text{ does not occur} \\ 1 & \text{if } A \text{ occurs} \end{cases}$$

so that $P(I_A = 1) = P(A)$.

13 The binomial distribution

- $n$ independent experiments, or trials, are performed; $n$ is fixed.
- Each experiment results in a success with probability $p$ and a failure with probability $1-p$.
- The total number of successes, $X$, is then a binomial r.v. with parameters $n$ and $p$.

$X$ can take on the values $0, 1, \dots, n$ and

$$P(X = k) = \binom{n}{k} p^k (1-p)^{n-k}$$

Why?

14 The binomial distribution

Each configuration with $k$ successes is equally likely, with probability $p^k (1-p)^{n-k}$. How many such configurations are there? $\binom{n}{k}$. Hence

$$P(X = k) = \binom{n}{k} p^k (1-p)^{n-k}$$

Example: power supply problem.
- $N$ customers use electric power intermittently (say $N = 100$).
- Assume each customer has the same probability $p$ of requiring power at any given time $t$ (say $p = 1/3$).
- Suppose the supply is sized to $k$ power units. How large should we choose $k$ so that the probability of overload is .1%?

15 $P(X = k) = \binom{n}{k} p^k (1-p)^{n-k}$

Take the minimum $k$ such that $P(X > k) \le 10^{-4}$, which gives:

    k     P(X > k)
    51    .01%
    48    .1%
    45    1%
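
These tail probabilities come straight from the binomial pmf; a Python sketch for the power-supply numbers ($N = 100$, $p = 1/3$):

```python
from math import comb

def binom_tail(n, p, k):
    """P(X > k) for X ~ Binomial(n, p), summed directly from the pmf."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k + 1, n + 1))

n, p = 100, 1/3
for k in (45, 48, 51):
    print(k, binom_tail(n, p, k))   # overload probabilities; compare with the table above
```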

16 Example: Digital communications

Input signal → transmission channel → output.

The channel is noisy: each bit has probability $p$ of being incorrectly transmitted. To improve reliability, each bit is sent a repeated number of times $n$ (say $n$ odd) and recovered with a majority decoder.

$X = \#\text{errors} \sim \text{Binomial}(n, p)$

$$P(\text{bit correctly decoded}) = P(X < n/2)$$

E.g. $n = 5$, $p = .1$:

$$P(\text{bit correctly decoded}) = P(X \le 2) = .9914$$

An improvement in reliability.
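
The decoding probability is again a binomial tail; a Python sketch reproducing the $n = 5$, $p = .1$ figure:

```python
from math import comb

def p_correct(n, p):
    """Probability the majority decoder recovers the bit: P(X <= (n-1)/2), X ~ Bin(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range((n - 1) // 2 + 1))

print(p_correct(1, 0.1))   # 0.9      (no repetition)
print(p_correct(5, 0.1))   # 0.99144, the .9914 from the slide
```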

17 Continuous random variables

A r.v. $X$ is a mapping

$$X : \Omega \to \mathbb{R}, \qquad \omega \mapsto X(\omega)$$

Cumulative distribution function (CDF): $F(x) = P(X \le x)$.

$X$ is said to be a continuous random variable if there exists $f$ such that

$$P(X \in A) = \int_A f(x)\,dx$$

for any reasonable set $A$; $f$ is called the density function.

18 Continuous random variables

Interpretation: [figure: density curve; the shaded area over $[a, b]$ is $P(a \le X \le b)$, and the thin strip over $[x, x+dx]$ has area $\approx f(x)\,dx$]

19 The normal distribution

- Plays a central role in probability and statistics.
- Introduced by Gauss to model measurement errors.
- A sum of independent r.v.'s is approximately normal (CLT).
- Two parameters: $\mu$ (mean) and $\sigma$ (standard deviation).

$$f(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{1}{2}\frac{(x-\mu)^2}{\sigma^2}}$$

Notation: $X \sim N(\mu, \sigma^2)$, i.e. $X$ follows a normal distribution with parameters $\mu$ and $\sigma^2$.

20 The normal distribution

This is the bell curve. [figure: normal density centered at $\mu$ with spread $\sigma$]

- Symmetric around $\mu$
- Bell-shaped
- Spread is about $\sigma$
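
"Spread is about $\sigma$" can be made quantitative by simulation; a Python sketch (assuming numpy) estimating how much mass lies within one and two standard deviations of the mean:

```python
import numpy as np

mu, sigma = 0.0, 1.0                        # illustrative values
rng = np.random.default_rng(0)
x = rng.normal(mu, sigma, size=1_000_000)

# Fraction of samples within one / two standard deviations of the mean.
print(np.mean(np.abs(x - mu) <= sigma))      # ~0.683
print(np.mean(np.abs(x - mu) <= 2 * sigma))  # ~0.954
```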

21 The uniform distribution

On an interval $[a, b]$. Notation: $X \sim U[a, b]$.

$$f(x) = \begin{cases} \dfrac{1}{b-a} & \text{if } x \in [a, b] \\ 0 & \text{otherwise} \end{cases}$$

[figure: flat density of height $1/(b-a)$ between $a$ and $b$]

22 Expected value of a random variable

Definition: the expected value of a r.v. $X$ is denoted by $E(X)$ and defined by

$$E(X) = \sum_i x_i\, P(X = x_i) = \sum_i x_i\, p_i$$

provided that the sum is absolutely convergent, i.e. $\sum_i |x_i|\, p(x_i) < \infty$.

Other names: expectation, mean.

Interpretation: $E(X)$ is the point at which the histogram balances out.
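
As a concrete instance of the definition (an illustration, not from the slides): the expected value of a fair six-sided die.

```python
from fractions import Fraction

# pmf of a fair die: X takes the values 1..6, each with probability 1/6
values = range(1, 7)
p = Fraction(1, 6)

expectation = sum(x * p for x in values)
print(expectation)   # 7/2, the point where the histogram balances
```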

23 Expected value

Example: $X \sim \text{Binomial}(n, p)$. Then $E(X) = np$, i.e. $np$ is the mean of the r.v. $X$.

For a continuous r.v., the sum is replaced by an integral.

Definition: if $X$ is a continuous r.v. with density $f(x)$, then

$$E(X) = \int x f(x)\, dx$$

Example: $X \sim N(\mu, \sigma^2)$ has $E(X) = \mu$.
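
A Python sketch (assuming numpy) checking $E(X) = np$ both from the pmf and by simulation; $n = 20$, $p = 0.3$ are arbitrary illustrative values:

```python
from math import comb
import numpy as np

n, p = 20, 0.3

# Exact: sum of k * P(X = k) over the binomial pmf.
mean_exact = sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))

# Monte Carlo: average of many simulated binomial draws.
rng = np.random.default_rng(1)
mean_mc = rng.binomial(n, p, size=200_000).mean()

print(mean_exact, mean_mc, n * p)   # all three ~6.0
```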

24 Expectations of functions of r.v.'s

Let $X$ be a random variable. Sometimes we are not interested in $E(X)$ but in $E(g(X))$.

Example: kinetic theory of gases. The modulus of the velocity (the speed) $V$ has density

$$f_V(v) = \frac{\sqrt{2/\pi}}{\sigma^3}\, v^2 e^{-v^2/(2\sigma^2)}, \qquad v \ge 0$$

and we are interested in the kinetic energy $Y = \frac{1}{2} m V^2$. What is $E(Y)$?

Theorem: let $Y = g(X)$.
- If $X$ is discrete with probability mass function $p(x)$, then
  $$E(Y) = \sum_x g(x)\, p(x)$$
  provided the sum is absolutely convergent.
- If $X$ has density $f(x)$, then
  $$E(Y) = \int g(x)\, f(x)\, dx$$

25 Expectations of functions of r.v.'s

Proof (discrete case):

$$E(Y) = \sum_i y_i\, P(Y = y_i)$$

Let $A_i$ be the set of $x$'s such that $g(x) = y_i$. Then

$$P(Y = y_i) = p_Y(y_i) = \sum_{x \in A_i} p(x)$$

so

$$E(Y) = \sum_i y_i\, p_Y(y_i) = \sum_i \sum_{x \in A_i} g(x)\, p(x) = \sum_x g(x)\, p(x)$$

26 Expectations of functions of r.v.'s

Example: kinetic theory of gases.

$$E(Y) = \int_0^\infty \tfrac{1}{2} m v^2\, f_V(v)\, dv = \frac{m}{2}\,\frac{\sqrt{2/\pi}}{\sigma^3} \int_0^\infty v^4 e^{-v^2/(2\sigma^2)}\, dv$$

$$= \frac{m}{2}\,\sqrt{2/\pi}\;\sigma^2 \int_0^\infty u^4 e^{-u^2/2}\, du = \frac{3}{2}\, m \sigma^2$$
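
A numerical check of this integral (a sketch assuming scipy is available; the values of $m$ and $\sigma$ are arbitrary):

```python
import numpy as np
from scipy.integrate import quad

m, sigma = 2.0, 1.5   # arbitrary values for the check

def f_V(v):
    """Speed density from the slide, for v >= 0."""
    return np.sqrt(2 / np.pi) / sigma**3 * v**2 * np.exp(-v**2 / (2 * sigma**2))

# E(Y) = integral of (1/2) m v^2 f_V(v) dv over [0, infinity)
expected_energy, _ = quad(lambda v: 0.5 * m * v**2 * f_V(v), 0, np.inf)
print(expected_energy, 1.5 * m * sigma**2)   # both ~6.75, i.e. (3/2) m sigma^2
```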

27 Linearity of the expectation

Let $X_1, \dots, X_n$ be $n$ r.v.'s with expectations $E(X_i)$, and let $Y = b_1 X_1 + \dots + b_n X_n$. Then

$$E(Y) = b_1 E(X_1) + \dots + b_n E(X_n)$$

In particular, the expected value of a sum of r.v.'s is equal to the sum of their expected values.

Example: hat problem. $n$ men attend a dinner, leave their hats at the door, and take a random hat when they leave. Let $N$ be the number of men who pick their own hat. What is $E(N)$?

28 Linearity of the expectation

Write $N = I_1 + \dots + I_n$ with

$$I_i = \begin{cases} 1 & \text{if person } i \text{ picks his own hat} \\ 0 & \text{otherwise} \end{cases}$$

Then

$$E(N) = E(I_1 + \dots + I_n) = E(I_1) + \dots + E(I_n), \qquad E(I_i) = P(I_i = 1) = \frac{1}{n}$$

so $E(N) = 1$.
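
A Monte Carlo sketch of the hat problem (assuming numpy): the average number of matches stays near 1 regardless of $n$.

```python
import numpy as np

def average_matches(n, trials=50_000, seed=0):
    """Simulate the hat problem: N = number of people who get their own hat back."""
    rng = np.random.default_rng(seed)
    matches = [np.sum(rng.permutation(n) == np.arange(n)) for _ in range(trials)]
    return np.mean(matches)

for n in (3, 10, 100):
    print(n, average_matches(n))   # ~1.0 in every case
```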