What is a random variable


OKAN UNIVERSITY FACULTY OF ENGINEERING AND ARCHITECTURE
MATH 256 Probability and Random Processes
04 Random Variables, Fall 2011
Yrd. Doç. Dr. Didem Kivanc Tureli, didemk@ieee.org, didem.kivanc@okan.edu.tr
4/10/2011, Lecture 3

What is a random variable?

Random Variables. A random variable X is not a variable in the algebraic sense. A random variable X is a function: from the set of outcomes of a random event (the sample space S of an experiment) to the set of real numbers. Realizations of a random variable are called random variates. For example, for the set of outcomes of a coin toss, a random variable X(ξ) in R might map the outcome heads to 1 and tails to -1.

Example. Experiment: throw 3 coins. Sample space: S = {(H,H,H), (H,H,T), (H,T,H), (T,H,H), (H,T,T), (T,H,T), (T,T,H), (T,T,T)}. Y is a random variable giving the number of heads that landed:
P{Y = 0} = 1/8, P{Y = 1} = 3/8, P{Y = 2} = 3/8, P{Y = 3} = 1/8.

Three balls are to be randomly selected without replacement from an urn containing 20 balls numbered 1 through 20. If we bet that at least one of the balls drawn has a number as large as or larger than 17, what is the probability that we win the bet? Let X be the largest of the three numbers drawn. Then
P{X = i} = C(i-1, 2) / C(20, 3), for i = 3, 4, ..., 20,
since X = i requires drawing ball i together with 2 of the i-1 balls numbered below it, and
P{X ≥ 17} = P{X = 17} + P{X = 18} + P{X = 19} + P{X = 20} ≈ 0.508.
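The urn bet can be checked by brute force. The following Python sketch (an editorial addition, not part of the original slides) enumerates all C(20,3) = 1140 equally likely draws and compares the count against the formula for P{X = i}:

```python
from itertools import combinations
from math import comb

# All C(20, 3) = 1140 equally likely draws of three balls from balls 1..20.
draws = list(combinations(range(1, 21), 3))

# Winning event: the largest ball drawn is 17 or more.
wins = [d for d in draws if max(d) >= 17]
p_win = len(wins) / len(draws)  # about 0.508

# Cross-check against the formula P{X = i} = C(i-1, 2) / C(20, 3).
p_formula = sum(comb(i - 1, 2) for i in range(17, 21)) / comb(20, 3)
assert abs(p_win - p_formula) < 1e-12
```

Both routes give 580/1140, which is where the slide's 0.508 comes from.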

Independent trials, each consisting of the flip of a coin having probability p of coming up heads, are performed until either a head occurs or a total of n flips is made. If we let X denote the number of times the coin is flipped, then X is a random variable taking on one of the values 1, 2, 3, ..., n with respective probabilities:
P{X = 1} = p
P{X = 2} = (1 - p) p
P{X = 3} = (1 - p)^2 p
...
P{X = n - 1} = (1 - p)^(n-2) p
P{X = n} = (1 - p)^(n-1)

Three balls are randomly chosen from an urn containing 3 white, 3 red, and 5 black balls. Suppose that we win $1 for each white ball selected and lose $1 for each red ball selected. If we let X denote our total winnings from the experiment, then X is a random variable taking on the possible values -3, -2, -1, 0, 1, 2, 3 with respective probabilities computed as follows. Suppose every ball has a number; then the balls are W1, W2, W3, R1, R2, R3, B1, B2, B3, B4, B5, or for convenience number them from 1 to 11. So there are C(11, 3) = 165 ways to choose three balls from this set.
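A quick sanity check on the truncated-geometric pmf above: the probabilities must sum to 1, because the n-1 tails that lead to flip n are counted once whether or not flip n is a head. A short Python sketch (an addition for illustration, with p and n chosen arbitrarily):

```python
# pmf of X = number of flips, stopping at the first head or at flip n.
def flip_pmf(p, n):
    probs = {k: (1 - p) ** (k - 1) * p for k in range(1, n)}
    probs[n] = (1 - p) ** (n - 1)  # flip n happens after n-1 tails, head or not
    return probs

probs = flip_pmf(0.3, 5)
assert abs(sum(probs.values()) - 1.0) < 1e-12
```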

The list of possible values for X is {-3, -2, -1, 0, 1, 2, 3}.
To get -3, we must choose RRR.
To get -2, we must choose 2 R and one B.
To get -1, we must choose 2 R and one W, or one R and two B.
To get 0, we must choose one R, one W and one B, or BBB.
To get +1, we must choose 2 W and one R, or one W and two B.
To get +2, we must choose 2 W and one B.
To get +3, we must choose WWW.
So:
P{X = -3} = P{X = 3} = C(3,3) / C(11,3) = 1/165
P{X = -2} = P{X = 2} = C(3,2) C(5,1) / C(11,3) = 15/165
P{X = -1} = P{X = 1} = [C(3,2) C(3,1) + C(3,1) C(5,2)] / C(11,3) = 39/165
P{X = 0} = [C(3,1) C(3,1) C(5,1) + C(5,3)] / C(11,3) = 55/165
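This case analysis is easy to verify by enumerating all 165 draws directly. A Python sketch (an editorial addition) that tags the 11 balls by colour and tallies the winnings:

```python
from itertools import combinations
from fractions import Fraction

balls = ['W'] * 3 + ['R'] * 3 + ['B'] * 5  # the 11 numbered balls

# Winnings: +1 per white ball, -1 per red ball, 0 per black ball.
counts = {}
for draw in combinations(range(11), 3):
    x = sum(1 if balls[i] == 'W' else -1 if balls[i] == 'R' else 0 for i in draw)
    counts[x] = counts.get(x, 0) + 1

pmf = {x: Fraction(c, 165) for x, c in counts.items()}
assert pmf[-3] == pmf[3] == Fraction(1, 165)
assert pmf[-2] == pmf[2] == Fraction(15, 165)
assert pmf[-1] == pmf[1] == Fraction(39, 165)
assert pmf[0] == Fraction(55, 165)
```

The symmetry pmf[-k] == pmf[k] reflects the symmetric roles of the 3 white and 3 red balls.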

The cumulative distribution function. For a random variable X, the function F defined by
F(x) = P{X ≤ x}, -∞ < x < ∞
is called the cumulative distribution function, or simply the distribution function, of X. Thus the distribution function specifies, for every real value x, the probability that the random variable is less than or equal to x. F(x) is a nondecreasing function of x, that is, if a < b then F(a) ≤ F(b).

For the previous example:
P{X = -3} = P{X = 3} = 1/165
P{X = -2} = P{X = 2} = 15/165
P{X = -1} = P{X = 1} = 39/165
P{X = 0} = 55/165

For the previous example:
F(-3) = 1/165
F(-2) = 1/165 + 15/165 = 16/165
F(-1) = 1/165 + 15/165 + 39/165 = 55/165
F(0) = 1/165 + 15/165 + 39/165 + 55/165 = 110/165
F(+1) = 1/165 + 15/165 + 39/165 + 55/165 + 39/165 = 149/165
F(+2) = 1/165 + 15/165 + 39/165 + 55/165 + 39/165 + 15/165 = 164/165
F(+3) = 1/165 + 15/165 + 39/165 + 55/165 + 39/165 + 15/165 + 1/165 = 165/165 = 1

Probability Mass Function. The probability mass function p(a) is defined for a discrete random variable X by
p(a) = P{X = a}.
It satisfies p(x_i) ≥ 0 for i = 1, 2, ... and p(x) = 0 for all other values of x. Then, since X must take one of the values x_i,
Σ_{i=1}^∞ p(x_i) = 1.

Example of a probability mass function:
p(0) = P{X = 0} = 1/4
p(1) = P{X = 1} = 1/2
p(2) = P{X = 2} = 1/4

Example. The probability mass function of a random variable X is given by p(i) = c λ^i / i!, i = 0, 1, 2, ..., where λ is some positive value. Find (a) P{X = 0} and (b) P{X > 2}.
Since Σ_{i=0}^∞ p(i) = 1, we have c Σ_{i=0}^∞ λ^i / i! = c e^λ = 1, so c = e^{-λ}, using the identity e^x = Σ_{i=0}^∞ x^i / i!.
Hence (a) P{X = 0} = e^{-λ}, and (b) P{X > 2} = 1 - P{X = 0} - P{X = 1} - P{X = 2} = 1 - e^{-λ}(1 + λ + λ^2/2).

The cumulative distribution function. The cumulative distribution function F can be expressed in terms of p(a) by
F(a) = Σ_{all x ≤ a} p(x).
If X is a discrete random variable whose possible values are x_1, x_2, x_3, ..., where x_1 < x_2 < x_3 < ..., then the distribution function F of X is a step function.

Example. For example, suppose the probability mass function (pmf) of X is
p(1) = 1/4, p(2) = 1/2, p(3) = 1/8, p(4) = 1/8.
Then the distribution function F of X is
F(a) = 0 for a < 1
F(a) = 1/4 for 1 ≤ a < 2
F(a) = 3/4 for 2 ≤ a < 3
F(a) = 7/8 for 3 ≤ a < 4
F(a) = 1 for a ≥ 4
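The step function above is straightforward to implement. A Python sketch (an editorial addition) for the example pmf, using bisection to find how many support points lie at or below a:

```python
from bisect import bisect_right

# pmf from the example: support values in increasing order, with probabilities.
xs = [1, 2, 3, 4]
ps = [1 / 4, 1 / 2, 1 / 8, 1 / 8]

def cdf(a):
    """F(a) = sum of p(x) over all x <= a; a right-continuous step function."""
    k = bisect_right(xs, a)  # number of support points <= a
    return sum(ps[:k])

assert cdf(0.5) == 0
assert cdf(1) == 1 / 4     # the jump at x = 1 is included: F is right-continuous
assert cdf(2.7) == 3 / 4
assert cdf(10) == 1.0
```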

Expectation of a random variable. If X is a discrete random variable having a probability mass function p(x), then the expectation, or expected value, of X, denoted by E[X], is defined by
E[X] = Σ_{x: p(x) > 0} x p(x).
In other words: take every possible value of X, multiply it by the probability of getting that value, and add the results.

Examples of expectation. For example, suppose you have a fair coin. You flip the coin, and define a random variable X such that X = 1 if the coin lands heads and X = 2 if the coin lands tails. Then the probability mass function of X is given by
p(1) = p(2) = 1/2,
or we can write p(x) = 1/2 if x = 1 or x = 2, and 0 otherwise.
E[X] = 1 · 1/2 + 2 · 1/2 = 1.5
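The "multiply each value by its probability and add" recipe is one line of code. A Python sketch (an addition) with the coin example from the slide:

```python
def expectation(pmf):
    """E[X] = sum over x of x * p(x), for a discrete pmf given as {x: p(x)}."""
    return sum(x * p for x, p in pmf.items())

# Fair-coin example: X = 1 on heads, X = 2 on tails.
coin = {1: 0.5, 2: 0.5}
assert expectation(coin) == 1.5
```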

Examples of expectation. Next, suppose you throw a fair die, and define a random variable Y such that Y = 0 if the die lands on a number less than or equal to 5, and Y = 1 if the die lands on a number greater than 5. Then the probability mass function of Y is given by
p(y) = Pr{Y = y} = 5/6 if y = 0, 1/6 if y = 1, 0 otherwise.
E[Y] = 0 · 5/6 + 1 · 1/6 = 1/6

Frequency interpretation of probabilities. The law of large numbers, which we will see in Chapter 8, says that if we have an experiment (e.g. tossing a coin) and we perform it an infinite number of times, then the proportion of time that any event E occurs will be P(E). [Recall that an event is a subset of the sample space, i.e. a set of outcomes of the experiment.] So, for instance, suppose X is a random variable which will be equal to x_1 with probability p(x_1), x_2 with probability p(x_2), ..., x_n with probability p(x_n). By the frequency interpretation, if we keep playing this game, then the proportion of time that we win x_i will be p(x_i).

Frequency interpretation of probabilities. Or we can say that when we play the game N times, where N is a very big number, we will win x_i about N p(x_i) times. Then the average winnings per game will be:
[x_1 (no. of times we won x_1) + x_2 (no. of times we won x_2) + ... + x_n (no. of times we won x_n)] / (no. of times we played)
= [x_1 N p(x_1) + x_2 N p(x_2) + ... + x_n N p(x_n)] / N
= Σ_{i=1}^n x_i p(x_i) = E[X]

Question (Example 3a): Find E[X] where X is the outcome when we roll a fair die.
Solution: Since p(1) = p(2) = p(3) = p(4) = p(5) = p(6) = 1/6,
E[X] = 1 p(1) + 2 p(2) + 3 p(3) + 4 p(4) + 5 p(5) + 6 p(6) = (1 + 2 + 3 + 4 + 5 + 6)/6 = 21/6 = 7/2 = 3.5

Example 3b. Question: We say that I is an indicator variable for an event A if I = 1 when A occurs and I = 0 when A^c occurs (i.e. A does not occur). What is E[I]?
E[I] = 1 · P(A) + 0 · P(A^c) = P(A)

Example 3d. A school class of 120 students is driven in 3 buses to a symphonic performance. There are 36 students in one of the buses, 40 in another, and 44 in the third bus. When the buses arrive, one of the 120 students is randomly chosen. Let X denote the number of students on the bus of that randomly chosen student, and find E[X].
Solution:
E[X] = 36 Pr{student is on 1st bus} + 40 Pr{student is on 2nd bus} + 44 Pr{student is on 3rd bus}
Pr{student is on 1st bus} = 36/120, Pr{student is on 2nd bus} = 40/120, Pr{student is on 3rd bus} = 44/120
E[X] = 36 · 36/120 + 40 · 40/120 + 44 · 44/120 ≈ 40.27

Example 3d (continued). Same problem as before, but assume that the bus is chosen randomly instead of the student, and find E[X].
Solution:
E[X] = 36 Pr{1st bus is chosen} + 40 Pr{2nd bus is chosen} + 44 Pr{3rd bus is chosen}
Pr{1st bus is chosen} = Pr{2nd bus is chosen} = Pr{3rd bus is chosen} = 1/3
E[X] = 36/3 + 40/3 + 44/3 = 40.00

Expectation of a function of a random variable. To find E[g(X)], that is, the expectation of g(X), use a two-step process: first find the pmf of g(X), then find E[g(X)].
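The two bus answers differ because picking a student weights each bus by its size (a size-biased sample), while picking a bus does not. A Python sketch of both computations (an editorial addition):

```python
sizes = [36, 40, 44]
n = sum(sizes)  # 120 students in total

# Choose a student uniformly: the bus of size s is selected with probability s/n.
e_student = sum(s * s / n for s in sizes)      # 4832/120, about 40.27

# Choose a bus uniformly: each bus is selected with probability 1/3.
e_bus = sum(s * (1 / 3) for s in sizes)        # 120/3 = 40
```

The size-biased expectation is always at least the plain average, with equality only when all buses have the same size.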

Let X denote a random variable that takes on the values -1, 0, and 1 with respective probabilities
P{X = -1} = 0.2, P{X = 0} = 0.5, P{X = 1} = 0.3.
Compute E[X^2].
Solution: Let Y = X^2. Then
P{Y = 1} = P{X = -1} + P{X = 1} = 0.5
P{Y = 0} = P{X = 0} = 0.5
so the probability mass function of Y is given by p(y) = 0.5 if y = 0 or y = 1, and 0 otherwise. Hence
E[X^2] = E[Y] = 1(0.5) + 0(0.5) = 0.5

Statistics vs. Probability. You may have noticed that the concept of expectation seems a lot like the concept of average. So why do we use this fancy new word "expectation"? Why not just call it "average"? We find the average of a list of numbers: the numbers are already known. We find the expectation of a random variable: we may have only one such random variable, and we may toss the coin or die only once.
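The two-step recipe for E[g(X)] can be written directly: push the probability of each x onto g(x), then take the expectation of the resulting pmf. A Python sketch for the E[X^2] example (an editorial addition):

```python
# pmf of X from the example.
pmf_x = {-1: 0.2, 0: 0.5, 1: 0.3}

# Step 1: pmf of Y = g(X) = X**2, accumulating probability over x with g(x) = y.
pmf_y = {}
for x, p in pmf_x.items():
    y = x ** 2
    pmf_y[y] = pmf_y.get(y, 0.0) + p

# Step 2: expectation of Y.
e_x2 = sum(y * p for y, p in pmf_y.items())
assert pmf_y == {1: 0.5, 0: 0.5}
assert abs(e_x2 - 0.5) < 1e-12
```

The accumulation step matters: both x = -1 and x = 1 map to y = 1, so their probabilities add.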

Statistics vs. Probability. For instance, let us define a random variable X using the result of a coin toss: let X = 1 if the coin lands heads, X = 0 if the coin lands tails. If we perform this experiment K times, we will get a list of values for X. We can find the average value by adding all the values of X and dividing by K:
(1/K) Σ_{i=1}^K X_i
Is this coin fair? We don't know, but we can find out:
p(0) = Pr{X = 0} = (number of times the coin lands tails) / K
p(1) = Pr{X = 1} = (number of times the coin lands heads) / K

Statistics vs. Probability. What we did on the previous slide was statistics: we analyzed the data to draw some conclusions about the process or mechanism (i.e. the coin) that generated that data. Probability is how we draw conclusions about the future. So suppose I did the experiments on the previous slide yesterday. Today I will come into class and toss the coin exactly once. Then I can use the statistics from yesterday to help find out what I can expect the result of the coin toss to be today:
E{X} = Σ_{i=0}^1 i p(i) = 0 · p(0) + 1 · p(1)

Statistics vs. Probability. Okay, so I got 0.5. What does this mean? X can never equal 0.5. Expectation makes more sense with continuous random variables, e.g. when you measure a voltage on a voltmeter. With the coin toss you can think of it this way: suppose someone wants you to guess X, but you will pay a lot of money if you're wrong, and the money you pay is proportional to how wrong you are. If you guess g, and the result was actually a, then you have to pay 100 (g - a)^2. What should you guess? You must minimize
(g - 1)^2 p(1) + (g - 0)^2 p(0).
If you guess g = E[X], then this penalty is minimized.

Statistics: how to find the pmf of a random voltage from measurements. Suppose you are going to measure a voltage. You know that the voltage is really about 5 V, but you have an old voltmeter that doesn't measure very well. The voltmeter is digital and has 1 decimal place, so you can only read voltages 0.0, 0.1, ..., 4.7, 4.8, 4.9, 5.0, 5.1, ..., 9.9. You start measuring the voltage. You get the following measurements: 4.7, 5.0, 4.9, 5.0, 5.3, 4.9, 4.8, 5.2, ... From these measurements you can construct a probability mass function graph as follows.
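The claim that g = E[X] minimizes the expected squared penalty can be confirmed numerically. A Python sketch (an editorial addition) that scans a grid of guesses for the coin-toss case:

```python
# Expected penalty for a guess g, when X = 1 w.p. p1 and X = 0 w.p. 1 - p1.
def penalty(g, p1):
    return 100 * ((g - 1) ** 2 * p1 + (g - 0) ** 2 * (1 - p1))

p1 = 0.5
e_x = 1 * p1  # E[X] = p1

# The guess g = E[X] beats every other guess on a fine grid over [0, 1].
best = min((g / 100 for g in range(101)), key=lambda g: penalty(g, p1))
assert abs(best - e_x) < 1e-9
```

The same holds for any p1 (try 0.3): the minimizer of the expected squared loss is always the mean, which is the standard justification for using E[X] as a point prediction.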

Pmf drawn from the results of the experiment. Measurements: 4.7, 5.0, 4.9, 5.0, 5.3, 4.9, 4.8, 5.2, 5.0, 4.5, 4.8, 5.1, 5.0, 5.1, 4.9, 5.3, 5.1, 5.2, 5.1, 5.4.
[Figure: histogram of the 20 measurements over the readings 4.5 to 5.5; a second, animated version of the slide builds the histogram up one measurement at a time.]

Pmf derived mathematically. Based on the frequency interpretation, we can define the pmf as follows:
p(4.5) = 1/20    p(5.1) = 4/20
p(4.6) = 0       p(5.2) = 2/20
p(4.7) = 1/20    p(5.3) = 2/20
p(4.8) = 2/20    p(5.4) = 1/20
p(4.9) = 3/20    p(5.5) = 0
p(5.0) = 4/20
Now I can predict the future based on this pmf. Probability does not bother with data; statistics is all about data.

Statistics vs. Probability. Are these the correct probabilities? I don't know. Even if we ran the experiment millions of times, we would be wrong: probably a little wrong, maybe even very wrong. It is always possible to throw 1000 heads in a row even with a fair coin, although it is very unlikely that this will happen. In any case, when studying probability we are not concerned with whether the pmf is correct for this experiment, because we do not care about experiments or data. Statisticians, or the people who designed the experiment, must take care to design it well, so they can give us a good statistical model. All we know is the statistical model (that is, the pmf), and we derive, mathematically, predictions about the future based on this pmf.
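Building the empirical pmf from the 20 measurements is a one-liner with a counter. A Python sketch (an editorial addition) using the slide's data:

```python
from collections import Counter

readings = [4.7, 5.0, 4.9, 5.0, 5.3, 4.9, 4.8, 5.2, 5.0, 4.5,
            4.8, 5.1, 5.0, 5.1, 4.9, 5.3, 5.1, 5.2, 5.1, 5.4]

# Empirical pmf: relative frequency of each distinct reading.
counts = Counter(readings)
pmf = {v: counts[v] / len(readings) for v in sorted(counts)}

assert pmf[5.0] == 4 / 20
assert pmf[4.9] == 3 / 20
assert abs(sum(pmf.values()) - 1.0) < 1e-12
```

Readings that never occurred (4.6, 5.5, ...) simply get no entry, which is the same as assigning them probability 0.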


Variance (Lecture 5). Consider the p.m.f. for the following three random variables:
W = 0 with probability 1
Y = +1 with probability 1/2, -1 with probability 1/2
Z = +100 with probability 1/2, -100 with probability 1/2
All three variables have the same expectation, but their probability mass functions are very different. W is always the same, Y changes a bit, Z changes a lot. The variance quantifies these changes.
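The contrast between W, Y, and Z can be made concrete with a small variance function. A Python sketch (an editorial addition; the values ±1 and ±100 follow the classic textbook version of this example):

```python
def mean(pmf):
    return sum(x * p for x, p in pmf.items())

def variance(pmf):
    """Var(X) = E[(X - E[X])**2] for a discrete pmf given as {x: p(x)}."""
    mu = mean(pmf)
    return sum((x - mu) ** 2 * p for x, p in pmf.items())

W = {0: 1.0}
Y = {1: 0.5, -1: 0.5}
Z = {100: 0.5, -100: 0.5}

assert mean(W) == mean(Y) == mean(Z) == 0
assert variance(W) == 0
assert variance(Y) == 1
assert variance(Z) == 10000
```

Same mean, wildly different spreads: Var(Z) is 10,000 times Var(Y), which is exactly the distinction the mean alone cannot see.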