
Chapter 1 Probability

1.1 Introduction

Consider an experiment whose result is random and is one of a finite number of outcomes.

Example 1. Examples of experiments and possible outcomes:

    Experiment       Outcomes
    Toss a coin      {H, T}
    Roll a die       {1, 2, 3, 4, 5, 6}
    Toss two coins   {HH, HT, TH, TT}

Let Ω = {ω_1, ω_2, ..., ω_n} be the set of outcomes of an experiment, called the sample space. To each outcome a probability P is assigned such that P(ω_i) ∈ [0, 1] and

    Σ_{i=1}^{n} P(ω_i) = 1.

Example 2. A fair coin is tossed. By a fair coin one means that the probability of getting H is the same as that of getting T, so P(H) = P(T) = 1/2. Here probability is meant in the frequency sense, and there are two ways of interpreting this. First, the experiment is performed N times. Let n_i be the number of times outcome ω_i appears. Then

    lim_{N→∞} n_i / N = P(ω_i).

Alternatively, N identical experiments are performed simultaneously. Such a collection of experiments is called an ensemble.
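The limiting-frequency interpretation can be checked numerically. A minimal sketch in Python (the function name and trial counts are illustrative choices, not part of the notes):

```python
import random

# Frequency interpretation: for a fair coin, n_H / N should
# approach P(H) = 1/2 as the number of tosses N grows.
random.seed(0)

def relative_frequency(n_trials):
    """Toss a fair coin n_trials times and return n_H / N."""
    heads = sum(random.random() < 0.5 for _ in range(n_trials))
    return heads / n_trials

for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n))
```

The printed frequencies drift toward 0.5 as N increases, in line with the limit above.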

Example 3. A jar contains 6 yellow, 4 white and 7 blue balls. The balls are identical except for color. The experiment is to find the color of a ball picked at random. The sample space is {yellow, white, blue}, and the probabilities are

    P(yellow) = 6/17,   P(white) = 4/17,   P(blue) = 7/17.

An event is a subset of the sample space. Let A be an event. Then the probability of A is defined as

    P(A) = Σ_{ω ∈ A} P(ω).

Example 4. A fair die is rolled. What is the probability that the outcome is less than or equal to 3? This event is A = {1, 2, 3}, so

    P(A) = P(1) + P(2) + P(3) = 1/2.

Theorem 5. Probabilities assigned to events in a sample space Ω satisfy the following properties:

1. P(∅) = 0 and P(Ω) = 1.
2. If A ⊆ B ⊆ Ω, then P(A) ≤ P(B).
3. P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
4. P(Ā) = 1 − P(A).

A physicist deals with measurable quantities, which are represented by real numbers. Thus, in the rest of this chapter, sample spaces will be assumed to be numerical.

1.2 Discrete Probability Distributions

Definition 6. Consider an experiment whose outcome depends on chance. We represent the outcome of the experiment by a capital Roman letter, such as X, called a random variable. The sample space of the experiment is the set of all possible outcomes. If the sample space is either finite or countably infinite, the random variable is said to be discrete. The assignment of probabilities to outcomes is described next.

Definition 7. Let X be a random variable which denotes the value of the outcome of a certain experiment, and assume that this experiment has only finitely many possible outcomes. Let Ω be the sample space of the experiment. A distribution function for X is a real-valued function P whose domain is Ω and which satisfies:
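The rule P(A) = Σ_{ω ∈ A} P(ω) is easy to mechanize. A minimal sketch in Python for Example 4, using exact rational arithmetic (the helper name `prob` is mine):

```python
from fractions import Fraction

# Fair die: each outcome 1..6 carries probability 1/6.
P = {w: Fraction(1, 6) for w in range(1, 7)}

def prob(event):
    """P(A) = sum of P(w) over the outcomes w in the event A."""
    return sum(P[w] for w in event)

A = {1, 2, 3}          # outcome less than or equal to 3
print(prob(A))         # prints 1/2
```

Using `Fraction` keeps the sum exact, so the result is 1/2 rather than a floating-point approximation.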

1. P(ω) ≥ 0 for all ω ∈ Ω;
2. Σ_{ω ∈ Ω} P(ω) = 1.

Example 8. Consider four magnetic dipoles, each of which can be in either the up or the down state. The magnetic moment of a dipole is +µ when it is in the up state and −µ when it is in the down state. The experiment is to find the total magnetization, given by the sum of all dipole moments,

    M = Σ_{i=1}^{4} µ_i.

Let the up state be represented by the letter u and the down state by d. Then the configuration uudd means that the first dipole is up, the second is up, and the third and fourth are down; the magnetization of this configuration is 0. There are 16 possible configurations in total. The sample space is the set of all possible values of the total magnetization, Ω_M = {−4µ, −2µ, 0, 2µ, 4µ}. If all configurations are equally likely to occur, then

    Configuration   # of arrangements   Magnetization   Probability
    uuuu            1                   4µ              1/16
    uuud            4                   2µ              1/4
    uudd            6                   0               3/8
    uddd            4                   −2µ             1/4
    dddd            1                   −4µ             1/16

Normally it is not interesting to know the detailed probability distribution; usually the moments of the distribution are quoted. Two interesting moments are the average and the variance.

Definition 9. Let X be a discrete random variable with sample space Ω and distribution function P. Then the expectation value of X, denoted by E(X) or ⟨X⟩, is defined as

    E(X) = Σ_{ω ∈ Ω} ω P(ω),

provided the sum converges. The expectation value is again interpreted in the frequency sense: if the experiment is performed a large number of times, the sample mean is close to the expectation value.

Example 10. In the magnetization example above, the expectation value of the magnetization is

    E(M) = (−4µ)(1/16) + (−2µ)(1/4) + (0)(3/8) + (2µ)(1/4) + (4µ)(1/16) = 0.
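The table in Example 8 can be generated by brute-force enumeration of the 16 configurations. A minimal sketch in Python, taking µ = 1 for illustration:

```python
from fractions import Fraction
from itertools import product

mu = 1  # dipole moment in arbitrary units (illustrative choice)

# Enumerate all 16 equally likely configurations of 4 dipoles
# (+mu for up, -mu for down) and build the distribution of M.
dist = {}
for config in product((+mu, -mu), repeat=4):
    M = sum(config)
    dist[M] = dist.get(M, Fraction(0)) + Fraction(1, 16)

# P(M = 0) = 3/8, P(M = ±2) = 1/4, P(M = ±4) = 1/16, and E(M) = 0.
expectation = sum(M * p for M, p in dist.items())
print(dict(sorted(dist.items())))
print(expectation)
```

The enumerated probabilities reproduce the table, and the expectation value comes out to 0 as in Example 10.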

Theorem 11. Let X be a discrete random variable with sample space Ω and distribution function P, and let φ : Ω → R. Then the expectation value of φ(X) is

    E(φ(X)) = Σ_{ω ∈ Ω} φ(ω) P(ω),

provided the sum converges.

Example 12. Again in the magnetization example,

    E(M²) = (−4µ)²(1/16) + (−2µ)²(1/4) + (0)²(3/8) + (2µ)²(1/4) + (4µ)²(1/16) = 4µ².

Definition 13. Let X be a discrete random variable with sample space Ω and distribution function P, and let E(X) = µ. The variance of X, denoted by V(X), is defined as

    V(X) = E((X − µ)²) = Σ_{ω ∈ Ω} (ω − µ)² P(ω).

The standard deviation of X, denoted by σ_X, is defined as √V(X). The variance is a measure of the spread of the probability distribution about the mean value. In the figure below, the red probability distribution has the larger variance.

[Figure: two probability distributions P(ω) with the same mean; the red one is more spread out about the mean.]

Theorem 14. If X is a discrete random variable, then

    V(X) = E(X²) − (E(X))².

Example 15. In the magnetization example, E(M) = 0, so V(M) = E(M²) = 4µ².
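Theorem 14 and Example 15 can be verified directly by enumerating the dipole configurations. A minimal sketch in Python, again with µ = 1 (my choice for illustration):

```python
from fractions import Fraction
from itertools import product

# Verify V(X) = E(X^2) - (E(X))^2 for the magnetization example:
# with mu = 1 we expect E(M) = 0 and V(M) = E(M^2) = 4.
mu = 1
configs = list(product((+mu, -mu), repeat=4))
p = Fraction(1, len(configs))  # all 16 configurations equally likely

E_M = sum(sum(c) * p for c in configs)        # first moment
E_M2 = sum(sum(c) ** 2 * p for c in configs)  # second moment
variance = E_M2 - E_M ** 2

print(E_M, variance)  # prints: 0 4
```

Both routes to the variance, the defining sum and Theorem 14's shortcut, agree here since E(M) = 0.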

1.3 Continuous Probability Distributions

A random variable is said to be continuous if the sample space is continuous; we will assume that the sample space is either R or an interval in R.

Example 16. Suppose a particle is trapped in a 1D box (say in [0, L]) and is going back and forth with speed u. It was set in motion at some time in the past from some random location in the box. The experiment is to find the position of the particle at some time. Let P(X < x) be the probability of finding the particle to the left of x. Then

    P(X < x) = 0      for x < 0,
             = 1      for x > L,
             = x/L    otherwise.

This is intuitive: since all points in the box are equally likely, the probability that the particle is in some interval [a, b] ⊆ [0, L] is proportional to the length b − a. But then the probability of finding the particle at a single point is 0, because the length of a point is 0! This looks like a contradiction, because then the net probability of finding the particle in the box would also seem to be 0. The way out is to define a probability density function.

Definition 17. Let X be a continuous real-valued random variable. A probability density function f is a real function such that

    P(a < X < b) = ∫_a^b f(x) dx.

Thus the probability of finding the particle exactly at x is 0; however, the probability that the particle is near x, that is, in [x, x + dx], is given by f(x) dx. The function F(x) = P(X < x) is called the cumulative distribution function. Clearly, f(x) = dF/dx.

Example 18. For the particle-in-a-box example, the probability density function is

    f(x) = 1/L    for x ∈ [0, L],
         = 0      otherwise.

The expectation value and variance are defined exactly as in the case of discrete variables:

    E(X) = ∫ x f(x) dx    and    V(X) = E(X²) − (E(X))².
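The defining property P(a < X < b) = ∫_a^b f(x) dx can be checked numerically for the particle-in-a-box density. A minimal sketch in Python using a midpoint rule (the box length L and step count are illustrative choices):

```python
# Numerical check of the uniform density f(x) = 1/L on [0, L]:
# the total probability integrates to 1, and F(L/2) = 1/2.
L = 2.0
n = 100_000
dx = L / n

def f(x):
    """Particle-in-a-box probability density."""
    return 1.0 / L if 0.0 <= x <= L else 0.0

# Midpoint-rule integrals over [0, L] and [0, L/2].
total = sum(f((k + 0.5) * dx) * dx for k in range(n))
cdf_half = sum(f((k + 0.5) * dx) * dx for k in range(n // 2))

print(round(total, 6), round(cdf_half, 6))  # prints: 1.0 0.5
```

The same quadrature applied to x·f(x) would reproduce E(X) = L/2.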

Example 19. For the particle-in-a-box example,

    E(X) = ∫_0^L x (1/L) dx = L/2

and

    E(X²) = ∫_0^L x² (1/L) dx = L²/3,

so that

    V(X) = L²/3 − (L/2)² = L²/12.

Another example:

Example 20. A ball is released from height H. What is the pdf for finding the ball at a depth y, measured downwards from the point of release? Consider a small interval dy at y. The probability that the ball is found in [y, y + dy] is proportional to the time the ball spends in that section. That is,

    f(y) dy = P(X ∈ [y, y + dy]) = K dt = K dy / v(y),

where K is a constant and v(y) = √(2gy) is the speed of the ball at y. Thus f(y) = K/√(2gy). The constant of proportionality can be found by normalizing f. The normalization condition

    ∫_0^H f(y) dy = 1

gives f(y) = 1/(2√(Hy)). The cdf is F(y) = √(y/H). The average depth is

    E(y) = ∫_0^H y · 1/(2√(Hy)) dy = H/3.

[Figure: the pdf f(y) on [0, H], diverging as y → 0 and decreasing towards y = H.]

Exercise 21. A particle is performing simple harmonic motion with amplitude A and frequency ω. Show that the pdf for finding the particle at a position x is given by

    f(x) = 1/(π √(A² − x²)).

Plot this function and find the variance of the distribution.
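Example 20's result E(y) = H/3 can be checked by simulation: observe the ball at a uniformly random instant of its fall and average the depths. A minimal sketch in Python (g, H and the sample size are illustrative choices):

```python
import random

# Monte Carlo check of Example 20: a ball released from rest falls
# for a total time T = sqrt(2H/g); sampling it at a uniformly random
# time t gives depth y = g t^2 / 2, whose mean should be near H/3.
random.seed(1)
g, H = 9.8, 1.0
T = (2 * H / g) ** 0.5  # total fall time

samples = [0.5 * g * random.uniform(0.0, T) ** 2 for _ in range(200_000)]
mean_y = sum(samples) / len(samples)
print(round(mean_y, 3))  # close to H/3 ≈ 0.333
```

Sampling uniformly in time rather than in depth is exactly what makes the time-weighted density f(y) = 1/(2√(Hy)) appear.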