Appendix B: Probabilities and distributions

B.1 Expectation value and variance

Definition B.1. Suppose a (not necessarily quantum) experiment to measure a quantity Q can yield any one of N possible outcomes {Q_i} (1 ≤ i ≤ N), with respective probabilities pr_i. Then Q is called a random variable, and the set of values {pr_i} for all values of i is called the probability distribution. The expectation (mean) value of Q is

    \langle Q \rangle = \sum_{i=1}^{N} \mathrm{pr}_i Q_i.    (B.1)

Exercise B.1. Find the expectation of the value displayed on the top face of a fair die.

[Fig. B.1: Mean and rms standard deviation of a random variable. The plot shows the values of a random variable Q over 500 samples, together with the mean ⟨Q⟩ and the standard deviation √⟨ΔQ²⟩.]
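
As a quick illustration of Eq. (B.1), here is a minimal Python sketch (the variable names are mine, not from the text) that evaluates the expectation value of the fair die of Exercise B.1 directly from the definition.

```python
# Expectation value of a discrete random variable, Eq. (B.1), for a fair die
# (Exercise B.1). Outcome and probability lists use illustrative names.
outcomes = [1, 2, 3, 4, 5, 6]            # possible values Q_i
probs = [1 / 6] * 6                      # probabilities pr_i of a fair die

mean_Q = sum(p * q for p, q in zip(probs, outcomes))   # <Q> = sum_i pr_i Q_i
print(mean_Q)                            # 3.5
```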

Definition B.2. The mean square variance of the random variable Q is

    \langle \Delta Q^2 \rangle = \langle (Q - \langle Q \rangle)^2 \rangle = \sum_{i=1}^{N} \mathrm{pr}_i (Q_i - \langle Q \rangle)^2.    (B.2)

The root mean square (rms) standard deviation, or uncertainty, of Q is then √⟨ΔQ²⟩.

While the expectation value ⟨Q⟩ = ∑_{i=1}^{N} pr_i Q_i shows the mean measurement output, the statistical uncertainty shows by how much, on average, a particular measurement result will deviate from the mean (Fig. B.1).

Exercise B.2. Show that, for any random variable Q,

    \langle \Delta Q^2 \rangle = \langle Q^2 \rangle - \langle Q \rangle^2.    (B.3)

Exercise B.3. Calculate the mean square variance of the value displayed on the top face of a fair die. Show by direct calculation that Eqs. (B.2) and (B.3) yield the same result.

Exercise B.4. Two random variables Q and R are independent, i.e., the realization of one does not affect the probability distribution of the other (for example, a die and a coin being tossed next to each other). Show that ⟨QR⟩ = ⟨Q⟩⟨R⟩. Is this statement valid if Q and R are not independent?
Hint: Independence means that events Q_i and R_j occur at the same time with probability pr^Q_i pr^R_j for each pair (i, j), where pr^Q_i is the probability of the ith value of the variable Q and pr^R_j is the probability of the jth value of R.

Exercise B.5. Suppose a random variable Q is measured (for example, a die is thrown) N times. Consider the random variable that is the sum of the N outcomes; call it Q̃. Show that the expectation and variance of Q̃ equal ⟨Q̃⟩ = N⟨Q⟩ and ⟨ΔQ̃²⟩ = N⟨ΔQ²⟩, respectively.
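
The identities in Exercises B.2 and B.5 are easy to check numerically. The following Monte Carlo sketch (the names, seed, and sample sizes are arbitrary choices of mine) estimates the mean and variance of a fair die via Eq. (B.3) and then verifies that the sum of N throws has mean N⟨Q⟩ and variance N⟨ΔQ²⟩.

```python
import random

# Monte Carlo check of Eq. (B.3) and of Exercise B.5 for a fair die.
rng = random.Random(1)

samples = [rng.randint(1, 6) for _ in range(200_000)]
mean_Q = sum(samples) / len(samples)
mean_Q2 = sum(q * q for q in samples) / len(samples)
var_Q = mean_Q2 - mean_Q**2                      # Eq. (B.3): <dQ^2> = <Q^2> - <Q>^2

N = 10
sums = [sum(rng.randint(1, 6) for _ in range(N)) for _ in range(50_000)]
mean_S = sum(sums) / len(sums)
var_S = sum(s * s for s in sums) / len(sums) - mean_S**2

print(mean_Q, var_Q)    # close to 3.5 and 35/12 ≈ 2.92
print(mean_S, var_S)    # close to N*3.5 = 35 and N*35/12 ≈ 29.2 (Exercise B.5)
```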

B.2 Conditional probabilities

The conditional probability pr_{A|B} is the probability of some event A given that another event, B, is known to have occurred. Examples are:

- the probability that the value on a die is odd given that it is greater than 3;
- the probability that Alice's HIV test result will be positive given that she is actually not infected;
- the probability that Bob plays basketball given that he is a man and 185 cm tall;
- the probability that it will rain tomorrow given that it has rained today.

Let us calculate the conditional probability using the third example. Event A is "Bob plays basketball". Event B is "Bob is a 185-cm tall man". The conditional probability is equal to the number N(A and B) of 185-cm tall men who play basketball divided by the number N(B) of 185-cm tall men [Fig. B.2(a)]:

    \mathrm{pr}_{A|B} = \frac{N(A\ \mathrm{and}\ B)}{N(B)}.    (B.4)

Let us divide both the numerator and the denominator of the above fraction by N, the total number of people in town. Then we have in the numerator N(A and B)/N = pr_{A and B}, the probability that a randomly chosen person is a 185-cm tall man who plays basketball, and in the denominator N(B)/N = pr_B, the probability that a random person is a 185-cm tall man. Hence

    \mathrm{pr}_{A|B} = \frac{\mathrm{pr}_{A\ \mathrm{and}\ B}}{\mathrm{pr}_B}.    (B.5)

This is a general formula for calculating conditional probabilities.

Exercise B.6. Suppose events B_1, ..., B_n are mutually exclusive and collectively exhaustive, i.e., one of them must occur, but no two occur at the same time [Fig. B.2(b)]. Show that, for any other event A,

    \mathrm{pr}_A = \sum_{i=1}^{n} \mathrm{pr}_{A|B_i} \mathrm{pr}_{B_i}.    (B.6)

This result is known as the theorem of total probability.

[Fig. B.2: Conditional probabilities. a) Relation between the conditional and combined probabilities, Eq. (B.5). b) Theorem of total probability (Ex. B.6).]
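
Here is a short sketch of Eqs. (B.5) and (B.6), using the first example above (a fair die, A = "the value is odd", B = "the value is greater than 3") and an arbitrary three-set partition of my own choosing for the total-probability check; exact fractions keep the arithmetic transparent.

```python
from fractions import Fraction

# Conditional probability, Eq. (B.5), for a fair die:
# A = "the value is odd", B = "the value is greater than 3".
pr = {q: Fraction(1, 6) for q in range(1, 7)}

pr_B = sum(p for q, p in pr.items() if q > 3)
pr_A_and_B = sum(p for q, p in pr.items() if q > 3 and q % 2 == 1)
print(pr_A_and_B / pr_B)                 # pr_{A|B} = (1/6)/(1/2) = 1/3

# Theorem of total probability, Eq. (B.6), with the illustrative partition
# B_1 = {1, 2}, B_2 = {3, 4}, B_3 = {5, 6}.
pr_A = Fraction(0)
for B in [{1, 2}, {3, 4}, {5, 6}]:
    pr_Bi = sum(pr[q] for q in B)
    pr_A_given_Bi = sum(pr[q] for q in B if q % 2 == 1) / pr_Bi
    pr_A += pr_A_given_Bi * pr_Bi
print(pr_A)                              # 1/2, the unconditional probability of "odd"
```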

Exercise B.7. The probability that a certain HIV test gives a false positive result is pr_{positive|not infected} = 0.05. The probability of a false negative result is zero. It is known that, of all people taking the test, the probability of actually being infected is pr_{infected} = 0.001.
a) What is the probability pr_{positive and not infected} that a random person taking the test is not infected and shows a false positive result?
b) What is the probability pr_{positive} that a random person taking the test shows a positive result?
c) A random person, Alice, has been selected and the test has been performed on her. Her result turned out to be positive. What is the probability that she is not infected?
Hint: To visualize this problem, imagine a city of one million. How many of them are infected? How many are not? How many positive test results will there be all together?

B.3 Binomial and Poisson distributions

Exercise B.8. A coin is tossed n times. Find the probability that heads will appear k times, and tails n - k times:
a) for a fair coin, i.e., the probability of getting heads or tails in a single toss is 1/2;
b) for a biased coin, with the probabilities for heads and tails being p and 1 - p, respectively.
Answer:

    \mathrm{pr}_k = \binom{n}{k} p^k (1-p)^{n-k}.    (B.7)

The probability distribution defined by Eq. (B.7) is called the binomial distribution. We encounter this distribution in everyday life, often without realizing it. Here are a few examples.

Exercise B.9.
a) On a given day in a certain city 20 babies were born. What is the probability that exactly nine of them are girls?
b) A student answers 3/4 of questions on average. What is the probability that (s)he scores perfectly on a 10-question test?
c) A certain politician has 60% electoral support. What is the probability that (s)he receives more than 50% of the votes in a district with 100 voters?

Exercise B.10. Find the expectation value and the uncertainty of the binomial distribution (B.7).
Answer:

    \langle k \rangle = np; \quad \langle \Delta k^2 \rangle = np(1-p).    (B.8)
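
A small sketch of the binomial formula (B.7) and of the moments (B.8), in stock Python; as an example it evaluates case a) of Exercise B.9 under the assumption that each birth is independently a girl with probability 1/2. The helper name is mine.

```python
from math import comb

def binomial_pmf(k, n, p):
    """pr_k = C(n, k) p^k (1 - p)^(n - k), Eq. (B.7)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# Exercise B.9 a): nine girls among 20 births, assuming p = 1/2 per birth
print(binomial_pmf(9, 20, 0.5))          # ≈ 0.16

# Moments of the binomial distribution, Eq. (B.8): <k> = n p, <dk^2> = n p (1 - p)
n, p = 20, 0.5
mean_k = sum(k * binomial_pmf(k, n, p) for k in range(n + 1))
var_k = sum(k**2 * binomial_pmf(k, n, p) for k in range(n + 1)) - mean_k**2
print(mean_k, var_k)                     # n p = 10 and n p (1 - p) = 5, up to rounding
```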

Exercise B.11. In a certain big city, 10 babies are born per day on average. What is the probability that on a given day exactly 12 babies are born?
a) The city population is 100000.
b) The city population is 1000000.
Hint: Perhaps there is a way to avoid calculating 1000000!.

We see from the above exercise that in the limit p → 0 and n → ∞, but λ = pn = const, the probabilities in the binomial distribution become dependent on λ, rather than on p and n individually. This important extension of the binomial distribution is known as the Poisson (Poissonian) distribution.

Exercise B.12. Show that in the limit p → 0 and n → ∞, but λ = pn = const, the binomial distribution (B.7) becomes

    \mathrm{pr}_k = \frac{e^{-\lambda} \lambda^k}{k!}    (B.9)

using the following steps.
a) Show that \lim_{n \to \infty} \binom{n}{k} \frac{1}{n^k} = \frac{1}{k!}.
b) Show that \lim_{n \to \infty} (1-p)^{n-k} = e^{-\lambda}.
c) Obtain Eq. (B.9).

Exercise B.13. Find the answer to Ex. B.11 in the limit of an infinitely large city.

Here are some more examples of the Poisson distribution.

Exercise B.14.
a) A patrol policeman posted on a highway late at night has discovered that, on average, 60 cars pass every hour. What is the probability that, within a given minute, exactly one car will pass that policeman?
b) A cosmic ray detector registers 500 events per second on average. What is the probability that this number equals exactly 500 within a given second?
c) The average number of lions seen on a one-day safari is 3. What is the probability that, if you go on that safari, you will not see a single lion?

Exercise B.15. Show that both the mean and variance of the Poisson distribution (B.9) equal λ.

For example, in a certain city, 25 babies are born per day on average, so λ = 25. The root mean square uncertainty in this number is √λ = 5, i.e., on a typical day we are much more likely to see 20 or 30 babies rather than 10 or 40 (Fig. B.3). Although the absolute uncertainty √λ increases with λ, the relative uncertainty √λ/λ decreases. In our example above, the relative uncertainty is 5/25 = 20%. But in a smaller city, where λ = 4, the relative uncertainty is as high as 2/4 = 50%.

[Fig. B.3: The Poisson distribution with λ = 4 (empty circles) and λ = 25 (filled circles); pr_n plotted against n.]
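
The Poisson limit of Exercises B.11 to B.13 can also be seen numerically. The sketch below (population figures mirror Exercise B.11; the helper names are mine) evaluates the binomial probability of exactly 12 births for increasingly large populations and compares it with Eq. (B.9) at λ = 10.

```python
from math import comb, exp, factorial

def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p) ** (n - k)        # Eq. (B.7)

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)             # Eq. (B.9)

lam, k = 10.0, 12        # 10 births per day on average; exactly 12 on a given day
for n in (100_000, 1_000_000):
    print(n, binomial_pmf(k, n, lam / n))                # binomial, finite population
print(poisson_pmf(k, lam))                               # Poisson limit, ≈ 0.095
```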

B.4 Probability densities

So far, we have studied random variables that can take values from a discrete set, with the probability of each value being finite. But what if we are dealing with a continuous random variable, for example, the wind speed, the decay time of a radioactive atom, or the range of a projectile? In this case, there is no way to assign a finite probability value to each specific value of Q. The probability that the atom decays after precisely two milliseconds, or that the wind speed is precisely five meters per second, is infinitely small. However, the probability of detecting Q within some range (for example, that the atom decays between times 2 ms and 2.01 ms) is finite.

We can therefore discretize the continuous variable: divide the range of values that Q can take into equal bins of width δQ. Then we define a discrete random variable with possible values Q_i equal to the central point of each bin, and the associated finite probability pr^Q_i that Q falls within that bin [Fig. B.4(a,b)]. As for any probability distribution, ∑_i pr^Q_i = 1. Of course, the narrower the bin width we choose, the more precisely we describe the behavior of the continuous random variable.

The probability values associated with neighboring bins can be expected to be close to each other if the bin width is chosen sufficiently small. For atomic decay, for example, we can write pr_{[2.00 ms, 2.01 ms]} ≈ pr_{[2.01 ms, 2.02 ms]} ≈ ½ pr_{[2.00 ms, 2.02 ms]}. In other words, for small bin widths, the quantity pr^Q_i / δQ is independent of δQ. Hence we can introduce the notion of the probability density or continuous probability distribution¹:

    \mathrm{pr}(Q) = \lim_{\delta Q \to 0} \frac{\mathrm{pr}^Q_{i(Q)}}{\delta Q},    (B.10)

where i(Q) is the number of the bin within which the value of Q is located and the limit is taken over a set of discretized probability distributions for Q. This probability density is the primary characteristic of a continuous random variable. Note also that, because the discrete probability pr^Q_{i(Q)} is a dimensionless quantity, the dimension of a continuous probability density pr(Q) is always the reciprocal of the dimension of the corresponding random variable Q.

¹ Throughout this book, I use subscripts for discrete probabilities, such as in pr_i or pr^Q_i, and parentheses for continuous probability densities, e.g., pr(Q).
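
The limit in Eq. (B.10) is easy to visualize numerically. In the sketch below (an assumed example: decay times drawn from an exponential law with a 1 ms half-life, in the spirit of Exercise B.17; the names, seed, and sample size are mine), the ratio of bin probability to bin width stabilizes as the bin is narrowed, approximating the density at Q = 2 ms.

```python
import random
from math import log

# Discretizing a continuous random variable, Fig. B.4 / Eq. (B.10).
# Assumed model: exponential decay times with a half-life of 1 ms.
rng = random.Random(0)
tau = 1.0                     # half-life, ms
rate = log(2) / tau           # decay rate, 1/ms

samples = [rng.expovariate(rate) for _ in range(200_000)]   # decay times, ms

for dQ in (0.5, 0.1):         # two bin widths, as in Fig. B.4 a), b)
    pr_bin = sum(2.0 <= t < 2.0 + dQ for t in samples) / len(samples)
    print(dQ, pr_bin / dQ)    # pr_i / dQ approaches pr(2 ms) ≈ 0.17 as dQ shrinks
```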

[Fig. B.4: Continuous probability distribution. a), b) Discretization of the continuous random variable with bin widths δQ = 0.5 and 0.1, respectively. c) Continuous probability density. The probability of observing the variable in a range between Q_1 and Q_2 is ∫_{Q_1}^{Q_2} pr(Q) dQ. Note the variation of the vertical scales in the three plots.]

Exercise B.16. For a continuous random variable with probability density pr(Q), show that:
a) the probability of observing the variable in the range between Q_1 and Q_2 is

    \mathrm{pr}_{[Q_1, Q_2]} = \int_{Q_1}^{Q_2} \mathrm{pr}(Q)\,dQ;    (B.11)

b) the probability density function is normalized:

    \int_{-\infty}^{+\infty} \mathrm{pr}(Q)\,dQ = 1;    (B.12)

c) the expectation value of Q is given by

    \langle Q \rangle = \int_{-\infty}^{+\infty} Q\,\mathrm{pr}(Q)\,dQ;    (B.13)

d) the variance of Q is given by

    \langle \Delta Q^2 \rangle = \int_{-\infty}^{+\infty} (Q - \langle Q \rangle)^2\,\mathrm{pr}(Q)\,dQ = \langle Q^2 \rangle - \langle Q \rangle^2.    (B.14)

Exercise B.17. Find the probability density, expectation and root mean square uncertainty for the decay time t of a radioactive nucleus with half-life τ = 1 ms.
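
As a numerical illustration of Eqs. (B.12) to (B.14), the sketch below takes an assumed density for the decay time of Exercise B.17, pr(t) = (ln 2/τ) e^(-t ln 2/τ) for t ≥ 0, and evaluates the integrals by a simple Riemann sum; the grid spacing and cutoff are arbitrary choices.

```python
from math import exp, log

# Numerical check of Eqs. (B.12)-(B.14) for an assumed decay-time density
# pr(t) = (ln 2 / tau) * exp(-t ln 2 / tau), half-life tau = 1 ms (Exercise B.17).
tau = 1.0                     # half-life, ms
rate = log(2) / tau

def pr(t):
    return rate * exp(-rate * t)

dt = 1e-3
grid = [i * dt for i in range(int(50 * tau / dt))]        # 0 ... 50 half-lives

norm = sum(pr(t) * dt for t in grid)                      # Eq. (B.12), ≈ 1
mean_t = sum(t * pr(t) * dt for t in grid)                # Eq. (B.13)
var_t = sum((t - mean_t) ** 2 * pr(t) * dt for t in grid) # Eq. (B.14)
print(norm, mean_t, var_t**0.5)     # ≈ 1, mean ≈ tau/ln 2 ≈ 1.44 ms, rms ≈ 1.44 ms
```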

A probability density that frequently occurs in nature is the Gaussian, or normal, distribution:

    G_b(x) = \frac{1}{b\sqrt{\pi}}\, e^{-x^2/b^2},    (B.15)

where b is the width of the Gaussian distribution (Fig. B.5). Typically, the Gaussian distribution governs physical quantities that are affected by multiple small random effects that add up². Examples include:

- the position of a particle subjected to Brownian motion;
- the time shown by a clock affected by random fluctuations of the temperature in the room;
- the component of the velocity of a gas molecule along a particular axis.

[Fig. B.5: Normalized Gaussian functions of different widths.]

² The rigorous formulation of this statement is called the central limit theorem.

Exercise B.18. For a Gaussian distribution G_b(x - a), show the following:
a) Normalization holds, i.e.,

    \int_{-\infty}^{+\infty} G_b(x - a)\,dx = 1.    (B.16)

Note that Eq. (B.17) also holds for complex b, as long as Re(b) > 0.
b) The mean equals ⟨x⟩ = a.
c) The variance is ⟨Δx²⟩ = b²/2.
Hint: use

B.4 Probability densities 285 e x2 /b 2 dx = b π; (B.17) x 2 e x2 /b 2 dx = b3 π. (B.18) 2