INFO Sep 05


Events A₁, ..., A_n are said to be mutually independent if for all subsets S ⊆ {1, ..., n}, p(∩_{i∈S} A_i) = ∏_{i∈S} p(A_i). (For example, flip a coin N times; then the events A_i = "the i-th flip is heads" are mutually independent.) Example: suppose events A, B, and C are pairwise independent, i.e., A and B are independent, B and C are independent, and A and C are independent. Note that this pairwise independence does not necessarily imply mutual independence of A, B, and C. To check that p(∩_{i∈S} A_i) = ∏_i p(A_i) for all subsets S ⊆ {A, B, C} in this case means checking the non-trivial subsets with 2 or more elements: {A, B}, {A, C}, {B, C}, {A, B, C}. By assumption the condition holds for the first three, so the only one we need to check is p(A, B, C) =? p(A)p(B)p(C). That this is not always the case can be seen by an explicit counterexample: toss a fair coin three times, and consider the three events A = "the number of heads is even", B = "the first two flips are the same", C = "the second two flips are heads". It follows that p(A) = p(B) = 1/2, p(C) = 1/4, p(A, B) = 1/4 = p(A)p(B), p(A, C) = p(B, C) = 1/8 = p(A)p(C) = p(B)p(C); but p(A, B, C) = 0 ≠ p(A)p(B)p(C) = 1/16. The complement of a set A ⊆ S in S is denoted Ā = S − A, i.e., the set of elements of S not contained in A. We can prove that an event A is independent of another event B if and only if A is independent of B̄. To show this, first recall that if S can be written as the union of a set of non-intersecting subsets S_i, S = ∪_i S_i with S_i ∩ S_j = ∅ for i ≠ j, then p(A) = ∑_i p(A ∩ S_i) = ∑_i p(A, S_i). The two sets S₁ = B, S₂ = B̄ clearly satisfy these conditions, so we can write p(A) = p(A, B) + p(A, B̄). Note also that p(B) + p(B̄) = 1. If A and B are independent, then by definition p(A, B) = p(A)p(B), and substituting in the above gives p(A, B̄) = p(A) − p(A)p(B) = p(A)(1 − p(B)) = p(A)p(B̄), so A and B̄ are independent.
In the opposite direction, if p(A, B̄) = p(A)p(B̄), then substitution in the above gives p(A, B) = p(A)(1 − p(B̄)) = p(A)p(B), and A and B are independent. Finally, note that the notions of disjoint and independent events are very different. Two events A, B are disjoint if their intersection is empty, whereas they are independent if p(A, B) = p(A)p(B). Two events that are disjoint necessarily have p(A, B) = p(A ∩ B) = 0, so if their individual probabilities are non-zero they are necessarily negatively correlated (p(A, B) < p(A)p(B)). For example, if we flip 2 coins and let event A = "exactly 1 H" and event B = "exactly 2 H", these are disjoint but not independent events: they're negatively correlated, since p(A, B) = 0 is less than p(A)p(B) = (1/2)(1/4). Non-disjoint events can be positively or negatively correlated, or they can be independent. If we take event C = "exactly 1 T", then A and C are not disjoint (they're equal), and they're positively correlated, since p(A, C) = 1/2 is greater than p(A)p(C) = 1/4. Now flip 3 coins and let C = "at least 1 H and at least 1 T" and D = "at most 1 H". We see that C ∩ D = "exactly 1 H", and independence of events C, D follows from p(C)p(D) = (6/8)(1/2) = 3/8 = p(C, D).
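The three-coin claims above (pairwise but not mutual independence, and non-disjoint yet independent events) can be verified by brute-force enumeration of the 8 equally likely outcomes. A small Python sketch, not part of the original notes:

```python
from itertools import product

outcomes = list(product("HT", repeat=3))   # 8 equally likely triples of flips

def p(*events):
    """Probability that all the given events occur, under the uniform measure."""
    return sum(all(e(s) for e in events) for s in outcomes) / len(outcomes)

A = lambda s: s.count("H") % 2 == 0        # even number of heads
B = lambda s: s[0] == s[1]                 # first two flips the same
C = lambda s: s[1] == s[2] == "H"          # second two flips both heads

# Pairwise independent ...
assert p(A, B) == p(A) * p(B)
assert p(A, C) == p(A) * p(C)
assert p(B, C) == p(B) * p(C)
# ... but not mutually independent: p(A,B,C) = 0, not 1/16.
assert p(A, B, C) == 0 and p(A) * p(B) * p(C) == 1 / 16

# Non-disjoint yet independent: Cp = at least one H and one T, D = at most one H.
Cp = lambda s: "H" in s and "T" in s
D = lambda s: s.count("H") <= 1
assert p(Cp, D) == p(Cp) * p(D) == 3 / 8
```

Enumerating the whole sample space like this is a reliable sanity check whenever the space is small.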

Random variables, mean and variance: Suppose a collection of people includes some number with height 6′, and equal numbers with heights 5′11″ and 6′1″. The mean or average of this distribution is 6′, as can be determined by summing the heights of all the people and dividing by the number of people, or equivalently by summing over distinct heights weighted by the fraction of people with that height. Suppose, for example, that the numbers in the above height categories are 5, 30, 5; then the latter calculation corresponds to (1/8)·5′11″ + (3/4)·6′ + (1/8)·6′1″ = 6′. But the average gives only limited information about a distribution. Suppose there were instead only people with heights 5′ and 7′, with an equal number of each; then the average would still be 6′, though these are very different distributions. It is useful to characterize at least the variation within the distribution about the mean. The average deviation from the mean gives zero, due to equal positive and negative deviations (as proven below), so the quantity known as the variance (or mean square deviation) is defined as the average of the squares of the differences between the values in the distribution and their mean. For the first distribution above, this gives the variance V = (1/8)(−1″)² + (3/4)(0″)² + (1/8)(1″)² = 1/4 (inch)², and for the second distribution the much larger result V = (1/2)(−1′)² + (1/2)(1′)² = 1 (foot)². The standard or r.m.s. ("root mean square") deviation σ is defined as the square root of the variance, σ = √V. The above two distributions have σ = 1/2 inch and σ = 1 foot, respectively. More generally, a random variable is a function X : S → ℝ, assigning some real number to each element of the probability space S. The average of this variable is determined by summing the values it can take, weighted by the corresponding probabilities: ⟨X⟩ = ∑_{s∈S} p(s)X(s). (An alternate notation for this is E[X] = ⟨X⟩, for the expectation value of X.) Example 1: roll two dice and let X be the sum of the two numbers rolled.
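The height example can be checked numerically; a Python sketch, not part of the original notes (heights converted to inches, so 6′ = 72″):

```python
# Heights in inches (5'11" = 71, 6' = 72, 6'1" = 73) with counts 5, 30, 5.
heights = {71: 5, 72: 30, 73: 5}
total = sum(heights.values())   # 40 people

mean = sum(h * n for h, n in heights.items()) / total
var = sum((h - mean) ** 2 * n for h, n in heights.items()) / total
mean2 = sum(h * h * n for h, n in heights.items()) / total

assert mean == 72.0              # 6 feet
assert var == 0.25               # 1/4 (inch)^2, so sigma = 1/2 inch
assert var == mean2 - mean ** 2  # same answer via <h^2> - <h>^2
```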
X({1, 1}) = 2, X({1, 2}) = X({2, 1}) = 3, ..., X({6, 6}) = 12. The average of X is thus ⟨X⟩ = 2·(1/36) + 3·(2/36) + 4·(3/36) + ... + 12·(1/36) = 7. Example 2: flip a coin 3 times, and let X be the number of tails. The average is ⟨X⟩ = (1/8)·0 + (3/8)·1 + (3/8)·2 + (1/8)·3 = 3/2. The expectation of the sum of two random variables X, Y satisfies ⟨X + Y⟩ = ⟨X⟩ + ⟨Y⟩. In general, they satisfy a linearity of expectation, ⟨aX + bY⟩ = a⟨X⟩ + b⟨Y⟩, proven as follows: ⟨aX + bY⟩ = ∑_s p(s)(aX(s) + bY(s)) = a ∑_s p(s)X(s) + b ∑_s p(s)Y(s) = a⟨X⟩ + b⟨Y⟩. Thus an alternate way to calculate the mean of X =

X₁ + X₂ for the two dice rolls in Example 1 above is to calculate the mean for a single die, ⟨X₁⟩ = (1+2+3+4+5+6)/6 = 21/6 = 7/2, and so for two rolls ⟨X⟩ = ⟨X₁⟩ + ⟨X₂⟩ = 7/2 + 7/2 = 7. By definition, independent random variables X, Y satisfy p(X=a, Y=b) = p(X=a)p(Y=b) (i.e., the joint probability is the product of their individual probabilities, just as for independent events). For such variables, it follows that the expectation value of their product satisfies ⟨XY⟩ = ⟨X⟩⟨Y⟩ (X, Y independent), since ∑_{r,s} p(r,s)X(r)Y(s) = ∑_{r,s} p(r)p(s)X(r)Y(s) = (∑_r p(r)X(r))(∑_s p(s)Y(s)). As indicated above, the average of the differences of a random variable from the mean vanishes: ∑_{s∈S} p(s)(X(s) − ⟨X⟩) = ⟨X⟩ − ⟨X⟩ ∑_s p(s) = ⟨X⟩ − ⟨X⟩ = 0. The variance of a probability distribution for a random variable is defined as the average of the squared differences from the mean,

V[X] = ∑_{s∈S} p(s)(X(s) − ⟨X⟩)².  (V1)

The variance satisfies the important relation

V[X] = ⟨X²⟩ − ⟨X⟩²,  (V2)

following directly from the definition above: V[X] = ∑_{s∈S} p(s)(X(s) − ⟨X⟩)² = ∑_s X²(s)p(s) − 2⟨X⟩ ∑_s p(s)X(s) + ⟨X⟩² ∑_s p(s) = ⟨X²⟩ − 2⟨X⟩² + ⟨X⟩² = ⟨X²⟩ − ⟨X⟩². In the case of independent random variables X, Y, as defined above, the variance is additive: V[X + Y] = V[X] + V[Y]. To see this, use (V2) together with ⟨XY⟩ = ⟨X⟩⟨Y⟩: V[X + Y] = ⟨(X + Y)²⟩ − (⟨X⟩ + ⟨Y⟩)² = ⟨X²⟩ + 2⟨XY⟩ + ⟨Y²⟩ − ⟨X⟩² − 2⟨X⟩⟨Y⟩ − ⟨Y⟩² = ⟨X²⟩ − ⟨X⟩² + ⟨Y²⟩ − ⟨Y⟩² = V[X] + V[Y]. Example: again flip a coin 3 times, and let X be the number of tails. ⟨X²⟩ = (1/8)·0 + (3/8)·1 + (3/8)·4 + (1/8)·9 = 3, so V[X] = 3 − (3/2)² = 3/4. If we let X = X₁ + X₂ + X₃, where X_i

is the number of tails (0 or 1) for the i-th flip, then the X_i are independent variables with ⟨X_i⟩ = 1/2 and ⟨X_i²⟩ = (1/2)·1 + (1/2)·0 = 1/2, so V[X_i] = 1/2 − 1/4 = 1/4 (or equivalently V[X_i] = (1/2)(1/2)² + (1/2)(−1/2)² = 1/8 + 1/8 = 1/4). For the three flips, V[X] = V[X₁] + V[X₂] + V[X₃] = 1/4 + 1/4 + 1/4 = 3/4, confirming the result above. A Bernoulli trial is a trial with two possible outcomes: success with probability p, and failure with probability 1 − p. The probability of r successes in N trials is (N r) p^r (1−p)^{N−r}, where (N r) denotes the binomial coefficient N!/(r!(N−r)!). Note that the correct overall normalization automatically follows from

∑_{r=0}^{N} (N r) p^r (1−p)^{N−r} = [p + (1−p)]^N = 1^N = 1.

The overall probability for r successes is a competition between (N r), which is maximum at r ≈ N/2, and p^r(1−p)^{N−r}, which is largest for small r when p < 1/2 (or large r for p > 1/2). In class, we considered the case of rolling a standard six-sided die, with a roll of 6 considered a success, so p = 1/6. For a larger number N of trials, the distribution of the expected number of successes became more narrowly peaked and more symmetrical about r = N/6. To analyze this in the framework outlined above, let the random variable X_i = 1 if the i-th trial is a success, and X_i = 0 otherwise. Then ⟨X_i⟩ = p. Let X = X₁ + X₂ + ... + X_N count the total number of successes. Then it follows that the average satisfies

⟨X⟩ = ∑_i ⟨X_i⟩ = Np.  (B1)

From V[X_i] = ⟨X_i²⟩ − ⟨X_i⟩² = p − p² = p(1−p), it follows that the variance satisfies

V[X] = ∑_i V[X_i] = Np(1−p),  (B2)

and the standard deviation is σ = √V[X] = √(Np(1−p)). This explains the observation that the probability distribution gets more sharply peaked as the number of trials increases, since the width of the distribution (σ) divided by the average ⟨X⟩ behaves as σ/⟨X⟩ ∝ √N/N = 1/√N, a decreasing function of N.
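Formulas (B1) and (B2) can be checked against the binomial distribution directly; a Python sketch (the choice N = 60 rolls is illustrative, not from the notes):

```python
from math import comb

N, p = 60, 1 / 6   # 60 die rolls; success = rolling a 6

# Binomial probabilities for r = 0, ..., N successes.
P = [comb(N, r) * p**r * (1 - p) ** (N - r) for r in range(N + 1)]

mean = sum(r * P[r] for r in range(N + 1))
var = sum(r * r * P[r] for r in range(N + 1)) - mean**2

assert abs(sum(P) - 1) < 1e-12            # normalization
assert abs(mean - N * p) < 1e-9           # (B1): <X> = Np = 10
assert abs(var - N * p * (1 - p)) < 1e-9  # (B2): V[X] = Np(1-p) = 25/3
```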
By the central limit theorem (not proven in class), many such distributions, under fairly relaxed assumptions, tend for a sufficiently large number of trials to a Gaussian or normal distribution, of the form

P(x) = (1/(σ√(2π))) e^{−(x−µ)²/(2σ²)}.  (G)

This is properly normalized, with ∫ dx P(x) = 1, and also has ∫ dx x P(x) = µ and ∫ dx x² P(x) = σ² + µ², so the above distribution has mean µ and variance σ². Setting µ = Np and σ = √(Np(1−p)) for p = 1/6 in (G) thus gives a good approximation to the distribution of successful rolls of 6 for a large number of trials in the example above.
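The quality of the Gaussian approximation can be seen numerically; the following Python sketch (N = 600 is an illustrative choice, not from the notes) compares the exact binomial probability at the peak with the density (G):

```python
from math import comb, exp, pi, sqrt

N, p = 600, 1 / 6
mu, sigma = N * p, sqrt(N * p * (1 - p))   # mean 100, sigma ~ 9.13

def binom(r):
    """Exact probability of r successes in N Bernoulli trials."""
    return comb(N, r) * p**r * (1 - p) ** (N - r)

def gauss(x):
    """Normal density (G) with matching mean and variance."""
    return exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * sqrt(2 * pi))

# Near the peak the two agree to better than a few percent at this N.
assert abs(binom(100) - gauss(100)) / binom(100) < 0.05
```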

Telegraphic review of exponentials and logarithms: x^m represents the result of multiplying m factors of x, and satisfies the useful relations

x^m x^n = x^{m+n},  (x^n)^m = x^{nm},  x^0 = 1.  (E1)

The exponential function f(x) = y^x is frequently employed with y = e, where e = 2.71828... is a transcendental number, which can be introduced as follows. Consider investing $1 at a 100%/year interest rate for a year. If interest is compounded a single time, at the end of the year, then the result is $1·(1+1) = $2. If instead interest is compounded twice, at 50% after a half year and the remaining 50% at the end of the year, then the result is the slightly larger $1·(1+1/2)(1+1/2) = $2.25. Now consider breaking up the time interval into N pieces, so that interest is compounded N times; then the total is (1 + 1/N)^N. In the limit that N becomes arbitrarily large (interest compounded continuously), the result converges to

lim_{N→∞} (1 + 1/N)^N = e.  (D1)

If the interest rate is instead only 5% per year, compounded continuously, then the result would be lim_{N→∞} (1 + .05/N)^N = lim_{M→∞} (1 + 1/M)^{.05M} = e^{.05} ≈ 1.051. For general interest rate x, it follows that lim_{N→∞} (1 + x/N)^N = lim_{M→∞} (1 + 1/M)^{Mx} = e^x. This function can also be written as an infinite sum by expanding the product, and keeping only the terms that survive in the large-N limit:

e^x = lim_{N→∞} (1 + x/N)^N = lim_{N→∞} [1 + N(x/N) + (N(N−1)/2!)(x/N)² + ... + (N!/(m!(N−m)!))(x/N)^m + ...] = 1 + x + x²/2! + x³/3! + ... + x^m/m! + ... = ∑_{n=0}^{∞} x^n/n!.  (E2)

The logarithm function is the inverse of the exponential function: if y^x = X, then x = log_y X. For example, 10³ = 1000, so log₁₀ 1000 = 3; and 2¹⁰ = 1024, so log₂ 1024 = 10. Useful properties of the logarithm function follow directly from the corresponding properties (E1) of the exponential function:

log XZ = log X + log Z,  log X^s = s log X,  log 1 = 0.  (L1)

(For example, if X = y^x and Z = y^z, then log_y XZ = log_y (y^x y^z) = x + z = log_y X + log_y Z, and log_y X^s = log_y y^{xs} = xs = s log_y X.)
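The compound-interest limits (D1) and lim (1 + x/N)^N = e^x are easy to check numerically; a Python sketch, not part of the original notes:

```python
from math import e, exp

# Compounding $1 at 100%/year, N times a year: (1 + 1/N)^N increases toward e.
assert (1 + 1 / 1) ** 1 == 2.0    # compounded once:  $2
assert (1 + 1 / 2) ** 2 == 2.25   # compounded twice: $2.25
assert abs((1 + 1 / 10**7) ** 10**7 - e) < 1e-6

# General rate x: (1 + x/N)^N -> e^x, e.g. 5%/year compounded continuously.
assert abs((1 + 0.05 / 10**7) ** 10**7 - exp(0.05)) < 1e-6
```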
Logarithms to different bases y, z are simply related by a numerical factor log_z y: log_y x = log_z x / log_z y. Note that the rough relation 2¹⁰ = 1024 ≈ 10³ implies that 10^{3/10} ≈ 2 and 2^{10/3} ≈ 10, which permits estimating log₁₀ 2 ≈ 3/10 and log₂ 10 ≈ 10/3, good approximations to the actual values 0.30103... and 3.32193..., respectively.
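A quick numeric check of the change-of-base formula and the 2¹⁰ ≈ 10³ estimates (Python sketch, not part of the original notes):

```python
from math import log10, log2

# Change of base: log_2 x = log_10 x / log_10 2.
assert abs(log2(1000) - log10(1000) / log10(2)) < 1e-12

# From 2^10 = 1024 ~ 10^3: log_10 2 ~ 3/10 and log_2 10 ~ 10/3.
assert abs(log10(2) - 3 / 10) < 0.002  # actual value 0.30103...
assert abs(log2(10) - 10 / 3) < 0.02   # actual value 3.32193...
```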

(The relation follows from log_z x = log_z (y^{log_y x}) = log_y x · log_z y.) The special notation ln = log_e is used for logarithms to the base e. For example, ln 2 = log_e 2 ≈ 0.693. The series expansion for the function ln(1+t) = a₁t + a₂t² + a₃t³ + ... can be determined by setting e^x = 1 + t, with x = a₁t + a₂t² + a₃t³ + ... (defined so that t is small when x is small), and substituting in (E2): 1 + t = 1 + (a₁t + a₂t² + a₃t³ + ...) + (1/2!)(a₁t + a₂t² + a₃t³ + ...)² + (1/3!)(a₁t + a₂t² + a₃t³ + ...)³ + ... Comparing the coefficients of powers of t gives a₁ = 1, a₂ + a₁²/2 = 0, a₃ + 2a₁a₂/2 + a₁³/6 = 0, ..., and we infer that a₂ = −1/2, a₃ = 1/2 − 1/6 = 1/3, i.e., ln(1+t) = t − t²/2 + t³/3 − .... Letting t → −t, the full result can be written in the form

ln(1−t) = −t − t²/2 − t³/3 − ... = −∑_{n=1}^{∞} tⁿ/n.  (L2)

Example: Suppose a software program has a bug that occurs with probability p = 1/1000 on each run. a) How many times do we need to run the program in order to have a 50% chance of seeing the bug occur? b) What is the probability of seeing the bug if we run the program 1000 times? a) The probability of not seeing the problem occur in a single run is P₁ = 1 − p, so, assuming that successive runs are independent, the probability of seeing no problem after N runs is P_N = (1−p)^N. For (1−p)^N = 1/2, we find N ln(1−p) = ln(1/2), or, using (L2) to lowest order, Np ≈ ln 2, so N ≈ (1/p) ln 2 = 1000 ln 2 ≈ 693; after N = 693 runs the probability of no problem falls to roughly 50%. b) P₁₀₀₀ = (1 − 1/1000)¹⁰⁰⁰; since 1000 is large, by (D1) this is approximated by P₁₀₀₀ ≈ e⁻¹. (Equivalently, using (L2) we can estimate ln P₁₀₀₀ = 1000 ln(1 − 1/1000) ≈ 1000·(−1/1000) = −1, so again P₁₀₀₀ ≈ e⁻¹.) The probability of seeing the bug is therefore 1 − P₁₀₀₀ ≈ 1 − e⁻¹ ≈ .632, i.e., it has climbed to roughly 63.2% after 1000 runs. This problem is directly analogous to the problem of nuclear decay, where p is instead the probability per unit time of decay of some nuclear species. (In the original problem, p was a probability per trial, so a trial becomes an infinitesimal time step in the nuclear problem.)
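The arithmetic in the bug example can be checked directly; a Python sketch, not part of the original notes:

```python
from math import exp, log

p = 1 / 1000   # probability of hitting the bug on any single run

# (a) Runs needed for a 50% chance of seeing the bug: solve (1-p)^N = 1/2.
N = log(1 / 2) / log(1 - p)
assert round(N) == 693                   # close to (1/p) ln 2 = 1000 ln 2

# (b) Probability of missing the bug in 1000 runs is close to e^-1.
P_miss = (1 - p) ** 1000
assert abs(P_miss - exp(-1)) < 1e-3
assert abs((1 - P_miss) - 0.632) < 1e-3  # ~63.2% chance of seeing the bug
```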
The half-life, i.e., the time after which half of some sample is likely to have decayed, is given by the equivalent formula τ = (1/p) ln 2.

[Figure: probability of seeing the bug, given by 1 − (999/1000)^N ≈ 1 − e^{−N/1000}, as a function of the number of runs N. The points plotted are (693, .5), (1000, .632), and (5000, .993), corresponding to 50% at N = 693, 63.2% at N = 1000, and 99.3% at N = 5000.]


More information

Quick Tour of Basic Probability Theory and Linear Algebra

Quick Tour of Basic Probability Theory and Linear Algebra Quick Tour of and Linear Algebra Quick Tour of and Linear Algebra CS224w: Social and Information Network Analysis Fall 2011 Quick Tour of and Linear Algebra Quick Tour of and Linear Algebra Outline Definitions

More information

Random Variable. Discrete Random Variable. Continuous Random Variable. Discrete Random Variable. Discrete Probability Distribution

Random Variable. Discrete Random Variable. Continuous Random Variable. Discrete Random Variable. Discrete Probability Distribution Random Variable Theoretical Probability Distribution Random Variable Discrete Probability Distributions A variable that assumes a numerical description for the outcome of a random eperiment (by chance).

More information

CS206 Review Sheet 3 October 24, 2018

CS206 Review Sheet 3 October 24, 2018 CS206 Review Sheet 3 October 24, 2018 After ourintense focusoncounting, wecontinue withthestudyofsomemoreofthebasic notions from Probability (though counting will remain in our thoughts). An important

More information

2. AXIOMATIC PROBABILITY

2. AXIOMATIC PROBABILITY IA Probability Lent Term 2. AXIOMATIC PROBABILITY 2. The axioms The formulation for classical probability in which all outcomes or points in the sample space are equally likely is too restrictive to develop

More information

Chap 4 Probability p227 The probability of any outcome in a random phenomenon is the proportion of times the outcome would occur in a long series of

Chap 4 Probability p227 The probability of any outcome in a random phenomenon is the proportion of times the outcome would occur in a long series of Chap 4 Probability p227 The probability of any outcome in a random phenomenon is the proportion of times the outcome would occur in a long series of repetitions. (p229) That is, probability is a long-term

More information

Chapter 14. From Randomness to Probability. Copyright 2012, 2008, 2005 Pearson Education, Inc.

Chapter 14. From Randomness to Probability. Copyright 2012, 2008, 2005 Pearson Education, Inc. Chapter 14 From Randomness to Probability Copyright 2012, 2008, 2005 Pearson Education, Inc. Dealing with Random Phenomena A random phenomenon is a situation in which we know what outcomes could happen,

More information

Chapter 2 Class Notes

Chapter 2 Class Notes Chapter 2 Class Notes Probability can be thought of in many ways, for example as a relative frequency of a long series of trials (e.g. flips of a coin or die) Another approach is to let an expert (such

More information

Executive Assessment. Executive Assessment Math Review. Section 1.0, Arithmetic, includes the following topics:

Executive Assessment. Executive Assessment Math Review. Section 1.0, Arithmetic, includes the following topics: Executive Assessment Math Review Although the following provides a review of some of the mathematical concepts of arithmetic and algebra, it is not intended to be a textbook. You should use this chapter

More information

Review: Probability. BM1: Advanced Natural Language Processing. University of Potsdam. Tatjana Scheffler

Review: Probability. BM1: Advanced Natural Language Processing. University of Potsdam. Tatjana Scheffler Review: Probability BM1: Advanced Natural Language Processing University of Potsdam Tatjana Scheffler tatjana.scheffler@uni-potsdam.de October 21, 2016 Today probability random variables Bayes rule expectation

More information

CME 106: Review Probability theory

CME 106: Review Probability theory : Probability theory Sven Schmit April 3, 2015 1 Overview In the first half of the course, we covered topics from probability theory. The difference between statistics and probability theory is the following:

More information

I - Probability. What is Probability? the chance of an event occuring. 1classical probability. 2empirical probability. 3subjective probability

I - Probability. What is Probability? the chance of an event occuring. 1classical probability. 2empirical probability. 3subjective probability What is Probability? the chance of an event occuring eg 1classical probability 2empirical probability 3subjective probability Section 2 - Probability (1) Probability - Terminology random (probability)

More information

Origins of Probability Theory

Origins of Probability Theory 1 16.584: INTRODUCTION Theory and Tools of Probability required to analyze and design systems subject to uncertain outcomes/unpredictability/randomness. Such systems more generally referred to as Experiments.

More information

CS 361: Probability & Statistics

CS 361: Probability & Statistics February 26, 2018 CS 361: Probability & Statistics Random variables The discrete uniform distribution If every value of a discrete random variable has the same probability, then its distribution is called

More information

Deviations from the Mean

Deviations from the Mean Deviations from the Mean The Markov inequality for non-negative RVs Variance Definition The Bienaymé Inequality For independent RVs The Chebyeshev Inequality Markov s Inequality For any non-negative random

More information

STAT2201. Analysis of Engineering & Scientific Data. Unit 3

STAT2201. Analysis of Engineering & Scientific Data. Unit 3 STAT2201 Analysis of Engineering & Scientific Data Unit 3 Slava Vaisman The University of Queensland School of Mathematics and Physics What we learned in Unit 2 (1) We defined a sample space of a random

More information

EE514A Information Theory I Fall 2013

EE514A Information Theory I Fall 2013 EE514A Information Theory I Fall 2013 K. Mohan, Prof. J. Bilmes University of Washington, Seattle Department of Electrical Engineering Fall Quarter, 2013 http://j.ee.washington.edu/~bilmes/classes/ee514a_fall_2013/

More information

FINAL EXAM: Monday 8-10am

FINAL EXAM: Monday 8-10am ECE 30: Probabilistic Methods in Electrical and Computer Engineering Fall 016 Instructor: Prof. A. R. Reibman FINAL EXAM: Monday 8-10am Fall 016, TTh 3-4:15pm (December 1, 016) This is a closed book exam.

More information

Probability Year 9. Terminology

Probability Year 9. Terminology Probability Year 9 Terminology Probability measures the chance something happens. Formally, we say it measures how likely is the outcome of an event. We write P(result) as a shorthand. An event is some

More information

Statistical Theory 1

Statistical Theory 1 Statistical Theory 1 Set Theory and Probability Paolo Bautista September 12, 2017 Set Theory We start by defining terms in Set Theory which will be used in the following sections. Definition 1 A set is

More information

Counting principles, including permutations and combinations.

Counting principles, including permutations and combinations. 1 Counting principles, including permutations and combinations. The binomial theorem: expansion of a + b n, n ε N. THE PRODUCT RULE If there are m different ways of performing an operation and for each

More information

Outline. Probability. Math 143. Department of Mathematics and Statistics Calvin College. Spring 2010

Outline. Probability. Math 143. Department of Mathematics and Statistics Calvin College. Spring 2010 Outline Math 143 Department of Mathematics and Statistics Calvin College Spring 2010 Outline Outline 1 Review Basics Random Variables Mean, Variance and Standard Deviation of Random Variables 2 More Review

More information

Introduction to Stochastic Processes

Introduction to Stochastic Processes Stat251/551 (Spring 2017) Stochastic Processes Lecture: 1 Introduction to Stochastic Processes Lecturer: Sahand Negahban Scribe: Sahand Negahban 1 Organization Issues We will use canvas as the course webpage.

More information

Statistics for Managers Using Microsoft Excel/SPSS Chapter 4 Basic Probability And Discrete Probability Distributions

Statistics for Managers Using Microsoft Excel/SPSS Chapter 4 Basic Probability And Discrete Probability Distributions Statistics for Managers Using Microsoft Excel/SPSS Chapter 4 Basic Probability And Discrete Probability Distributions 1999 Prentice-Hall, Inc. Chap. 4-1 Chapter Topics Basic Probability Concepts: Sample

More information

Properties of Probability

Properties of Probability Econ 325 Notes on Probability 1 By Hiro Kasahara Properties of Probability In statistics, we consider random experiments, experiments for which the outcome is random, i.e., cannot be predicted with certainty.

More information

Advanced Herd Management Probabilities and distributions

Advanced Herd Management Probabilities and distributions Advanced Herd Management Probabilities and distributions Anders Ringgaard Kristensen Slide 1 Outline Probabilities Conditional probabilities Bayes theorem Distributions Discrete Continuous Distribution

More information

Compound Events. The event E = E c (the complement of E) is the event consisting of those outcomes which are not in E.

Compound Events. The event E = E c (the complement of E) is the event consisting of those outcomes which are not in E. Compound Events Because we are using the framework of set theory to analyze probability, we can use unions, intersections and complements to break complex events into compositions of events for which it

More information

MA : Introductory Probability

MA : Introductory Probability MA 320-001: Introductory Probability David Murrugarra Department of Mathematics, University of Kentucky http://www.math.uky.edu/~dmu228/ma320/ Spring 2017 David Murrugarra (University of Kentucky) MA 320:

More information

Chapter 1 (Basic Probability)

Chapter 1 (Basic Probability) Chapter 1 (Basic Probability) What is probability? Consider the following experiments: 1. Count the number of arrival requests to a web server in a day. 2. Determine the execution time of a program. 3.

More information

CIVL Why are we studying probability and statistics? Learning Objectives. Basic Laws and Axioms of Probability

CIVL Why are we studying probability and statistics? Learning Objectives. Basic Laws and Axioms of Probability CIVL 3103 Basic Laws and Axioms of Probability Why are we studying probability and statistics? How can we quantify risks of decisions based on samples from a population? How should samples be selected

More information

Mathematical Foundations of Computer Science Lecture Outline October 18, 2018

Mathematical Foundations of Computer Science Lecture Outline October 18, 2018 Mathematical Foundations of Computer Science Lecture Outline October 18, 2018 The Total Probability Theorem. Consider events E and F. Consider a sample point ω E. Observe that ω belongs to either F or

More information

STAT 430/510 Probability

STAT 430/510 Probability STAT 430/510 Probability Hui Nie Lecture 3 May 28th, 2009 Review We have discussed counting techniques in Chapter 1. Introduce the concept of the probability of an event. Compute probabilities in certain

More information

Chapter 2. Conditional Probability and Independence. 2.1 Conditional Probability

Chapter 2. Conditional Probability and Independence. 2.1 Conditional Probability Chapter 2 Conditional Probability and Independence 2.1 Conditional Probability Probability assigns a likelihood to results of experiments that have not yet been conducted. Suppose that the experiment has

More information

Probability Theory. Introduction to Probability Theory. Principles of Counting Examples. Principles of Counting. Probability spaces.

Probability Theory. Introduction to Probability Theory. Principles of Counting Examples. Principles of Counting. Probability spaces. Probability Theory To start out the course, we need to know something about statistics and probability Introduction to Probability Theory L645 Advanced NLP Autumn 2009 This is only an introduction; for

More information

Statistics for Managers Using Microsoft Excel (3 rd Edition)

Statistics for Managers Using Microsoft Excel (3 rd Edition) Statistics for Managers Using Microsoft Excel (3 rd Edition) Chapter 4 Basic Probability and Discrete Probability Distributions 2002 Prentice-Hall, Inc. Chap 4-1 Chapter Topics Basic probability concepts

More information

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems Review of Basic Probability The fundamentals, random variables, probability distributions Probability mass/density functions

More information

Fundamentals of Probability CE 311S

Fundamentals of Probability CE 311S Fundamentals of Probability CE 311S OUTLINE Review Elementary set theory Probability fundamentals: outcomes, sample spaces, events Outline ELEMENTARY SET THEORY Basic probability concepts can be cast in

More information

Brief Review of Probability

Brief Review of Probability Brief Review of Probability Nuno Vasconcelos (Ken Kreutz-Delgado) ECE Department, UCSD Probability Probability theory is a mathematical language to deal with processes or experiments that are non-deterministic

More information

FE 490 Engineering Probability and Statistics. Donald E.K. Martin Department of Statistics

FE 490 Engineering Probability and Statistics. Donald E.K. Martin Department of Statistics FE 490 Engineering Probability and Statistics Donald E.K. Martin Department of Statistics 1 Dispersion, Mean, Mode 1. The population standard deviation of the data points 2,1,6 is: (A) 1.00 (B) 1.52 (C)

More information

Probability Theory for Machine Learning. Chris Cremer September 2015

Probability Theory for Machine Learning. Chris Cremer September 2015 Probability Theory for Machine Learning Chris Cremer September 2015 Outline Motivation Probability Definitions and Rules Probability Distributions MLE for Gaussian Parameter Estimation MLE and Least Squares

More information

Review of Statistics

Review of Statistics Review of Statistics Topics Descriptive Statistics Mean, Variance Probability Union event, joint event Random Variables Discrete and Continuous Distributions, Moments Two Random Variables Covariance and

More information