Part IA Probability. Theorems. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015
Part IA Probability
Theorems
Based on lectures by R. Weber
Notes taken by Dexter Chua
Lent 2015

These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures. They are nowhere near accurate representations of what was actually lectured, and in particular, all errors are almost surely mine.

Basic concepts
Classical probability, equally likely outcomes. Combinatorial analysis, permutations and combinations. Stirling's formula (asymptotics for $\log n!$ proved). [3]

Axiomatic approach
Axioms (countable case). Probability spaces. Inclusion-exclusion formula. Continuity and subadditivity of probability measures. Independence. Binomial, Poisson and geometric distributions. Relation between Poisson and binomial distributions. Conditional probability, Bayes's formula. Examples, including Simpson's paradox. [5]

Discrete random variables
Expectation. Functions of a random variable, indicator function, variance, standard deviation. Covariance, independence of random variables. Generating functions: sums of independent random variables, random sum formula, moments. Conditional expectation. Random walks: gambler's ruin, recurrence relations. Difference equations and their solution. Mean time to absorption. Branching processes: generating functions and extinction probability. Combinatorial applications of generating functions. [7]

Continuous random variables
Distributions and density functions. Expectations; expectation of a function of a random variable. Uniform, normal and exponential random variables. Memoryless property of exponential distribution. Joint distributions: transformation of random variables (including Jacobians), examples. Simulation: generating continuous random variables, independent normal random variables. Geometrical probability: Bertrand's paradox, Buffon's needle. Correlation coefficient, bivariate normal random variables. [6]

Inequalities and limits
Markov's inequality, Chebyshev's inequality. Weak law of large numbers. Convexity: Jensen's inequality for general random variables, AM/GM inequality. Moment generating functions and statement (no proof) of continuity theorem. Statement of central limit theorem and sketch of proof. Examples, including sampling. [3]
Contents

0 Introduction
1 Classical probability
  1.1 Classical probability
  1.2 Counting
  1.3 Stirling's formula
2 Axioms of probability
  2.1 Axioms and definitions
  2.2 Inequalities and formulae
  2.3 Independence
  2.4 Important discrete distributions
  2.5 Conditional probability
3 Discrete random variables
  3.1 Discrete random variables
  3.2 Inequalities
  3.3 Weak law of large numbers
  3.4 Multiple random variables
  3.5 Probability generating functions
4 Interesting problems
  4.1 Branching processes
  4.2 Random walk and gambler's ruin
5 Continuous random variables
  5.1 Continuous random variables
  5.2 Stochastic ordering and inspection paradox
  5.3 Jointly distributed random variables
  5.4 Geometric probability
  5.5 The normal distribution
  5.6 Transformation of random variables
  5.7 Moment generating functions
6 More distributions
  6.1 Cauchy distribution
  6.2 Gamma distribution
  6.3 Beta distribution*
  6.4 More on the normal distribution
  6.5 Multivariate normal
7 Central limit theorem
8 Summary of distributions
  8.1 Discrete distributions
  8.2 Continuous distributions
0 Introduction
1 Classical probability

1.1 Classical probability

1.2 Counting

Theorem (Fundamental rule of counting). Suppose we have to make $r$ multiple choices in sequence. There are $m_1$ possibilities for the first choice, $m_2$ possibilities for the second, etc. Then the total number of choices is $m_1 m_2 \cdots m_r$.

1.3 Stirling's formula

Proposition. $\log n! \sim n \log n$.

Theorem (Stirling's formula). As $n \to \infty$,
$$\log\left(\frac{n!\,e^n}{n^{n+1/2}}\right) = \log \sqrt{2\pi} + O\left(\frac{1}{n}\right).$$

Corollary.
$$n! \sim \sqrt{2\pi}\, n^{n+1/2} e^{-n}.$$

Proposition (non-examinable). We use the $1/(12n)$ term from the proof above to get a better approximation:
$$\sqrt{2\pi}\, n^{n+1/2} e^{-n + \frac{1}{12n+1}} \le n! \le \sqrt{2\pi}\, n^{n+1/2} e^{-n + \frac{1}{12n}}.$$
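As a quick numerical sanity check of the corollary (illustrative only, not part of the original notes), one can compare $n!$ against $\sqrt{2\pi}\,n^{n+1/2}e^{-n}$ and watch the ratio approach 1 at rate roughly $1/(12n)$:

```python
import math

def stirling(n):
    """Stirling approximation: sqrt(2*pi) * n^(n + 1/2) * e^(-n)."""
    return math.sqrt(2 * math.pi) * n ** (n + 0.5) * math.exp(-n)

# The ratio n! / stirling(n) should approach 1 as n grows;
# the non-examinable proposition bounds the error by e^(1/(12n)).
for n in [5, 10, 50]:
    ratio = math.factorial(n) / stirling(n)
    print(n, ratio)
```

For $n = 50$ the ratio is already within about $1/600$ of 1, consistent with the $e^{1/(12n)}$ upper bound.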
2 Axioms of probability

2.1 Axioms and definitions

Theorem.
(i) $P(\emptyset) = 0$.
(ii) $P(A^C) = 1 - P(A)$.
(iii) $A \subseteq B \Rightarrow P(A) \le P(B)$.
(iv) $P(A \cup B) = P(A) + P(B) - P(A \cap B)$.

Theorem. If $A_1, A_2, \ldots$ is increasing or decreasing, then
$$\lim_{n \to \infty} P(A_n) = P\left(\lim_{n \to \infty} A_n\right).$$

2.2 Inequalities and formulae

Theorem (Boole's inequality). For any $A_1, A_2, \ldots$,
$$P\left(\bigcup_{i=1}^\infty A_i\right) \le \sum_{i=1}^\infty P(A_i).$$

Theorem (Inclusion-exclusion formula).
$$P\left(\bigcup_{i=1}^n A_i\right) = \sum_{i_1} P(A_{i_1}) - \sum_{i_1 < i_2} P(A_{i_1} \cap A_{i_2}) + \sum_{i_1 < i_2 < i_3} P(A_{i_1} \cap A_{i_2} \cap A_{i_3}) - \cdots + (-1)^{n-1} P(A_1 \cap \cdots \cap A_n).$$

Theorem (Bonferroni's inequalities). For any events $A_1, A_2, \ldots, A_n$ and $1 \le r \le n$, if $r$ is odd, then
$$P\left(\bigcup_{i=1}^n A_i\right) \le \sum_{i_1} P(A_{i_1}) - \sum_{i_1 < i_2} P(A_{i_1} \cap A_{i_2}) + \sum_{i_1 < i_2 < i_3} P(A_{i_1} \cap A_{i_2} \cap A_{i_3}) - \cdots + \sum_{i_1 < i_2 < \cdots < i_r} P(A_{i_1} \cap A_{i_2} \cap \cdots \cap A_{i_r}).$$
If $r$ is even, then
$$P\left(\bigcup_{i=1}^n A_i\right) \ge \sum_{i_1} P(A_{i_1}) - \sum_{i_1 < i_2} P(A_{i_1} \cap A_{i_2}) + \sum_{i_1 < i_2 < i_3} P(A_{i_1} \cap A_{i_2} \cap A_{i_3}) - \cdots - \sum_{i_1 < i_2 < \cdots < i_r} P(A_{i_1} \cap A_{i_2} \cap \cdots \cap A_{i_r}).$$

2.3 Independence

Proposition. If $A$ and $B$ are independent, then $A$ and $B^C$ are independent.
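The inclusion-exclusion formula can be checked exactly on a small finite sample space with the uniform measure. The sketch below (an illustration, not from the notes; the choice of events is arbitrary) forms the alternating sum over all non-empty subfamilies and compares it with the probability of the union:

```python
from fractions import Fraction
from itertools import combinations

# Uniform probability on the finite sample space Omega = {0, ..., 99}.
omega = set(range(100))
P = lambda E: Fraction(len(E), len(omega))

# Three events: multiples of 2, of 3, and of 5.
A = [{x for x in omega if x % m == 0} for m in (2, 3, 5)]

# Right-hand side of inclusion-exclusion: alternating sum over all
# non-empty subfamilies of events, signed by (-1)^(r-1).
rhs = Fraction(0)
for r in range(1, len(A) + 1):
    for sub in combinations(A, r):
        rhs += (-1) ** (r - 1) * P(set.intersection(*sub))

lhs = P(set.union(*A))
print(lhs, rhs)  # both 37/50
```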
2.4 Important discrete distributions

Theorem (Poisson approximation to binomial). Suppose $n \to \infty$ and $p \to 0$ such that $np = \lambda$. Then
$$q_k = \binom{n}{k} p^k (1-p)^{n-k} \to \frac{\lambda^k}{k!} e^{-\lambda}.$$

2.5 Conditional probability

Theorem.
(i) $P(A \cap B) = P(A \mid B)\, P(B)$.
(ii) $P(A \cap B \cap C) = P(A \mid B \cap C)\, P(B \mid C)\, P(C)$.
(iii) $P(A \cap B \mid C) = P(A \mid B \cap C)\, P(B \mid C)$.
(iv) The function $P(\,\cdot \mid B)$ restricted to subsets of $B$ is a probability function (or measure).

Proposition. If $B_i$ is a partition of the sample space, and $A$ is any event, then
$$P(A) = \sum_{i=1}^\infty P(A \cap B_i) = \sum_{i=1}^\infty P(A \mid B_i)\, P(B_i).$$

Theorem (Bayes' formula). Suppose $B_i$ is a partition of the sample space, and $A$ and the $B_i$ all have non-zero probability. Then for any $B_i$,
$$P(B_i \mid A) = \frac{P(A \mid B_i)\, P(B_i)}{\sum_j P(A \mid B_j)\, P(B_j)}.$$
Note that the denominator is simply $P(A)$ written in a fancy way.
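A small numerical illustration of the Poisson approximation (not from the notes; $\lambda = 2$ is an arbitrary choice): hold $np = \lambda$ fixed and watch the worst-case gap between the binomial and Poisson pmfs shrink as $n$ grows.

```python
import math

def binom_pmf(n, p, k):
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(lam, k):
    return lam**k / math.factorial(k) * math.exp(-lam)

lam = 2.0
for n in [10, 100, 1000]:
    p = lam / n
    # Maximum absolute error over k = 0..10 shrinks as n grows.
    err = max(abs(binom_pmf(n, p, k) - poisson_pmf(lam, k)) for k in range(11))
    print(n, err)
```

The error is of order $1/n$, so each tenfold increase in $n$ buys roughly one extra decimal digit of agreement.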
3 Discrete random variables

3.1 Discrete random variables

Theorem.
(i) If $X \ge 0$, then $E[X] \ge 0$.
(ii) If $X \ge 0$ and $E[X] = 0$, then $P(X = 0) = 1$.
(iii) If $a$ and $b$ are constants, then $E[a + bX] = a + bE[X]$.
(iv) If $X$ and $Y$ are random variables, then $E[X + Y] = E[X] + E[Y]$. This is true even if $X$ and $Y$ are not independent.
(v) $E[X]$ is the constant that minimizes $E[(X - c)^2]$ over $c$.

Theorem. For any random variables $X_1, X_2, \ldots, X_n$ for which the following expectations exist,
$$E\left[\sum_{i=1}^n X_i\right] = \sum_{i=1}^n E[X_i].$$

Theorem.
(i) $\operatorname{var} X \ge 0$. If $\operatorname{var} X = 0$, then $P(X = E[X]) = 1$.
(ii) $\operatorname{var}(a + bX) = b^2 \operatorname{var}(X)$. This can be proved by expanding the definition and using the linearity of the expected value.
(iii) $\operatorname{var}(X) = E[X^2] - E[X]^2$, also proven by expanding the definition.

Proposition.
$E[I[A]] = \sum_\omega p(\omega) I[A](\omega) = P(A)$.
$I[A^C] = 1 - I[A]$.
$I[A \cap B] = I[A]\, I[B]$.
$I[A \cup B] = I[A] + I[B] - I[A]\, I[B]$.
$I[A]^2 = I[A]$.

Theorem (Inclusion-exclusion formula).
$$P\left(\bigcup_{i=1}^n A_i\right) = \sum_{i_1} P(A_{i_1}) - \sum_{i_1 < i_2} P(A_{i_1} \cap A_{i_2}) + \sum_{i_1 < i_2 < i_3} P(A_{i_1} \cap A_{i_2} \cap A_{i_3}) - \cdots + (-1)^{n-1} P(A_1 \cap \cdots \cap A_n).$$

Theorem. If $X_1, \ldots, X_n$ are independent random variables, and $f_1, \ldots, f_n$ are functions $\mathbb{R} \to \mathbb{R}$, then $f_1(X_1), \ldots, f_n(X_n)$ are independent random variables.

Theorem. If $X_1, \ldots, X_n$ are independent random variables and all the following expectations exist, then
$$E\left[\prod X_i\right] = \prod E[X_i].$$
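The identities above are all exact, so they can be verified with exact rational arithmetic rather than simulation. A minimal sketch (illustrative; the fair die and the two events are arbitrary choices, not from the notes):

```python
from fractions import Fraction

# Fair six-sided die: uniform probability on {1, ..., 6}.
outcomes = range(1, 7)
p = Fraction(1, 6)

E = lambda f: sum(p * f(w) for w in outcomes)  # expectation of f(omega)

X = lambda w: w
EX = E(X)                                  # 7/2
var = E(lambda w: (w - EX) ** 2)           # 35/12

# var(X) = E[X^2] - E[X]^2
assert var == E(lambda w: w * w) - EX**2

# Indicator identities: I[A ∪ B] = I[A] + I[B] - I[A] I[B], E[I[A]] = P(A).
A = lambda w: 1 if w % 2 == 0 else 0       # event "even"
B = lambda w: 1 if w >= 5 else 0           # event "at least 5"
union = lambda w: A(w) + B(w) - A(w) * B(w)
assert E(union) == Fraction(4, 6)          # P({2, 4, 5, 6}) = 4/6

print(EX, var)  # 7/2 35/12
```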
Corollary. Let $X_1, \ldots, X_n$ be independent random variables, and $f_1, f_2, \ldots, f_n$ functions $\mathbb{R} \to \mathbb{R}$. Then
$$E\left[\prod f_i(X_i)\right] = \prod E[f_i(X_i)].$$

Theorem. If $X_1, X_2, \ldots, X_n$ are independent random variables, then
$$\operatorname{var}\left(\sum X_i\right) = \sum \operatorname{var}(X_i).$$

Corollary. Let $X_1, X_2, \ldots, X_n$ be independent identically distributed random variables (iid rvs). Then
$$\operatorname{var}\left(\frac{1}{n} \sum X_i\right) = \frac{1}{n} \operatorname{var}(X_1).$$

3.2 Inequalities

Proposition. If $f$ is differentiable and $f''(x) \ge 0$ for all $x \in (a, b)$, then it is convex. It is strictly convex if $f''(x) > 0$.

Theorem (Jensen's inequality). If $f : (a, b) \to \mathbb{R}$ is convex, then
$$\sum_{i=1}^n p_i f(x_i) \ge f\left(\sum_{i=1}^n p_i x_i\right)$$
for all $p_1, p_2, \ldots, p_n$ such that $p_i \ge 0$ and $\sum p_i = 1$, and $x_i \in (a, b)$. This says that $E[f(X)] \ge f(E[X])$ (where $P(X = x_i) = p_i$).

If $f$ is strictly convex, then equality holds only if all $x_i$ are equal, i.e. $X$ takes only one possible value.

Corollary (AM-GM inequality). Given $x_1, \ldots, x_n$ positive reals,
$$\left(\prod x_i\right)^{1/n} \le \frac{1}{n} \sum x_i.$$

Theorem (Cauchy-Schwarz inequality). For any two random variables $X, Y$,
$$(E[XY])^2 \le E[X^2]\, E[Y^2].$$

Theorem (Markov inequality). If $X$ is a random variable with $E|X| < \infty$ and $\varepsilon > 0$, then
$$P(|X| \ge \varepsilon) \le \frac{E|X|}{\varepsilon}.$$

Theorem (Chebyshev inequality). If $X$ is a random variable with $E[X^2] < \infty$ and $\varepsilon > 0$, then
$$P(|X| \ge \varepsilon) \le \frac{E[X^2]}{\varepsilon^2}.$$
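The Markov and Chebyshev bounds can be checked exactly on a discrete distribution. A sketch (illustrative; the fair die is an arbitrary choice, and Chebyshev is applied to the centered variable $X - \mu$, the form used for the weak law below):

```python
from fractions import Fraction

# Fair die, X in {1, ..., 6}; check both tail bounds exactly.
p = Fraction(1, 6)
xs = range(1, 7)
EX = sum(p * x for x in xs)           # E[X]   = 7/2
EX2 = sum(p * x * x for x in xs)      # E[X^2] = 91/6
var = EX2 - EX**2                     # 35/12
mu = EX

for eps in range(1, 7):
    # Markov (X is non-negative): P(X >= eps) <= E[X]/eps.
    tail = sum(p for x in xs if x >= eps)
    assert tail <= EX / eps
    # Chebyshev applied to X - mu: P(|X - mu| >= eps) <= var(X)/eps^2.
    dev_tail = sum(p for x in xs if abs(x - mu) >= eps)
    assert dev_tail <= var / eps**2

print("bounds hold")
```

Note how loose the bounds can be: for $\varepsilon = 1$, Chebyshev gives $35/12 > 1$, which is trivially true.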
3.3 Weak law of large numbers

Theorem (Weak law of large numbers). Let $X_1, X_2, \ldots$ be iid random variables, with mean $\mu$ and variance $\sigma^2$. Let $S_n = \sum_{i=1}^n X_i$. Then for all $\varepsilon > 0$,
$$P\left(\left|\frac{S_n}{n} - \mu\right| \ge \varepsilon\right) \to 0$$
as $n \to \infty$. We say $\frac{S_n}{n}$ tends to $\mu$ (in probability), or
$$\frac{S_n}{n} \xrightarrow{p} \mu.$$

Theorem (Strong law of large numbers).
$$P\left(\frac{S_n}{n} \to \mu \text{ as } n \to \infty\right) = 1.$$
We say
$$\frac{S_n}{n} \xrightarrow{\text{as}} \mu,$$
where "as" means "almost surely".

3.4 Multiple random variables

Proposition.
(i) $\operatorname{cov}(X, c) = 0$ for constant $c$.
(ii) $\operatorname{cov}(X + c, Y) = \operatorname{cov}(X, Y)$.
(iii) $\operatorname{cov}(X, Y) = \operatorname{cov}(Y, X)$.
(iv) $\operatorname{cov}(X, Y) = E[XY] - E[X]E[Y]$.
(v) $\operatorname{cov}(X, X) = \operatorname{var}(X)$.
(vi) $\operatorname{var}(X + Y) = \operatorname{var}(X) + \operatorname{var}(Y) + 2\operatorname{cov}(X, Y)$.
(vii) If $X$, $Y$ are independent, $\operatorname{cov}(X, Y) = 0$.

Proposition. $|\operatorname{corr}(X, Y)| \le 1$.

Theorem. If $X$ and $Y$ are independent, then $E[X \mid Y] = E[X]$.

Theorem (Tower property of conditional expectation).
$$E_Y[E_X[X \mid Y]] = E_X[X],$$
where the subscripts indicate what variable the expectation is taken over.
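The weak law can be watched happening in a short simulation (illustrative only; fair-coin $X_i$ with $\mu = 1/2$ and $\varepsilon = 0.05$ are arbitrary choices, and the run is seeded for reproducibility):

```python
import random

random.seed(0)

# Estimate P(|S_n/n - mu| >= eps) by Monte Carlo for fair-coin X_i.
def tail_prob(n, eps=0.05, trials=1000):
    bad = 0
    for _ in range(trials):
        s = sum(random.randint(0, 1) for _ in range(n))
        if abs(s / n - 0.5) >= eps:
            bad += 1
    return bad / trials

probs = [tail_prob(n) for n in [10, 100, 1000]]
print(probs)  # roughly 0.75, 0.37, then near 0
```

By Chebyshev the tail is at most $\sigma^2 / (n \varepsilon^2) = 0.25/(0.0025\,n)$, which already forces it to 0; the simulated values fall much faster than that bound.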
3.5 Probability generating functions

Theorem. The distribution of $X$ is uniquely determined by its probability generating function.

Theorem (Abel's lemma).
$$E[X] = \lim_{z \to 1} p'(z).$$
If $p'(z)$ is continuous, then simply $E[X] = p'(1)$.

Theorem.
$$E[X(X-1)] = \lim_{z \to 1} p''(z).$$

Theorem. Suppose $X_1, X_2, \ldots, X_n$ are independent random variables with pgfs $p_1, p_2, \ldots, p_n$. Then the pgf of $X_1 + X_2 + \cdots + X_n$ is $p_1(z)\, p_2(z) \cdots p_n(z)$.
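Since a pgf of a bounded discrete random variable is just a polynomial whose coefficient of $z^k$ is $P(X = k)$, the last theorem says the distribution of a sum is the product of polynomials. A sketch with two fair dice (an illustration, not from the notes):

```python
from fractions import Fraction

# pgf of a fair die as a coefficient list: p(z) = (z + z^2 + ... + z^6)/6,
# with index k holding P(X = k).
die = [Fraction(0)] + [Fraction(1, 6)] * 6

def poly_mul(a, b):
    """Multiply two polynomials given as coefficient lists."""
    out = [Fraction(0)] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

# pgf of X1 + X2 is p(z)^2, so its coefficient list is exactly the
# distribution of the sum of two dice.
two_dice = poly_mul(die, die)
print(two_dice[7])  # P(sum = 7) = 1/6

# Abel's lemma for a polynomial pgf: E[X] = p'(1) = sum of k * p_k.
mean = sum(k * c for k, c in enumerate(die))
print(mean)  # 7/2
```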
4 Interesting problems

4.1 Branching processes

Theorem.
$$F_{n+1}(z) = F_n(F(z)) = F(F(F(\cdots F(z) \cdots))) = F(F_n(z)).$$

Theorem. Suppose
$$E[X_1] = \sum k p_k = \mu \quad\text{and}\quad \operatorname{var}(X_1) = E[(X - \mu)^2] = \sum (k - \mu)^2 p_k < \infty.$$
Then
$$E[X_n] = \mu^n, \qquad \operatorname{var} X_n = \sigma^2 \mu^{n-1}(1 + \mu + \mu^2 + \cdots + \mu^{n-1}).$$

Theorem. The probability of extinction $q$ is the smallest root of the equation $q = F(q)$. Write $\mu = E[X_1]$. Then if $\mu \le 1$, then $q = 1$; if $\mu > 1$, then $q < 1$.

4.2 Random walk and gambler's ruin
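The extinction probability can be computed by iterating $q \leftarrow F(q)$ from $q_0 = 0$, which converges to the smallest fixed point of $F$. A sketch for Poisson($\mu$) offspring, whose pgf is $F(q) = e^{\mu(q-1)}$ (the Poisson choice is illustrative, not from the notes):

```python
import math

# Poisson(mu) offspring: pgf F(q) = exp(mu * (q - 1)).
# Iterating q <- F(q) from q = 0 converges to the smallest root of
# q = F(q), which by the theorem is the extinction probability.
def extinction_prob(mu, iters=200):
    F = lambda q: math.exp(mu * (q - 1))
    q = 0.0
    for _ in range(iters):
        q = F(q)
    return q

print(extinction_prob(0.9))  # subcritical mu <= 1: approaches q = 1
print(extinction_prob(1.5))  # supercritical mu > 1: q < 1, about 0.417
```

Iteration from 0 is the natural scheme here because $q_n = F_n(0)$ is exactly the probability of extinction by generation $n$, which increases to $q$.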
5 Continuous random variables

5.1 Continuous random variables

Proposition. The exponential random variable is memoryless, i.e.
$$P(X \ge x + z \mid X \ge x) = P(X \ge z).$$
This means that, say, if $X$ measures the lifetime of a light bulb, knowing it has already lasted for 3 hours does not give any information about how much longer it will last.

Theorem. If $X$ is a continuous random variable, then
$$E[X] = \int_0^\infty P(X \ge x)\,dx - \int_0^\infty P(X \le -x)\,dx.$$

5.2 Stochastic ordering and inspection paradox

5.3 Jointly distributed random variables

Theorem. If $X$ and $Y$ are jointly continuous random variables, then they are individually continuous random variables.

Proposition. For independent continuous random variables $X_i$,
(i) $E[\prod X_i] = \prod E[X_i]$
(ii) $\operatorname{var}(\sum X_i) = \sum \operatorname{var}(X_i)$

5.4 Geometric probability

5.5 The normal distribution

Proposition.
$$\int_{-\infty}^\infty \frac{1}{\sqrt{2\pi\sigma^2}} e^{-\frac{1}{2\sigma^2}(x - \mu)^2}\,dx = 1.$$

Proposition. $E[X] = \mu$.

Proposition. $\operatorname{var}(X) = \sigma^2$.

5.6 Transformation of random variables

Theorem. If $X$ is a continuous random variable with a pdf $f(x)$, and $h(x)$ is a continuous, strictly increasing function with $h^{-1}(x)$ differentiable, then $Y = h(X)$ is a random variable with pdf
$$f_Y(y) = f_X(h^{-1}(y)) \frac{d}{dy} h^{-1}(y).$$

Theorem. Let $U \sim U[0, 1]$. For any strictly increasing distribution function $F$, the random variable $X = F^{-1}(U)$ has distribution function $F$.

Proposition. $(Y_1, \ldots, Y_n)$ has density
$$g(y_1, \ldots, y_n) = f(s_1(y_1, \ldots, y_n), \ldots, s_n(y_1, \ldots, y_n))\, |J|$$
if $(y_1, \ldots, y_n) \in S$, and $0$ otherwise.
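The $X = F^{-1}(U)$ theorem is the standard recipe for simulating continuous random variables. A seeded sketch for the exponential, where $F(x) = 1 - e^{-\lambda x}$ inverts in closed form to $F^{-1}(u) = -\log(1-u)/\lambda$; it also spot-checks the memoryless property (illustrative parameters, not from the notes):

```python
import math
import random

random.seed(1)

# Inverse transform sampling: X = F^{-1}(U) with U ~ U[0,1] gives
# Exp(lam), since F^{-1}(u) = -log(1 - u)/lam.
lam = 2.0
samples = [-math.log(1 - random.random()) / lam for _ in range(100_000)]

mean = sum(samples) / len(samples)
print(mean)  # close to E[X] = 1/lam = 0.5

# Memoryless check: P(X >= 1.0 + 0.5 | X >= 1.0) vs P(X >= 0.5);
# both should be near e^{-lam * 0.5} = e^{-1} ≈ 0.368.
survived = [x for x in samples if x >= 1.0]
cond = sum(x >= 1.5 for x in survived) / len(survived)
uncond = sum(x >= 0.5 for x in samples) / len(samples)
print(cond, uncond)
```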
5.7 Moment generating functions

Theorem. The mgf determines the distribution of $X$ provided $m(\theta)$ is finite for all $\theta$ in some interval containing the origin.

Theorem. The $r$th moment of $X$ is the coefficient of $\frac{\theta^r}{r!}$ in the power series expansion of $m(\theta)$, and is
$$E[X^r] = \left.\frac{d^r}{d\theta^r} m(\theta)\right|_{\theta=0} = m^{(r)}(0).$$

Theorem. If $X$ and $Y$ are independent random variables with moment generating functions $m_X(\theta)$, $m_Y(\theta)$, then $X + Y$ has mgf $m_{X+Y}(\theta) = m_X(\theta)\, m_Y(\theta)$.
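For a discrete random variable the mgf can be evaluated exactly, so the moment theorem can be checked by approximating the derivatives at 0 with finite differences. A sketch for a fair die (an illustrative choice, not from the notes):

```python
import math

# mgf of a fair die, computed exactly: m(theta) = (1/6) * sum_k e^{theta k}.
def m(theta):
    return sum(math.exp(theta * k) for k in range(1, 7)) / 6

# E[X^r] = m^{(r)}(0); approximate the first two derivatives at 0 by
# central finite differences and compare with the exact moments.
h = 1e-5
d1 = (m(h) - m(-h)) / (2 * h)          # should be near E[X]   = 3.5
d2 = (m(h) - 2 * m(0) + m(-h)) / h**2  # should be near E[X^2] = 91/6
print(d1, d2)
```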
6 More distributions

6.1 Cauchy distribution

Proposition. The mean of the Cauchy distribution is undefined, while $E[X^2] = \infty$.

6.2 Gamma distribution

6.3 Beta distribution*

6.4 More on the normal distribution

Proposition. The moment generating function of $N(\mu, \sigma^2)$ is
$$E[e^{\theta X}] = \exp\left(\theta\mu + \frac{1}{2}\theta^2\sigma^2\right).$$

Theorem. Suppose $X$, $Y$ are independent random variables with $X \sim N(\mu_1, \sigma_1^2)$ and $Y \sim N(\mu_2, \sigma_2^2)$. Then
(i) $X + Y \sim N(\mu_1 + \mu_2, \sigma_1^2 + \sigma_2^2)$.
(ii) $aX \sim N(a\mu_1, a^2\sigma_1^2)$.

6.5 Multivariate normal
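Part (i) follows by multiplying mgfs, and that argument can be checked numerically: the product of the two mgfs should coincide with the mgf of $N(\mu_1 + \mu_2, \sigma_1^2 + \sigma_2^2)$ at every $\theta$. A sketch with arbitrary illustrative parameters:

```python
import math

# mgf of N(mu, s2): exp(theta*mu + theta^2*s2/2).
def normal_mgf(mu, s2):
    return lambda t: math.exp(t * mu + 0.5 * t * t * s2)

mX = normal_mgf(1.0, 4.0)    # X ~ N(1, 4)
mY = normal_mgf(-2.0, 9.0)   # Y ~ N(-2, 9)
mS = normal_mgf(-1.0, 13.0)  # claimed law of X + Y

# Independent sums multiply mgfs, and the mgf determines the distribution,
# so matching mgfs identifies X + Y as N(-1, 13).
for t in [-1.0, -0.3, 0.0, 0.5, 1.2]:
    assert math.isclose(mX(t) * mY(t), mS(t), rel_tol=1e-12)
print("mgfs agree")
```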
7 Central limit theorem

Theorem (Central limit theorem). Let $X_1, X_2, \ldots$ be iid random variables with $E[X_i] = \mu$, $\operatorname{var}(X_i) = \sigma^2 < \infty$. Define
$$S_n = X_1 + \cdots + X_n.$$
Then for all finite intervals $(a, b)$,
$$\lim_{n \to \infty} P\left(a \le \frac{S_n - n\mu}{\sigma\sqrt{n}} \le b\right) = \int_a^b \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2}t^2}\,dt.$$
Note that the integrand is the pdf of a standard normal. We say
$$\frac{S_n - n\mu}{\sigma\sqrt{n}} \xrightarrow{D} N(0, 1).$$

Theorem (Continuity theorem). If the random variables $X_1, X_2, \ldots$ have mgfs $m_1(\theta), m_2(\theta), \ldots$ and $m_n(\theta) \to m(\theta)$ as $n \to \infty$ for all $\theta$, then $X_n \xrightarrow{D}$ the random variable with mgf $m(\theta)$.
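A seeded simulation makes the theorem concrete (illustrative choices throughout: $X_i \sim U[0,1]$, $n = 30$, interval $(-1, 1)$): standardize $S_n$ and compare the empirical probability of landing in $(-1, 1)$ with the normal probability $\Phi(1) - \Phi(-1) \approx 0.683$.

```python
import math
import random

random.seed(42)

# X_i ~ U[0,1]: mu = 1/2, sigma^2 = 1/12. Standardize S_n and compare
# P(-1 <= Z_n <= 1) with the standard normal probability.
n, trials = 30, 20_000
mu, sigma = 0.5, math.sqrt(1 / 12)

hits = 0
for _ in range(trials):
    s = sum(random.random() for _ in range(n))
    z = (s - n * mu) / (sigma * math.sqrt(n))
    if -1 <= z <= 1:
        hits += 1

empirical = hits / trials
normal = math.erf(1 / math.sqrt(2))  # Phi(1) - Phi(-1)
print(empirical, normal)             # both about 0.683
```

Even at $n = 30$ the agreement is within Monte Carlo noise, which is why sums of a few dozen uniforms were once a folk method for generating approximately normal variates.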
8 Summary of distributions

8.1 Discrete distributions

8.2 Continuous distributions
More informationLecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable
Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed
More informationMeasure-theoretic probability
Measure-theoretic probability Koltay L. VEGTMAM144B November 28, 2012 (VEGTMAM144B) Measure-theoretic probability November 28, 2012 1 / 27 The probability space De nition The (Ω, A, P) measure space is
More informationIf g is also continuous and strictly increasing on J, we may apply the strictly increasing inverse function g 1 to this inequality to get
18:2 1/24/2 TOPIC. Inequalities; measures of spread. This lecture explores the implications of Jensen s inequality for g-means in general, and for harmonic, geometric, arithmetic, and related means in
More informationCourse: ESO-209 Home Work: 1 Instructor: Debasis Kundu
Home Work: 1 1. Describe the sample space when a coin is tossed (a) once, (b) three times, (c) n times, (d) an infinite number of times. 2. A coin is tossed until for the first time the same result appear
More informationE X A M. Probability Theory and Stochastic Processes Date: December 13, 2016 Duration: 4 hours. Number of pages incl.
E X A M Course code: Course name: Number of pages incl. front page: 6 MA430-G Probability Theory and Stochastic Processes Date: December 13, 2016 Duration: 4 hours Resources allowed: Notes: Pocket calculator,
More information2 n k In particular, using Stirling formula, we can calculate the asymptotic of obtaining heads exactly half of the time:
Chapter 1 Random Variables 1.1 Elementary Examples We will start with elementary and intuitive examples of probability. The most well-known example is that of a fair coin: if flipped, the probability of
More informationChapter 6: Functions of Random Variables
Chapter 6: Functions of Random Variables We are often interested in a function of one or several random variables, U(Y 1,..., Y n ). We will study three methods for determining the distribution of a function
More informationMidterm 2 Review. CS70 Summer Lecture 6D. David Dinh 28 July UC Berkeley
Midterm 2 Review CS70 Summer 2016 - Lecture 6D David Dinh 28 July 2016 UC Berkeley Midterm 2: Format 8 questions, 190 points, 110 minutes (same as MT1). Two pages (one double-sided sheet) of handwritten
More informationProbability and Statistics
Probability and Statistics Jane Bae Stanford University hjbae@stanford.edu September 16, 2014 Jane Bae (Stanford) Probability and Statistics September 16, 2014 1 / 35 Overview 1 Probability Concepts Probability
More information1 Stat 605. Homework I. Due Feb. 1, 2011
The first part is homework which you need to turn in. The second part is exercises that will not be graded, but you need to turn it in together with the take-home final exam. 1 Stat 605. Homework I. Due
More informationProduct measure and Fubini s theorem
Chapter 7 Product measure and Fubini s theorem This is based on [Billingsley, Section 18]. 1. Product spaces Suppose (Ω 1, F 1 ) and (Ω 2, F 2 ) are two probability spaces. In a product space Ω = Ω 1 Ω
More information6.1 Moment Generating and Characteristic Functions
Chapter 6 Limit Theorems The power statistics can mostly be seen when there is a large collection of data points and we are interested in understanding the macro state of the system, e.g., the average,
More informationt x 1 e t dt, and simplify the answer when possible (for example, when r is a positive even number). In particular, confirm that EX 4 = 3.
Mathematical Statistics: Homewor problems General guideline. While woring outside the classroom, use any help you want, including people, computer algebra systems, Internet, and solution manuals, but mae
More informationReview of Probabilities and Basic Statistics
Alex Smola Barnabas Poczos TA: Ina Fiterau 4 th year PhD student MLD Review of Probabilities and Basic Statistics 10-701 Recitations 1/25/2013 Recitation 1: Statistics Intro 1 Overview Introduction to
More informationSTAT 418: Probability and Stochastic Processes
STAT 418: Probability and Stochastic Processes Spring 2016; Homework Assignments Latest updated on April 29, 2016 HW1 (Due on Jan. 21) Chapter 1 Problems 1, 8, 9, 10, 11, 18, 19, 26, 28, 30 Theoretical
More informationRandom Variables. Cumulative Distribution Function (CDF) Amappingthattransformstheeventstotherealline.
Random Variables Amappingthattransformstheeventstotherealline. Example 1. Toss a fair coin. Define a random variable X where X is 1 if head appears and X is if tail appears. P (X =)=1/2 P (X =1)=1/2 Example
More informationMATHEMATICS 154, SPRING 2009 PROBABILITY THEORY Outline #11 (Tail-Sum Theorem, Conditional distribution and expectation)
MATHEMATICS 154, SPRING 2009 PROBABILITY THEORY Outline #11 (Tail-Sum Theorem, Conditional distribution and expectation) Last modified: March 7, 2009 Reference: PRP, Sections 3.6 and 3.7. 1. Tail-Sum Theorem
More informationPractice Examination # 3
Practice Examination # 3 Sta 23: Probability December 13, 212 This is a closed-book exam so do not refer to your notes, the text, or any other books (please put them on the floor). You may use a single
More information2. AXIOMATIC PROBABILITY
IA Probability Lent Term 2. AXIOMATIC PROBABILITY 2. The axioms The formulation for classical probability in which all outcomes or points in the sample space are equally likely is too restrictive to develop
More informationCME 106: Review Probability theory
: Probability theory Sven Schmit April 3, 2015 1 Overview In the first half of the course, we covered topics from probability theory. The difference between statistics and probability theory is the following:
More informationMultivariate Random Variable
Multivariate Random Variable Author: Author: Andrés Hincapié and Linyi Cao This Version: August 7, 2016 Multivariate Random Variable 3 Now we consider models with more than one r.v. These are called multivariate
More informationChapter 4 Multiple Random Variables
Review for the previous lecture Theorems and Examples: How to obtain the pmf (pdf) of U = g ( X Y 1 ) and V = g ( X Y) Chapter 4 Multiple Random Variables Chapter 43 Bivariate Transformations Continuous
More informationHANDBOOK OF APPLICABLE MATHEMATICS
HANDBOOK OF APPLICABLE MATHEMATICS Chief Editor: Walter Ledermann Volume II: Probability Emlyn Lloyd University oflancaster A Wiley-Interscience Publication JOHN WILEY & SONS Chichester - New York - Brisbane
More informationExpectation of Random Variables
1 / 19 Expectation of Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay February 13, 2015 2 / 19 Expectation of Discrete
More informationMath 3215 Intro. Probability & Statistics Summer 14. Homework 5: Due 7/3/14
Math 325 Intro. Probability & Statistics Summer Homework 5: Due 7/3/. Let X and Y be continuous random variables with joint/marginal p.d.f. s f(x, y) 2, x y, f (x) 2( x), x, f 2 (y) 2y, y. Find the conditional
More information