Limit Theorems. STATISTICS Lecture no. 6, Department of Econometrics, FEM UO Brno


1 STATISTICS, Lecture no. 6. Department of Econometrics, FEM UO Brno, office 69a, tel. 973 442029, e-mail: jiri.neubauer@unob.cz

2-4 If we repeat some experiment independently, we can use the observed values to construct a distribution of relative frequencies and to calculate some summary measures (mean, median, variance, ...). This distribution and these measures are called the sample distribution and the sample measures. Under particular conditions we can expect that the sample distribution (measures) will converge toward the theoretical distribution (measures); the more repetitions of the experiment, the better the convergence.

5-6 Notice that the convergence of the sample values toward the theoretical ones is not convergence in the ordinary mathematical sense, but convergence in probability: as the number of experiments increases, the probability of a large deviation between the sample values and the theoretical values decreases.

7 Convergence in Probability

Definition. If a sequence of random variables $X_1, X_2, \dots, X_n, \dots$ fulfils
$$\lim_{n \to \infty} P(|X_n - c| < \varepsilon) = 1 \quad \text{for every } \varepsilon > 0,$$
it is said that the sequence $\{X_n\}$ converges in probability to the constant $c$; we write $X_n \xrightarrow{P} c$.

8 Chebyshev's Inequality

Theorem. For any random variable $X$ with mean $E(X)$ and finite variance $D(X)$, and for every $\varepsilon > 0$, we have
$$P(|X - E(X)| < \varepsilon) \geq 1 - \frac{D(X)}{\varepsilon^2}.$$
Chebyshev's inequality is useful first of all in theoretical work: it allows us to estimate probabilities for random variables with an unknown distribution.
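As a quick numerical illustration (a minimal sketch assuming Python with SciPy; the binomial parameters n = 200, π = 0.8 and the values ε = 10, 11 are taken from Example 1 below), the Chebyshev bound can be compared with the exact probability; the bound is valid but usually far from tight:

```python
# Chebyshev's bound P(|X - E(X)| < eps) >= 1 - D(X)/eps^2
# compared with the exact probability for X ~ B(n, p).
from scipy.stats import binom

n, p = 200, 0.8
EX, DX = n * p, n * p * (1 - p)          # E(X) = 160, D(X) = 32

for eps in (10, 11):
    bound = 1 - DX / eps**2
    # For integer-valued X: P(|X - EX| < eps) = P(X <= EX+eps-1) - P(X <= EX-eps)
    exact = binom.cdf(EX + eps - 1, n, p) - binom.cdf(EX - eps, n, p)
    print(f"eps = {eps}: Chebyshev bound {bound:.3f}, exact probability {exact:.3f}")
```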

9 Bernoulli's Theorem

Theorem. If the random variable $X$ denotes the number of occurrences of an event in a sequence of $n$ independent experiments, where $\pi$ is the probability of occurrence of the event in one experiment, then for every $\varepsilon > 0$
$$\lim_{n \to \infty} P\left(\left|\frac{X}{n} - \pi\right| < \varepsilon\right) = 1.$$
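The theorem can be illustrated by simulation. The following sketch (assuming NumPy; the values π = 0.3 and ε = 0.02 are chosen only for illustration) estimates P(|X/n − π| < ε) for growing n and shows it approaching 1:

```python
# Bernoulli's theorem: the relative frequency X/n converges in probability to pi.
import numpy as np

rng = np.random.default_rng(0)
pi, eps, repeats = 0.3, 0.02, 10_000

for n in (100, 1_000, 10_000):
    X = rng.binomial(n, pi, size=repeats)        # number of occurrences in n trials
    prob = np.mean(np.abs(X / n - pi) < eps)     # estimate of P(|X/n - pi| < eps)
    print(f"n = {n:6d}: P(|X/n - pi| < {eps}) is approximately {prob:.3f}")
```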

10 The de Moivre-Laplace Theorem

Theorem. Let $X$ be a random variable with the binomial distribution $X \sim B(n, \pi)$, i.e. $X = X_1 + X_2 + \dots + X_n$, where $X_i$, $i = 1, \dots, n$, are independent Bernoulli random variables with $E(X_i) = \pi$ and $D(X_i) = \pi(1 - \pi)$, which means $E(X) = n\pi$ and $D(X) = n\pi(1 - \pi)$. For the standardized random variable
$$U = \frac{X - n\pi}{\sqrt{n\pi(1 - \pi)}}$$
we have
$$\lim_{n \to \infty} P(U \leq u) = \Phi(u),$$
where $\Phi(u)$ is the distribution function of the standard normal distribution $N(0, 1)$.

11 The de Moivre-Laplace theorem says that for $n \to \infty$ the binomial distribution converges to the normal distribution. The approximation is considered acceptable if
$$n\pi(1 - \pi) > 9 \quad \text{and} \quad \frac{1}{n + 1} < \pi < \frac{n}{n + 1}.$$
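A possible sanity check of the rule of thumb and of the quality of the approximation (a sketch assuming SciPy; the values n = 200, π = 0.8 come from Example 1 below):

```python
# Normal approximation of the binomial distribution (de Moivre-Laplace theorem).
import numpy as np
from scipy.stats import binom, norm

n, pi = 200, 0.8
ok = n * pi * (1 - pi) > 9 and 1 / (n + 1) < pi < n / (n + 1)
print("rule of thumb satisfied:", ok)

x = np.arange(150, 171)
exact = binom.cdf(x, n, pi)                                  # exact F(x)
approx = norm.cdf((x - n * pi) / np.sqrt(n * pi * (1 - pi))) # normal approximation
print("max |F(x) - Phi(u)| on 150..170:", np.max(np.abs(exact - approx)).round(4))
```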

12 The Central Limit Theorem for a Proportion

Theorem. Let $X$ be a random variable with the binomial distribution $X \sim B(n, \pi)$. The random variable $X/n$ has mean $E(X/n) = \pi$ and variance $D(X/n) = \pi(1 - \pi)/n$. For the standardized random variable
$$U = \frac{X/n - \pi}{\sqrt{\pi(1 - \pi)/n}}$$
we have
$$\lim_{n \to \infty} P(U \leq u) = \Phi(u),$$
where $\Phi(u)$ is the distribution function of the standard normal distribution $N(0, 1)$.

13 The Central Limit Theorem

Theorem. Let the random variable be $X = X_1 + X_2 + \dots + X_n$, where $X_i$, $i = 1, \dots, n$, are independent random variables with the same distribution, mean $E(X_i) = \mu$ and finite variance $D(X_i) = \sigma^2$, so that $E(X) = n\mu$ and $D(X) = n\sigma^2$. For the standardized random variable
$$U = \frac{X - n\mu}{\sqrt{n\sigma^2}}$$
we have
$$\lim_{n \to \infty} P(U \leq u) = \Phi(u),$$
where $\Phi(u)$ is the distribution function of the standard normal distribution $N(0, 1)$.

14 The Central Limit Theorem for the Mean

Theorem. Let the random variable $\bar{X}$ be the mean of $n$ independent random variables $X_1, X_2, \dots, X_n$ with the same distribution, mean $E(X_i) = \mu$ and finite variance $D(X_i) = \sigma^2$, $i = 1, \dots, n$. Then $E(\bar{X}) = \mu$ and $D(\bar{X}) = \sigma^2/n$, and for the standardized random variable
$$U = \frac{\bar{X} - \mu}{\sigma}\sqrt{n}$$
we have
$$\lim_{n \to \infty} P(U \leq u) = \Phi(u),$$
where $\Phi(u)$ is the distribution function of the standard normal distribution $N(0, 1)$.
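A simulation sketch of the central limit theorem (assuming NumPy and SciPy; the exponential distribution and the sample size n = 50 are illustrative choices, not part of the lecture): standardized sample means of exponential random variables are compared with the N(0, 1) distribution function.

```python
# CLT for the mean: standardized means of i.i.d. exponential variables
# are approximately N(0, 1) for moderately large n.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
mu = sigma = 2.0                       # an exponential with mean 2 also has sigma = 2
n, repeats = 50, 100_000

samples = rng.exponential(scale=mu, size=(repeats, n))
U = (samples.mean(axis=1) - mu) / sigma * np.sqrt(n)   # standardized sample means

for u in (-1.0, 0.0, 1.0, 2.0):
    print(f"P(U <= {u:+.1f}): simulated {np.mean(U <= u):.3f}, Phi(u) = {norm.cdf(u):.3f}")
```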

15-18 For the sum $M = X_1 + \dots + X_n$ we have
$$M = \sum_{i=1}^{n} X_i \;\sim \text{as.}\; N(n\mu, n\sigma^2), \qquad E(M) = n\mu, \quad D(M) = n\sigma^2,$$
$$U = \frac{M - E(M)}{\sqrt{D(M)}} = \frac{M - n\mu}{\sqrt{n\sigma^2}} \;\sim \text{as.}\; N(0, 1),$$
$$P(M \leq m) = F(m) \approx \Phi\left(\frac{m - n\mu}{\sqrt{n\sigma^2}}\right),$$
$$P\left(-u_{1-\alpha/2} < \frac{M - n\mu}{\sqrt{n\sigma^2}} < u_{1-\alpha/2}\right) \approx 1 - \alpha.$$

19-22 For the sample mean $\bar{X}$ we have
$$\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i \;\sim \text{as.}\; N\!\left(\mu, \frac{\sigma^2}{n}\right), \qquad E(\bar{X}) = \mu, \quad D(\bar{X}) = \frac{\sigma^2}{n},$$
$$U = \frac{\bar{X} - E(\bar{X})}{\sqrt{D(\bar{X})}} = \frac{\bar{X} - \mu}{\sigma}\sqrt{n} \;\sim \text{as.}\; N(0, 1),$$
$$P(\bar{X} \leq x) = F(x) \approx \Phi\left(\frac{x - \mu}{\sigma}\sqrt{n}\right),$$
$$P\left(-u_{1-\alpha/2} < \frac{\bar{X} - \mu}{\sigma}\sqrt{n} < u_{1-\alpha/2}\right) \approx 1 - \alpha.$$
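The last relation gives an asymptotic (1 − α) interval for the standardized sample mean. A sketch of an empirical check of its coverage (assuming NumPy/SciPy; exponential data, n = 100 and α = 0.05 are illustrative assumptions):

```python
# Empirical coverage of the asymptotic relation
#   P(-u_{1-a/2} < (Xbar - mu) * sqrt(n) / sigma < u_{1-a/2}) ~ 1 - alpha
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
mu = sigma = 2.0
n, repeats, alpha = 100, 50_000, 0.05
u = norm.ppf(1 - alpha / 2)                       # u_{1-alpha/2}, about 1.96

xbar = rng.exponential(scale=mu, size=(repeats, n)).mean(axis=1)
covered = np.abs((xbar - mu) * np.sqrt(n) / sigma) < u
print(f"empirical coverage: {covered.mean():.3f} (target {1 - alpha})")
```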

23 Continuity Correction

When the normal distribution is used as an approximation of the distribution of a discrete random variable, it is recommended to apply the so-called continuity correction, which improves the approximation. If we calculate $P(X \leq x)$ or $P(X \geq x)$ by the normal approximation without the correction, we get undervalued results; on the contrary, if we calculate $P(X < x)$ or $P(X > x)$, we get overvalued results.

24 Continuity Correction

Some examples of the continuity correction:

before correction:   x < 3     x ≤ 3     x = 5            x ≥ 7     x > 7
after correction:    x < 2.5   x < 3.5   4.5 < x < 5.5    x > 6.5   x > 7.5
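The effect of the correction can be checked numerically. The sketch below (assuming SciPy; X ~ B(20, 0.4) and x = 10 are illustrative values, not taken from the lecture) compares P(X ≤ x) computed exactly, without the correction, and with it:

```python
# Effect of the continuity correction on the normal approximation of P(X <= x).
import numpy as np
from scipy.stats import binom, norm

n, pi, x = 20, 0.4, 10                     # illustrative values
mu, sd = n * pi, np.sqrt(n * pi * (1 - pi))

exact = binom.cdf(x, n, pi)
plain = norm.cdf((x - mu) / sd)            # undervalues P(X <= x)
corrected = norm.cdf((x + 0.5 - mu) / sd)  # continuity correction: x -> x + 0.5
print(f"exact {exact:.4f}, without correction {plain:.4f}, with correction {corrected:.4f}")
```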

25 Example 1

The probability that you hit a target is 0.8. What is the probability that the difference between the number of hits in a sequence of 200 shots and the mean of this number will not be larger than 10?

26-27 Example 1, the exact binomial distribution:
$$E(X) = n\pi = 200 \cdot 0.8 = 160, \qquad D(X) = n\pi(1 - \pi) = 200 \cdot 0.8 \cdot (1 - 0.8) = 32,$$
$$P(150 \leq X \leq 170) = p(150) + p(151) + \dots + p(170) = \sum_{x=150}^{170} \binom{200}{x} 0.8^x \, 0.2^{200 - x} = 0.937.$$

28 Example 1, the de Moivre-Laplace theorem:
$$F(x) \approx \Phi\left(\frac{x - n\pi}{\sqrt{n\pi(1 - \pi)}}\right),$$
$$P(150 \leq X \leq 170) = F(170) - F(149) \approx \Phi\left(\frac{170 - 160}{\sqrt{32}}\right) - \Phi\left(\frac{149 - 160}{\sqrt{32}}\right) = 0.936.$$

29 Example 1, the de Moivre-Laplace theorem with continuity correction:
$$F(x) \approx \Phi\left(\frac{x - n\pi}{\sqrt{n\pi(1 - \pi)}}\right),$$
$$P(150 \leq X \leq 170) \approx P(149.5 < X < 170.5) = F(170.5) - F(149.5) \approx \Phi\left(\frac{170.5 - 160}{\sqrt{32}}\right) - \Phi\left(\frac{149.5 - 160}{\sqrt{32}}\right) = 0.937.$$

30-33 Example 1, Chebyshev's inequality:
$$P(|X - E(X)| < \varepsilon) \geq 1 - \frac{D(X)}{\varepsilon^2},$$
$$E(X) = n\pi = 200 \cdot 0.8 = 160, \qquad D(X) = n\pi(1 - \pi) = 200 \cdot 0.8 \cdot (1 - 0.8) = 32,$$
$$P(|X - 160| < 10) \geq 1 - \frac{32}{100} = 0.68,$$
$$P(|X - 160| < 11) \geq 1 - \frac{32}{121} \approx 0.736.$$
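All three approaches to Example 1 can be reproduced in a few lines (a sketch assuming SciPy):

```python
# Example 1: X ~ B(200, 0.8), P(150 <= X <= 170) and the Chebyshev bound.
import numpy as np
from scipy.stats import binom, norm

n, pi = 200, 0.8
mu, sd = n * pi, np.sqrt(n * pi * (1 - pi))                            # 160, sqrt(32)

exact = binom.cdf(170, n, pi) - binom.cdf(149, n, pi)                  # 0.937
approx = norm.cdf((170 - mu) / sd) - norm.cdf((149 - mu) / sd)         # 0.936
corrected = norm.cdf((170.5 - mu) / sd) - norm.cdf((149.5 - mu) / sd)  # 0.937
chebyshev = 1 - sd**2 / 10**2                                          # lower bound 0.68

print(f"exact {exact:.3f}, normal {approx:.3f}, corrected {corrected:.3f}, "
      f"Chebyshev bound {chebyshev:.2f}")
```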

34 Example 2

In an election the coalition obtained 52 % of the votes. What is the probability that in an opinion poll of 2600 respondents the opposition would win?

35-36 Example 2

$X$ ... the number of respondents who voted for the opposition, $X \sim B(2600, 0.48)$,
$$E(X) = n\pi = 2600 \cdot 0.48 = 1248, \qquad D(X) = n\pi(1 - \pi) = 2600 \cdot 0.48 \cdot 0.52 = 648.96.$$

37 Example 2, the exact binomial calculation:
$$P(X > 1300) = 1 - P(X \leq 1300) = 1 - [p(0) + \dots + p(1300)] = 1 - \sum_{x=0}^{1300} \binom{2600}{x} 0.48^x \, 0.52^{2600 - x} \approx 0.02.$$

38 Example 2, the de Moivre-Laplace theorem:
$$F(x) \approx \Phi\left(\frac{x - n\pi}{\sqrt{n\pi(1 - \pi)}}\right),$$
$$P(X > 1300) = 1 - P(X \leq 1300) = 1 - F(1300) \approx 1 - \Phi\left(\frac{1300 - 1248}{\sqrt{648.96}}\right) = 1 - \Phi(2.04) \approx 0.021.$$

39 Example 2, the de Moivre-Laplace theorem with continuity correction:
$$F(x) \approx \Phi\left(\frac{x - n\pi}{\sqrt{n\pi(1 - \pi)}}\right),$$
$$P(X > 1300) = 1 - P(X \leq 1300) \approx 1 - P(X < 1300.5) = 1 - \Phi\left(\frac{1300.5 - 1248}{\sqrt{648.96}}\right) = 1 - \Phi(2.06) \approx 0.020.$$
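Example 2 can be reproduced in the same way (a sketch assuming SciPy); it returns the exact probability together with both normal approximations:

```python
# Example 2: X ~ B(2600, 0.48), probability that the opposition wins the poll,
# i.e. P(X > 1300).
import numpy as np
from scipy.stats import binom, norm

n, pi = 2600, 0.48
mu, sd = n * pi, np.sqrt(n * pi * (1 - pi))            # 1248, sqrt(648.96)

exact = 1 - binom.cdf(1300, n, pi)
plain = 1 - norm.cdf((1300 - mu) / sd)                 # without continuity correction
corrected = 1 - norm.cdf((1300.5 - mu) / sd)           # with continuity correction
print(f"exact {exact:.4f}, normal {plain:.4f}, corrected {corrected:.4f}")
```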
