CS 70 Discrete Mathematics for CS        Spring 2008        David Wagner        Note 9

Variance

Question: At each time step, I flip a fair coin. If it comes up Heads, I walk one step to the right; if it comes up Tails, I walk one step to the left. How far do I expect to have traveled from my starting point after n steps?

Denoting a right-move by +1 and a left-move by -1, we can describe the probability space here as the set of all words of length n over the alphabet {±1}, each having equal probability 1/2^n. For instance, one possible outcome is (+1, +1, -1, ..., -1). Let the r.v. X denote our position (relative to our starting point 0) after n moves. Thus X = X_1 + X_2 + ... + X_n, where

    X_i = +1 if the ith toss is Heads, and X_i = -1 otherwise.

Now obviously we have E[X] = 0. The easiest rigorous way to see this is to note that E[X_i] = (1/2 × 1) + (1/2 × (-1)) = 0, so by linearity of expectation E[X] = Σ_{i=1}^{n} E[X_i] = 0. Thus after n steps, my expected position is 0! But of course this is not very informative, and is due to the fact that positive and negative deviations from 0 cancel out. What the above question is really asking is: what is the expected value of |X|, our distance from 0?

Unfortunately, computing the expected value of |X| turns out to be a little awkward, due to the absolute value operator. Therefore, rather than consider the r.v. |X|, we will instead look at the r.v. X^2. Notice that this also has the effect of making all deviations from 0 positive, so it should also give a good measure of the distance traveled. However, because it is the squared distance, we will need to take a square root at the end.

Let's calculate E[X^2]:

    E[X^2] = E[(X_1 + X_2 + ... + X_n)^2]
           = E[ Σ_i X_i^2 + Σ_{i≠j} X_i X_j ]
           = Σ_i E[X_i^2] + Σ_{i≠j} E[X_i X_j].

In the last line here, we used linearity of expectation. To proceed, we need to compute E[X_i^2] and E[X_i X_j] (for i ≠ j). Let's consider first X_i^2. Since X_i can take on only the values ±1, clearly X_i^2 = 1 always, so E[X_i^2] = 1. What about E[X_i X_j]? Since X_i and X_j are independent, it is the case that E[X_i X_j] = E[X_i] E[X_j] = 0.¹ Plugging these values into the above equation gives

    E[X^2] = (n × 1) + 0 = n.

¹ For independent random variables X, Y, we have E[XY] = E[X]E[Y]. You are strongly encouraged to prove this as an exercise, along similar lines to our proof of linearity of expectation in an earlier lecture. Note that E[XY] = E[X]E[Y] is false for general r.v.'s X, Y; to see this just look at the present example, where we have E[X_i^2] ≠ E[X_i]E[X_i].
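As a quick numerical sanity check on the calculation above (an addition, not part of the original note), here is a minimal Python sketch that estimates E[X] and E[X^2] for the ±1 random walk by simulation. The function names and the choices n = 100 and 20,000 trials are illustrative assumptions only.

```python
import random

def random_walk_position(n):
    """Return the position after n independent +1/-1 steps of a fair coin."""
    return sum(random.choice((+1, -1)) for _ in range(n))

def estimate_moments(n, num_trials=20_000):
    """Estimate E[X] and E[X^2] for the random walk by Monte Carlo."""
    positions = [random_walk_position(n) for _ in range(num_trials)]
    mean = sum(positions) / num_trials
    mean_sq = sum(x * x for x in positions) / num_trials
    return mean, mean_sq

if __name__ == "__main__":
    n = 100
    mean, mean_sq = estimate_moments(n)
    # Theory: E[X] = 0 and E[X^2] = n, so mean_sq should be close to 100.
    print(f"E[X]   is approximately {mean:.3f}  (theory: 0)")
    print(f"E[X^2] is approximately {mean_sq:.1f}  (theory: {n})")
```

The second printed value should hover around n, in line with E[X^2] = n.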

So we see that our expected squared distance from 0 is n. One interpretation of this is that we might expect to be a distance of about √n away from 0 after n steps. However, we have to be careful here: we cannot simply argue that E[|X|] = √(E[X^2]) = √n. (Why not?) We will see shortly (see Chebyshev's Inequality below) how to make precise deductions about |X| from knowledge of E[X^2]. For the moment, however, let's agree to view E[X^2] as an intuitive measure of "spread" of the r.v. X. In fact, for a more general r.v. with expectation E[X] = μ, what we are really interested in is E[(X - μ)^2], the expected squared distance from the mean. In our random walk example, we had μ = 0, so E[(X - μ)^2] just reduces to E[X^2].

Definition 9.1 (variance): For a r.v. X with expectation E[X] = μ, the variance of X is defined to be

    Var(X) = E[(X - μ)^2].

The square root √Var(X) is called the standard deviation of X.

The point of the standard deviation is merely to undo the squaring in the variance. Thus the standard deviation is on the same scale as the r.v. itself. Since the variance and standard deviation differ just by a square, it really doesn't matter which one we choose to work with, as we can always compute one from the other immediately. We shall usually use the variance. For the random walk example above, we have that Var(X) = n, and the standard deviation of X is √n.

The following easy observation gives us a slightly different way to compute the variance that is simpler in many cases.

Theorem 9.1: For a r.v. X with expectation E[X] = μ, we have Var(X) = E[X^2] - μ^2.

Proof: From the definition of variance, we have

    Var(X) = E[(X - μ)^2] = E[X^2 - 2μX + μ^2] = E[X^2] - 2μE[X] + μ^2 = E[X^2] - 2μ^2 + μ^2 = E[X^2] - μ^2.

In the third step here, we used linearity of expectation.

Let's see some examples of variance calculations.

1. Fair die. Let X be the score on the roll of a single fair die. Recall from an earlier lecture that E[X] = 7/2. So we just need to compute E[X^2], which is a routine calculation:

    E[X^2] = (1/6)(1^2 + 2^2 + 3^2 + 4^2 + 5^2 + 6^2) = 91/6.

Thus from Theorem 9.1, Var(X) = E[X^2] - (E[X])^2 = 91/6 - 49/4 = 35/12.

2. Biased coin. Let X be the number of Heads in n tosses of a biased coin with Heads probability p (i.e., X has the binomial distribution with parameters n, p). We already know that E[X] = np. Let

    X_i = 1 if the ith toss is Heads, and X_i = 0 otherwise.

We can then write X = X_1 + X_2 + ... + X_n, and then

    E[X^2] = E[(X_1 + X_2 + ... + X_n)^2]
           = Σ_i E[X_i^2] + Σ_{i≠j} E[X_i X_j]
           = (n × p) + (n(n - 1) × p^2)
           = n^2 p^2 + np(1 - p).

In the third line here, we have used the facts that E[X_i^2] = p, and that E[X_i X_j] = E[X_i]E[X_j] = p^2 (since X_i, X_j are independent). Note that there are n(n - 1) pairs i, j with i ≠ j. Finally, we get that Var(X) = E[X^2] - (E[X])^2 = np(1 - p). As an example, for a fair coin the expected number of Heads in n tosses is n/2, and the standard deviation is √n/2.

Notice that in fact Var(X) = Σ_i Var(X_i), and the same was true in the random walk example. This is no coincidence, and it depends on the fact that the X_i are mutually independent. (Exercise: Verify this.) So in the independent case we can compute variances of sums very easily. As we shall see in example 3, however, when the r.v.'s are not mutually independent, the variance is not linearly additive.

3. Number of fixed points. Let X be the number of fixed points in a random permutation of n items (i.e., the number of students in a class of size n who receive their own homework after shuffling). We saw in an earlier lecture that E[X] = 1 (regardless of n). To compute E[X^2], write X = X_1 + X_2 + ... + X_n, where

    X_i = 1 if i is a fixed point, and X_i = 0 otherwise.

Then as usual we have

    E[X^2] = Σ_i E[X_i^2] + Σ_{i≠j} E[X_i X_j].    (1)

Since X_i is an indicator r.v., we have that E[X_i^2] = Pr[X_i = 1] = 1/n. In this case, however, we have to be a bit more careful about E[X_i X_j]: note that we cannot claim as before that this is equal to E[X_i]E[X_j], because X_i and X_j are not independent (do you see why not?). But since both X_i and X_j are indicators, we can compute E[X_i X_j] directly as follows:

    E[X_i X_j] = Pr[X_i = 1 and X_j = 1] = Pr[both i and j are fixed points] = 1/(n(n - 1)).

[Check that you understand the last step here.] Plugging this into equation (1) we get

    E[X^2] = (n × 1/n) + (n(n - 1) × 1/(n(n - 1))) = 1 + 1 = 2.

Thus Var(X) = E[X^2] - (E[X])^2 = 2 - 1 = 1. In other words, the variance and the mean are both equal to 1. Like the mean, the variance is also independent of n. Intuitively at least, this means that it is unlikely that there will be more than a small number of fixed points even when the number of items, n, is very large.
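Examples 1 and 2 can be verified numerically. The sketch below (an addition to the note, not from the original) computes Var(X) = E[X^2] - (E[X])^2 directly from a finite distribution using exact fractions; the helper name variance and the parameter choices n = 10, p = 1/3 are illustrative assumptions.

```python
from fractions import Fraction
from math import comb

def variance(dist):
    """Var(X) = E[X^2] - (E[X])^2 for a finite distribution given as (value, probability) pairs."""
    mu = sum(p * v for v, p in dist)
    ex2 = sum(p * v * v for v, p in dist)
    return ex2 - mu * mu

# Example 1: a fair die.
die = [(v, Fraction(1, 6)) for v in range(1, 7)]
print(variance(die))  # 35/12, i.e. E[X^2] - mu^2 = 91/6 - 49/4

# Example 2: Binomial(n, p) with n = 10, p = 1/3; the formula np(1 - p) predicts 20/9.
n, p = 10, Fraction(1, 3)
binomial = [(k, comb(n, k) * p**k * (1 - p) ** (n - k)) for k in range(n + 1)]
print(variance(binomial))  # 20/9
```

Using Fraction keeps the arithmetic exact, so the outputs match the hand calculations above rather than floating-point approximations.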

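Example 3 can likewise be checked by simulation. The following sketch (again an addition, with arbitrary choices of n and 20,000 trials) shuffles a list, counts fixed points, and estimates the mean and variance; both should come out close to 1 for every n.

```python
import random

def num_fixed_points(n):
    """Count the fixed points of a uniformly random permutation of {0, ..., n-1}."""
    perm = list(range(n))
    random.shuffle(perm)
    return sum(1 for i, p in enumerate(perm) if i == p)

def mean_and_variance(samples):
    m = len(samples)
    mean = sum(samples) / m
    mean_sq = sum(x * x for x in samples) / m
    return mean, mean_sq - mean * mean   # Var(X) = E[X^2] - (E[X])^2

if __name__ == "__main__":
    for n in (5, 50, 500):
        samples = [num_fixed_points(n) for _ in range(20_000)]
        mean, var = mean_and_variance(samples)
        # Theory: mean = variance = 1, independent of n.
        print(f"n = {n:3d}: mean is about {mean:.3f}, variance is about {var:.3f}")
```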
Chebyshev's Inequality

We have seen that, intuitively, the variance (or, more correctly, the standard deviation) is a measure of spread, or deviation from the mean. Our next goal is to make this intuition quantitatively precise. What we can show is the following:

Theorem 9.2: [Chebyshev's Inequality] For a random variable X with expectation E[X] = μ, and for any α > 0,

    Pr[|X - μ| ≥ α] ≤ Var(X)/α^2.

Before proving Chebyshev's inequality, let's pause to consider what it says. It tells us that the probability of any given deviation, α, from the mean, either above it or below it (note the absolute value sign), is at most Var(X)/α^2. As expected, this deviation probability will be small if the variance is small. An immediate corollary of Chebyshev's inequality is the following:

Corollary 9.3: For a random variable X with expectation E[X] = μ, and standard deviation σ = √Var(X),

    Pr[|X - μ| ≥ βσ] ≤ 1/β^2.

Proof: Plug α = βσ into Chebyshev's inequality.

So, for example, we see that the probability of deviating from the mean by more than (say) two standard deviations on either side is at most 1/4. In this sense, the standard deviation is a good working definition of the "width" or "spread" of a distribution.

We should now go back and prove Chebyshev's inequality. The proof will make use of the following simpler bound, which applies only to non-negative random variables (i.e., r.v.'s which take only values ≥ 0).

Theorem 9.4: [Markov's Inequality] For a non-negative random variable X with expectation E[X] = μ, and any α > 0,

    Pr[X ≥ α] ≤ E[X]/α.

Proof: From the definition of expectation, we have

    E[X] = Σ_a a × Pr[X = a]
         = Σ_{a < α} a × Pr[X = a] + Σ_{a ≥ α} a × Pr[X = a]
         ≥ Σ_{a ≥ α} a × Pr[X = a]
         ≥ α Σ_{a ≥ α} Pr[X = a]
         = α Pr[X ≥ α],

and the claim follows on dividing both sides by α. The crucial step here is the third line, where we have used the fact that X takes on only non-negative values and consequently a ≥ 0 and Pr[X = a] ≥ 0. (This step is not valid otherwise, since if X can take on negative values, then we might have Σ_{a < α} a Pr[X = a] < 0.)

Now we can prove Chebyshev's inequality quite easily.

Proof of Theorem 9.2: Define the r.v. Y = (X - μ)^2. Note that E[Y] = E[(X - μ)^2] = Var(X). Also, notice that the probability we are interested in, Pr[|X - μ| ≥ α], is exactly the same as Pr[Y ≥ α^2]. (This is because the inequality |X - μ| ≥ α is true if and only if (X - μ)^2 ≥ α^2 is true, i.e., if and only if Y ≥ α^2 is true.) Moreover, Y is obviously non-negative, so we can apply Markov's inequality to it to get

    Pr[Y ≥ α^2] ≤ E[Y]/α^2 = Var(X)/α^2,

which completes the proof.
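It may help to see how tight these bounds are in a concrete case. The sketch below (an addition, not from the original note) compares the exact tail probabilities of a Binomial(100, 1/2) random variable with the Markov and Chebyshev bounds; the choices n = 100, a = 60, and the values of alpha are illustrative assumptions.

```python
from math import comb

def binomial_pmf(n, p, k):
    """Pr[X = k] for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

n, p = 100, 0.5
mu, var = n * p, n * p * (1 - p)   # mean 50, variance 25

# Markov's inequality: Pr[X >= a] <= E[X]/a, valid because X >= 0.
a = 60
exact_upper = sum(binomial_pmf(n, p, k) for k in range(a, n + 1))
print(f"Markov:    Pr[X >= {a}] = {exact_upper:.4f}, bound = {mu / a:.4f}")

# Chebyshev's inequality: Pr[|X - mu| >= alpha] <= Var(X)/alpha^2.
for alpha in (5, 10, 20):
    exact = sum(binomial_pmf(n, p, k) for k in range(n + 1) if abs(k - mu) >= alpha)
    print(f"Chebyshev: Pr[|X - {int(mu)}| >= {alpha:2d}] = {exact:.4f}, bound = {var / alpha**2:.4f}")
```

Both bounds hold but are typically quite loose; their strength is that they apply to any distribution with the given mean and variance.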

Let's apply Chebyshev's inequality to answer our question about the random walk at the beginning of this lecture note. Recall that X is our position after n steps, and that E[X] = 0 and Var(X) = n. Corollary 9.3 says that, for any β > 0,

    Pr[|X| ≥ β√n] ≤ 1/β^2.

Thus for example, if we take n = 10^6 steps, the probability that we end up more than 10000 steps away from our starting point is at most 1/100.

Here are a few more examples of applications of Chebyshev's inequality (you should check the algebra in them):

1. Coin tosses. Let X be the number of Heads in n tosses of a fair coin. The probability that X deviates from μ = n/2 by more than √n is at most 1/4. The probability that it deviates by more than 5√n is at most 1/100.

2. Fixed points. Let X be the number of fixed points in a random permutation of n items; recall that E[X] = Var(X) = 1. Thus the probability that more than (say) 10 students get their own homeworks after shuffling is at most 1/100, however large n is.²

² In more detail: Pr[X > 10] = Pr[X ≥ 11] ≤ Pr[|X - 1| ≥ 10] ≤ Var(X)/10^2 = 1/100.
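To see Corollary 9.3 in action on the random walk itself, the following simulation sketch (an addition, not from the original note) compares the empirical tail probability Pr[|X| ≥ β√n] with the bound 1/β^2; the parameters n = 400 and 20,000 trials are arbitrary illustrative choices.

```python
import random
from math import sqrt

def random_walk_position(n):
    """Position after n independent +1/-1 steps of a fair-coin random walk."""
    return sum(random.choice((+1, -1)) for _ in range(n))

def empirical_tail(n, beta, num_trials=20_000):
    """Estimate Pr[|X| >= beta * sqrt(n)] by simulation."""
    threshold = beta * sqrt(n)
    hits = sum(1 for _ in range(num_trials) if abs(random_walk_position(n)) >= threshold)
    return hits / num_trials

if __name__ == "__main__":
    n = 400
    for beta in (1, 2, 3):
        # Corollary 9.3 guarantees the tail probability is at most 1/beta^2.
        print(f"beta = {beta}: empirical tail is about {empirical_tail(n, beta):.4f}, "
              f"Chebyshev bound = {1 / beta**2:.4f}")
```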
