Variance of a Random Variable
Indian Institute of Technology Bombay
Department of Electrical Engineering
Handout 14
EE 325 Probability and Random Processes, Lecture Notes 9
August 28

1 Variance of a Random Variable

The expectation E[X] is also known as the first moment or mean of the random variable. Can we put some quantitative measure on how close the random variable is to its mean? In electrical engineering such situations frequently occur in sampling and quantization, where one wishes to know the quantization error around a level of a finite-bit representation. To put it more formally, we are trying to find the mean squared error: the difference X - a is squared and the expectation is taken to obtain E[(X - a)^2], where a is some value of interest, for example the quantized value after sampling.

The variance of a random variable, Var(X), is defined as

    Var(X) = E[(X - E[X])^2].

In these notes we will often denote Var(X) by σ_X^2. A simple expansion of the RHS above results in the following.

Proposition 1. E[(X - E[X])^2] = E[X^2] - (E[X])^2.

The mean E[X] is the point of minimum mean squared error, as the following proposition states.

Proposition 2. With μ = E[X],

    E[(X - μ)^2] <= E[(X - a)^2]  for all a in R.

Proof: Let us minimize the RHS with respect to a:

    d/da E[(X - a)^2] = -2 E[X - a] = -2 (E[X] - a).

Thus the only stationary point is a = E[X], and since the second derivative is 2 everywhere, we have the desired result.

1.1 Poisson Distribution: Poisson(λ)

The Poisson distribution is over N, and has a single parameter, namely λ in R_+. The distribution is

    P(X = k) = e^{-λ} λ^k / k!,  k = 0, 1, 2, ...

Exercise 1. Verify that the given distribution satisfies the axioms of probability.
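The notes contain no code, but Proposition 2 is easy to check numerically. The sketch below (not part of the original handout) samples a fair die, scans a grid of candidate values a, and confirms that the empirical mean squared error is smallest at the sample mean.

```python
import random

# Numerical sketch of Proposition 2: a = E[X] minimizes E[(X - a)^2].
random.seed(0)
samples = [random.randint(1, 6) for _ in range(20_000)]  # fair-die rolls, E[X] = 3.5
mean = sum(samples) / len(samples)

def emp_mse(a):
    """Empirical estimate of E[(X - a)^2]."""
    return sum((x - a) ** 2 for x in samples) / len(samples)

# Scan candidate values of a on a grid; the minimizer should sit at the sample mean.
a_grid = [i / 20 for i in range(40, 101)]  # a in [2.00, 5.00], step 0.05
best_a = min(a_grid, key=emp_mse)
print(best_a, mean)
```

The minimizer of the empirical MSE is exactly the empirical mean, so `best_a` lands on the grid point nearest to `mean`.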
Let us compute the mean and variance of Poisson(λ).

    E[X] = Σ_{k>=0} k e^{-λ} λ^k / k!
         = Σ_{k>=1} e^{-λ} λ^k / (k-1)!
         = λ e^{-λ} Σ_{k>=1} λ^{k-1} / (k-1)!
         = λ e^{-λ} e^{λ}
         = λ.

Let us now compute E[X^2 - X] = E[X(X-1)]:

    E[X(X-1)] = Σ_{k>=0} k(k-1) e^{-λ} λ^k / k!
              = Σ_{k>=2} e^{-λ} λ^k / (k-2)!
              = λ^2 e^{-λ} Σ_{k>=2} λ^{k-2} / (k-2)!
              = λ^2.

The variance can now be computed as

    σ_X^2 = E[X^2] - (E[X])^2 = E[X^2 - X] + E[X] - (E[X])^2 = λ^2 + λ - λ^2 = λ.

2 Expectation in terms of Probability

To explain their intimate connection, we now express the expectation of an N-valued random variable in terms of just the probabilities.

Theorem 1. For a non-negative integer valued random variable X,

    E[X] = Σ_{n>=1} P(X >= n).

Proof:

    E[X] = Σ_{j in N} j P(X = j)
         = Σ_{j in N} Σ_{n=1}^{∞} 1{n <= j} P(X = j)
         = Σ_{n=1}^{∞} Σ_{j in N} 1{n <= j} P(X = j)
         = Σ_{n=1}^{∞} Σ_{j >= n} P(X = j)
         = Σ_{n=1}^{∞} P(X >= n).

To show the convenience of this property, let us consider the example of a popular random variable in the next subsection.
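The Poisson computations above can be verified deterministically by summing the pmf up to a truncation point; this check is not in the original notes, and the choice λ = 2.5 is arbitrary.

```python
from math import exp, factorial

# Check that Poisson(lam) sums to 1 (Exercise 1) and has mean and variance lam.
lam = 2.5
K = 100  # truncation point; the Poisson tail beyond k = 100 is negligible for lam = 2.5
pmf = [exp(-lam) * lam ** k / factorial(k) for k in range(K + 1)]

total = sum(pmf)                                   # should be ~1
mean = sum(k * p for k, p in enumerate(pmf))       # E[X]
second = sum(k * k * p for k, p in enumerate(pmf)) # E[X^2]
var = second - mean ** 2                           # Proposition 1
print(total, mean, var)
```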
2.1 Geometric Random Variable: Geometric(p)

This random variable takes values in the countable state space N, and the only parameter governing its realization is p in [0, 1]. In relation to tossing a coin, a geometric random variable X captures the first occurrence of HEADS. The probability that HEADS occurs on the k-th toss and not before it is

    P(X = k) = (1 - p)^{k-1} p,  k >= 1.

Example 1. Compute E[X] and E[X^2] if X ~ Geometric(p).

Solution: Let us find P(X >= k) first.

    P(X >= k) = Σ_{j>=k} (1 - p)^{j-1} p = p (1 - p)^{k-1} Σ_{j>=0} (1 - p)^j = (1 - p)^{k-1}.

By Theorem 1,

    E[X] = Σ_{k>=1} P(X >= k) = Σ_{k>=1} (1 - p)^{k-1} = 1/p.

3 Markov's Inequality

To further underline that the expectation contains significant information about the distribution, we state Markov's inequality.

Theorem 2. For a non-negative valued random variable X,

    P(X >= a) <= E[X] / a.   (1)

Proof: While the proof can be done by expanding the summation of the expectation, we resort to a more elegant method using indicator functions. Observe that

    X = X 1{X < a} + X 1{X >= a} >= X 1{X >= a} >= a 1{X >= a}.

Taking expectations of both sides, we get

    E[X] >= a P(X >= a).
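Both the tail-sum identity of Theorem 1 and Markov's inequality can be checked numerically for a geometric random variable. The sketch below is not from the notes; p = 0.3 and the test point a = 10 are arbitrary choices.

```python
# Check E[X] = 1/p for Geometric(p) two ways, then Markov's inequality at a = 10.
p = 0.3
N = 500  # truncation; (1 - p)**N is negligible
ks = range(1, N + 1)

# Direct definition: E[X] = sum_k k * (1-p)^(k-1) * p
mean_direct = sum(k * (1 - p) ** (k - 1) * p for k in ks)

# Theorem 1 route: E[X] = sum_{n>=1} P(X >= n), with P(X >= n) = (1-p)^(n-1)
mean_tail = sum((1 - p) ** (n - 1) for n in ks)

# Markov's inequality: P(X >= 10) <= E[X] / 10
tail = (1 - p) ** 9       # P(X >= 10)
bound = mean_direct / 10
print(mean_direct, mean_tail, tail <= bound)
```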
4 More on Independence

Theorem 3. Consider the random vector (X_1, X_2) in E_1 × E_2. Then X_1 X_2 is a random variable.

Proof: Let Y = X_1 X_2; we will show that Y is a random variable. Clearly Y is discrete. Furthermore,

    {Y = y} = {X_1 X_2 = y} = ∪_x {(ω_1, ω_2) : X_1(ω_1) = x, X_2(ω_2) = y/x}.

Since X_1 and X_2 are measurable maps, the above union is well defined and an element of F. Hence Y is a random variable.

Theorem 4. Consider the random vector (X_1, X_2) in E_1 × E_2, and let g_1 : E_1 → R and g_2 : E_2 → R be two functions. Then g_1(X_1) g_2(X_2) is a random variable.

Proof: Since we know g_1(X_1) and g_2(X_2) are random variables, applying the previous theorem shows that g_1(X_1) g_2(X_2) is indeed a random variable.

Let us now come back to our discussion of independence. The independence of random variables is preserved by individual functional transformations.

Theorem 5. Let X_1 and X_2 be independent random variables in E_1 and E_2 respectively. Consider two functions g_1 : E_1 → R and g_2 : E_2 → R. The random variables g_1(X_1) and g_2(X_2) are independent.

Proof: The RVs g_1(X_1) and g_2(X_2) are discrete. Considering joint events,

    {g_1(X_1) = y_1, g_2(X_2) = y_2} = ∪_{(x_1, x_2) : g_1(x_1) = y_1, g_2(x_2) = y_2} {X_1 = x_1, X_2 = x_2},

where the events inside the union are disjoint. The probability of the event on the LHS is

    P(g_1(X_1) = y_1, g_2(X_2) = y_2)
        = Σ_{(x_1, x_2) : g_1(x_1) = y_1, g_2(x_2) = y_2} P(X_1 = x_1, X_2 = x_2)
        = Σ_{(x_1, x_2) : g_1(x_1) = y_1, g_2(x_2) = y_2} P(X_1 = x_1) P(X_2 = x_2)
        = ( Σ_{x_1 : g_1(x_1) = y_1} P(X_1 = x_1) ) ( Σ_{x_2 : g_2(x_2) = y_2} P(X_2 = x_2) )
        = P(g_1(X_1) = y_1) P(g_2(X_2) = y_2).

Thus, by the definition of independence of two random variables, g_1(X_1) and g_2(X_2) are independent.
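Theorem 5 can be verified by exact enumeration for small discrete spaces. The example below (an illustration, not from the notes) takes two independent fair dice, applies the arbitrary transformations g1 (parity) and g2 (square), and confirms that the joint law of (g1(X1), g2(X2)) factorizes into the product of its marginals.

```python
from itertools import product
from collections import Counter

# Two independent fair dice, with separable transformations g1 and g2.
g1 = lambda x: x % 2      # parity of the first die
g2 = lambda x: x * x      # square of the second die

# Exact joint law of (g1(X1), g2(X2)): push the uniform law on {1..6}^2 forward.
joint = Counter()
for x1, x2 in product(range(1, 7), repeat=2):
    joint[(g1(x1), g2(x2))] += 1 / 36

# Marginal laws of g1(X1) and g2(X2).
marg1, marg2 = Counter(), Counter()
for (y1, y2), pr in joint.items():
    marg1[y1] += pr
    marg2[y2] += pr

# Independence: the joint must factorize into the product of the marginals.
factorizes = all(abs(joint[(y1, y2)] - marg1[y1] * marg2[y2]) < 1e-12
                 for y1 in marg1 for y2 in marg2)
print(factorizes)
```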
Note: There is some room for an argument that the correct proof should show

    P(g_1(X_1) = y_1, g_2(X_2) = y_2) = P_1(g_1(X_1) = y_1) P_2(g_2(X_2) = y_2),

where P_1(·) and P_2(·) are the corresponding measures. Indeed we have shown this desired result, but the notation that we used in the theorem is a universally accepted one. In particular, we may use the same notation P(·) for different probability measures; depending on the argument, the associated measure will be understood. For example, P(X_1 = x_1) is the measure induced by X_1 on E_1, whereas P(X_2 = x_2) is a possibly different measure, induced by the random variable X_2.

Observe that the transformations were applied in a separable fashion on each variable. Let us now state a simple, yet powerful law for the distribution of expectation over a product of independent random variables.

Proposition 3. For independent random variables X and Y,

    E[g_1(X) g_2(Y)] = E[g_1(X)] E[g_2(Y)],

whenever these expectations are well defined.

Proof:

    E[g_1(X) g_2(Y)] = Σ_{x,y} g_1(x) g_2(y) P(X = x, Y = y)                              (2)
                     = Σ_{x,y} g_1(x) g_2(y) P(X = x) P(Y = y)                            (3)
                     = ( Σ_{x in E_1} g_1(x) P(X = x) ) ( Σ_{y in E_2} g_2(y) P(Y = y) )  (4)
                     = E[g_1(X)] E[g_2(Y)].                                               (5)

Example 2. A fair die is thrown k times. Let X_i be the outcome of the i-th toss. Consider the probability assignment

    P(X_1 = x_1, ..., X_k = x_k) = 1/6^k.

Show that the random variables X_1, ..., X_k are pairwise independent, i.e. every pair X_i, X_j with i ≠ j is independent.

Solution: Clearly each X_i takes values in {1, ..., 6}. Let us consider the event {X_i = x_i, X_j = x_j}. Summing over all subscripts except i and j,

    P(X_i = x_i, X_j = x_j) = Σ_{x_m : m ∉ {i, j}} P(x_1, ..., x_k) = 6^{k-2} / 6^k = 1/36.

Thus P(X_i = x_i, X_j = x_j) = P(X_i = x_i) P(X_j = x_j).
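Proposition 3 can be checked exactly for two independent dice by enumerating all 36 outcomes. This sketch is not part of the notes, and the transformations g1, g2 are arbitrary choices.

```python
from itertools import product

die = range(1, 7)
g1 = lambda x: x ** 2     # arbitrary transformation of X
g2 = lambda y: 7 - y      # arbitrary transformation of Y

# E[g1(X) g2(Y)] under the uniform joint law of two independent fair dice.
lhs = sum(g1(x) * g2(y) for x, y in product(die, repeat=2)) / 36

# E[g1(X)] * E[g2(Y)] computed from the marginals.
rhs = (sum(g1(x) for x in die) / 6) * (sum(g2(y) for y in die) / 6)
print(lhs, rhs)
```

Here E[g1(X)] = 91/6 and E[g2(Y)] = 7/2, so both sides equal 91·7/12.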
Example 3. Consider the above example, and let S_k = Σ_{i=1}^{k} X_i. Compute the mean and variance of S_k.

Solution: Let the mean be denoted by μ_k:

    μ_k = E[S_k] = Σ_{i=1}^{k} E[X_i] = 7k/2.

Let us compute the variance σ²_{S_k} = E[(S_k - μ_k)^2]:

    E[(S_k - μ_k)^2] = E[ ( Σ_{i=1}^{k} (X_i - E[X_i]) )^2 ]                                        (6)
                     = E[ Σ_{i=1}^{k} (X_i - E[X_i])^2 + Σ_{i≠j} (X_i - E[X_i])(X_j - E[X_j]) ]      (7)
                     = Σ_{i=1}^{k} E[(X_i - E[X_i])^2] + Σ_{i≠j} E[(X_i - E[X_i])(X_j - E[X_j])]     (8)
                     = Σ_{i=1}^{k} σ²_{X_i} + Σ_{i≠j} E[X_i - E[X_i]] E[X_j - E[X_j]]                (9)
                     = Σ_{i=1}^{k} σ²_{X_i},                                                         (10)

where (9) uses the pairwise independence of X_i and X_j, and each factor E[X_i - E[X_i]] is zero. Since σ²_{X_i} = 35/12 for every i, we have σ²_{S_k} = 35k/12.

In fact, our computations in the last example show that we can generalize this result to arbitrary random variables which are pairwise independent.

Theorem 6. Let X_1, ..., X_k be a collection of pairwise independent random variables. Then

    σ²_{X_1 + ... + X_k} = Σ_{i=1}^{k} σ²_{X_i}.

We have already defined independent sequences of random variables. Among these, a particular class is the most appealing.

4.1 IID Random Variables

Definition 1. A sequence X_n, n >= 1, of RVs is said to be Independent and Identically Distributed (IID) if

1. X_n, n >= 1, is an independent sequence, each X_n taking values in the same set E.
2. P(X_i = x) = P(X_j = x) for all i, j and x in E.
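Example 3 can be verified exactly by enumerating all outcomes for a small number of throws. The sketch below (not from the notes) uses k = 3, where the 6^3 = 216 outcomes are equally likely, and checks E[S_k] = 7k/2 and Var(S_k) = 35k/12.

```python
from itertools import product

# Exact mean and variance of the sum of k = 3 fair-die throws.
k = 3
outcomes = list(product(range(1, 7), repeat=k))  # 6**k equally likely outcomes

sums = [sum(o) for o in outcomes]
n = len(outcomes)
mean_s = sum(sums) / n
var_s = sum((s - mean_s) ** 2 for s in sums) / n

# Example 3 predicts E[S_k] = 7k/2 = 10.5 and Var(S_k) = 35k/12 = 8.75 for k = 3.
print(mean_s, var_s)
```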
More informationDiscrete Mathematics and Probability Theory Spring 2014 Anant Sahai Note 18
EECS 7 Discrete Mathematics and Probability Theory Spring 214 Anant Sahai Note 18 A Brief Introduction to Continuous Probability Up to now we have focused exclusively on discrete probability spaces Ω,
More information18.440: Lecture 28 Lectures Review
18.440: Lecture 28 Lectures 18-27 Review Scott Sheffield MIT Outline Outline It s the coins, stupid Much of what we have done in this course can be motivated by the i.i.d. sequence X i where each X i is
More informationPart IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015
Part IA Probability Definitions Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.
More informationProbabilistic Systems Analysis Spring 2018 Lecture 6. Random Variables: Probability Mass Function and Expectation
EE 178 Probabilistic Systems Analysis Spring 2018 Lecture 6 Random Variables: Probability Mass Function and Expectation Probability Mass Function When we introduce the basic probability model in Note 1,
More informationLecture 1: Review on Probability and Statistics
STAT 516: Stochastic Modeling of Scientific Data Autumn 2018 Instructor: Yen-Chi Chen Lecture 1: Review on Probability and Statistics These notes are partially based on those of Mathias Drton. 1.1 Motivating
More informationProbability, Random Processes and Inference
INSTITUTO POLITÉCNICO NACIONAL CENTRO DE INVESTIGACION EN COMPUTACION Laboratorio de Ciberseguridad Probability, Random Processes and Inference Dr. Ponciano Jorge Escamilla Ambrosio pescamilla@cic.ipn.mx
More informationExercises with solutions (Set D)
Exercises with solutions Set D. A fair die is rolled at the same time as a fair coin is tossed. Let A be the number on the upper surface of the die and let B describe the outcome of the coin toss, where
More information