Stochastic Processes for Physicists
Stochastic Processes for Physicists: Understanding Noisy Systems
Chapter 1: A review of probability theory
Paul Kirk, Division of Molecular Biosciences, Imperial College London
19/03/2013
1.1 Random variables and mutually exclusive events

Random variables. Suppose we do not know the precise value of a variable, but have an idea of the relative likelihood that it will take one of a number of possible values. Call the unknown quantity X. This quantity is referred to as a random variable.

Probability. Consider a 6-sided die, and let X be the value we get when we roll it. Describe the likelihood that X will take each of the values 1, ..., 6 by a number between 0 and 1: the probability. If Prob(X = 3) = 1, then we will always get a 3. If Prob(X = 3) = 2/3, then we expect to get a 3 about two-thirds of the time.
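As a quick numerical illustration (an addition, not part of the slides), one can roll a simulated die many times and check that the empirical frequencies approach the stated probabilities. A minimal Python sketch using only the standard library:

```python
import random
from collections import Counter

random.seed(1)

# Roll a fair 6-sided die many times and estimate Prob(X = k)
# from the empirical frequency of each face.
N = 100_000
counts = Counter(random.randint(1, 6) for _ in range(N))

for k in range(1, 7):
    print(f"Prob(X = {k}) ~ {counts[k] / N:.3f}")  # each close to 1/6 ~ 0.167
```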
1.1 Random variables and mutually exclusive events

Mutually exclusive events. The various values of X are an example of mutually exclusive events: X can take precisely one of the values between 1 and 6. Mutually exclusive probabilities sum:

Prob(X = 3 or X = 4) = Prob(X = 3) + Prob(X = 4).
1.1 Random variables and mutually exclusive events

Note: in mathematics texts it is customary to denote the unknown quantity using a capital letter, say X, and a variable that specifies one of the possible values that X may have as the equivalent lower-case letter, x. We will use this convention in this chapter, but in the following chapters we will use a lower-case letter for both the unknown quantity and the values it can take, since it causes no confusion. So, rather than writing Prob(X = 3) or Prob(X = x), we will (in later chapters) tend to write Prob(3) or Prob(x). Warning: may cause confusion.
1.1 Random variables and mutually exclusive events

Continuous random variables. For continuous random variables, the probability for X to be within a range is found by integrating the probability density function:

Prob(a < X < b) = ∫_a^b P(x) dx.

[Figure 1.1: an illustration of summing the probabilities of mutually exclusive events, both for discrete and continuous random variables.]
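The integral Prob(a < X < b) = ∫_a^b P(x) dx can be evaluated numerically for any density. The sketch below (an added illustration, not from the book) integrates the standard normal density over (−1, 1) with a simple midpoint rule; the choice of density and of the interval is just for the example:

```python
import math

# Standard normal probability density P(x).
def P(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

# Prob(a < X < b) via the midpoint rule: sum P(x) * dx over small steps.
def prob_between(a, b, n=10_000):
    dx = (b - a) / n
    return sum(P(a + (i + 0.5) * dx) for i in range(n)) * dx

p = prob_between(-1.0, 1.0)
print(p)  # ~ 0.6827, the familiar "one sigma" probability
```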
1.1 Random variables and mutually exclusive events

Expectation. The expectation of an arbitrary function, f(X), with respect to the probability density function P(X) is

⟨f(X)⟩_{P(X)} = ∫ P(x) f(x) dx.

The mean or expected value of X is ⟨X⟩.
1.1 Random variables and mutually exclusive events

Variance. The variance of X is the expectation of the squared difference from the mean:

V[X] = ∫ P(x) (x − ⟨X⟩)² dx
     = ∫ P(x) (x² + ⟨X⟩² − 2x⟨X⟩) dx
     = ∫ P(x) x² dx + ⟨X⟩² ∫ P(x) dx − 2⟨X⟩ ∫ P(x) x dx
     = ⟨X²⟩ + ⟨X⟩² − 2⟨X⟩²
     = ⟨X²⟩ − ⟨X⟩².
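As a concrete check (an added sketch, not from the slides), the direct form ⟨(X − ⟨X⟩)²⟩ and the shortcut ⟨X²⟩ − ⟨X⟩² can be compared for the fair die of the earlier example, using exact rational arithmetic:

```python
from fractions import Fraction

# Variance of a fair die, computed both ways:
# directly as <(X - <X>)^2>, and via the shortcut <X^2> - <X>^2.
values = range(1, 7)
p = Fraction(1, 6)                       # P(x) = 1/6 for each face

mean = sum(p * x for x in values)        # <X> = 7/2
var_direct = sum(p * (x - mean) ** 2 for x in values)
var_shortcut = sum(p * x * x for x in values) - mean ** 2

print(mean, var_direct, var_shortcut)    # 7/2 35/12 35/12
```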
1.2 Independence

Independent probabilities multiply. For independent variables,

P_{X,Y}(x, y) = P_X(x) P_Y(y),

and consequently, for independent variables, ⟨XY⟩ = ⟨X⟩⟨Y⟩.
1.3 Dependent random variables

Dependence. If X and Y are dependent, then P_{X,Y}(x, y) does not factor as the product of P_X(x) and P_Y(y). If we know P_{X,Y}(x, y) and want to know P_X(x), then it is obtained by integrating out (or "marginalising") the other variable:

P_X(x) = ∫ P_{X,Y}(x, y) dy.
1.3 Dependent random variables

Conditional probability densities. The probability density for X given that we know that Y = y is written P(X = x | Y = y) or P(x | y), and is referred to as the conditional probability density for X given Y:

P_{X|Y}(X = x | Y = y) = P_{X,Y}(X = x, Y = y) / P_Y(Y = y).

Explanation: to see how to calculate this conditional probability, we note first that P(x, y) with y = a gives the relative probability for different values of x given that Y = a. To obtain the conditional probability density for X given that Y = a, all we have to do is divide P(x, a) by its integral over all values of x.
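For a discrete pair the marginalisation integral becomes a sum, and conditioning is the same division by a marginal. The small joint table below is a made-up example (an added sketch, not from the book):

```python
from fractions import Fraction

# A small discrete joint distribution P_{X,Y}(x, y), stored as a dict.
F = Fraction
joint = {
    (0, 0): F(1, 8), (0, 1): F(3, 8),
    (1, 0): F(2, 8), (1, 1): F(2, 8),
}

# Marginal: P_X(x) = sum over y of P_{X,Y}(x, y).
P_X = {x: sum(p for (xx, yy), p in joint.items() if xx == x) for x in (0, 1)}

# Conditional: P(x | y) = P_{X,Y}(x, y) / P_Y(y), here for y = 1.
P_Y = {y: sum(p for (xx, yy), p in joint.items() if yy == y) for y in (0, 1)}
P_X_given_Y1 = {x: joint[(x, 1)] / P_Y[1] for x in (0, 1)}

print(P_X)            # P_X(0) = P_X(1) = 1/2
print(P_X_given_Y1)   # P(0|1) = 3/5, P(1|1) = 2/5
```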
1.4 Correlations and correlation coefficients

The covariance of X and Y is:

cov(X, Y) = ⟨(X − ⟨X⟩)(Y − ⟨Y⟩)⟩ = ⟨XY⟩ − ⟨X⟩⟨Y⟩.

Idea: 1. How can we define what it means for a value x to be bigger than usual? Well, we can see if x > ⟨X⟩, i.e. if x − ⟨X⟩ > 0. 2. Similarly, we can say that a value x is smaller than usual if x < ⟨X⟩, i.e. if x − ⟨X⟩ < 0. 3. If x tends to be bigger than usual when y is bigger than usual, then ⟨(X − ⟨X⟩)(Y − ⟨Y⟩)⟩ will be > 0.

The correlation is just a normalised version of the covariance, which takes values in the range −1 to 1:

C_XY = (⟨XY⟩ − ⟨X⟩⟨Y⟩) / √(V[X] V[Y]).
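Both quantities are easy to estimate from samples. In the sketch below (an added illustration; the dependence Y = X + noise is an assumption chosen for the example, giving cov = 1 and C_XY = 1/√2), the moments are estimated exactly as the formulas above prescribe:

```python
import random, math

random.seed(0)

# Samples of a dependent pair: Y = X + independent noise.
N = 200_000
xs = [random.gauss(0, 1) for _ in range(N)]
ys = [x + random.gauss(0, 1) for x in xs]

# cov(X, Y) = <XY> - <X><Y>.
mx = sum(xs) / N
my = sum(ys) / N
mxy = sum(x * y for x, y in zip(xs, ys)) / N
cov = mxy - mx * my

# Correlation: normalise by sqrt(V[X] V[Y]).
vx = sum(x * x for x in xs) / N - mx * mx
vy = sum(y * y for y in ys) / N - my * my
corr = cov / math.sqrt(vx * vy)

print(cov, corr)  # cov ~ 1, corr ~ 1/sqrt(2) ~ 0.707
```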
1.5 Adding random variables together

When we have two continuous random variables, X and Y, with probability densities P_X and P_Y, it is often useful to be able to calculate the probability density of the random variable whose value is the sum of them: Z = X + Y. It turns out that the probability density for Z is given by

P_Z(z) = ∫ P_{X,Y}(z − s, s) ds = ∫ P_{X,Y}(s, z − s) ds.

If X and Y are independent, this becomes a convolution:

P_Z(z) = ∫ P_X(z − s) P_Y(s) ds = P_X * P_Y.
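For discrete variables the convolution integral becomes a sum over s of P_X(z − s) P_Y(s). The sketch below (an added illustration) applies it to the sum of two fair dice:

```python
from fractions import Fraction

# Discrete convolution: P_Z(z) = sum over s of P_X(z - s) * P_Y(s),
# for Z = X + Y with X, Y independent fair dice.
F = Fraction
P = {k: F(1, 6) for k in range(1, 7)}    # P_X = P_Y = uniform on 1..6

P_Z = {}
for z in range(2, 13):
    P_Z[z] = sum(P.get(z - s, F(0)) * P[s] for s in P)

print(P_Z[7])             # 1/6: seven is the most likely total
print(sum(P_Z.values()))  # 1: the result is properly normalised
```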
1.5 Adding random variables together

If X₁ and X₂ are random variables and X = X₁ + X₂, then ⟨X⟩ = ⟨X₁⟩ + ⟨X₂⟩, and if X₁ and X₂ are independent, then V[X] = V[X₁] + V[X₂].

Mysterious (?) assertion: averaging the results of a number of independent measurements produces a more accurate result. This is because the variances of the different measurements add together. Does this make sense?
1.5 Adding random variables together

Explanation. Assume all measurements have expectation µ and variance σ². By the independence assumption, the variance of the average is:

V[Σₙ Xₙ / N] = Σₙ V[Xₙ / N].

Moreover,

V[Xₙ / N] = E[(Xₙ / N)²] − (E[Xₙ / N])² = (E[Xₙ²] − (E[Xₙ])²) / N² = V[Xₙ] / N².

So,

V[Σₙ Xₙ / N] = Σₙ V[Xₙ] / N² = (1/N²) Σₙ σ² = σ² / N.
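The σ²/N law is easy to see in simulation (an added sketch; the measurements are taken to have σ² = 1 for the example):

```python
import random

random.seed(42)

# Each "measurement" has mean 0 and variance sigma^2 = 1; the average
# of N independent measurements should have variance sigma^2 / N.
def var_of_average(N, trials=20_000):
    avgs = [sum(random.gauss(0, 1) for _ in range(N)) / N for _ in range(trials)]
    m = sum(avgs) / trials
    return sum((a - m) ** 2 for a in avgs) / trials

results = {N: var_of_average(N) for N in (1, 4, 16)}
print(results)  # ~ {1: 1.0, 4: 0.25, 16: 0.0625}
```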
1.6 Transformations of a random variable

Key assertion: if Y = g(X), then:

⟨f(Y)⟩ = ∫_{x=a}^{x=b} P_X(x) f(g(x)) dx = ∫_{y=g(a)}^{y=g(b)} P_Y(y) f(y) dy.

Given this assumption, everything else falls out automatically:

⟨f(Y)⟩ = ∫_{x=a}^{x=b} P_X(x) f(g(x)) dx
       = ∫_{y=g(a)}^{y=g(b)} P_X(g⁻¹(y)) f(y) (dx/dy) dy
       = ∫_{y=g(a)}^{y=g(b)} [P_X(g⁻¹(y)) / |g′(g⁻¹(y))|] f(y) dy.

General result (for invertible g):

P_Y(y) = P_X(g⁻¹(y)) / |g′(g⁻¹(y))|.
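A quick sanity check of the transformation formula (an added sketch; the choice X ~ Uniform(0, 1) with g(x) = x² is an assumption made for the example): the formula gives P_Y(y) = 1/(2√y) on (0, 1), whose integral from 0 to 1/4 is √(1/4) = 1/2, and sampling agrees:

```python
import random

random.seed(3)

# X ~ Uniform(0, 1), Y = g(X) = X^2.  The change-of-variables formula
# P_Y(y) = P_X(g^{-1}(y)) / |g'(g^{-1}(y))| gives P_Y(y) = 1 / (2 sqrt(y)).
# Its integral from 0 to 1/4 predicts Prob(Y < 1/4) = 1/2.
N = 100_000
frac = sum(1 for _ in range(N) if random.random() ** 2 < 0.25) / N

print(frac)  # ~ 0.5, matching the analytic prediction
```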
1.7 The distribution function

The probability distribution function, which we will call D(x), of a random variable X is defined as the probability that X is less than or equal to x. Thus

D(x) = Prob(X ≤ x) = ∫_{−∞}^{x} P(z) dz.

In addition, the fundamental theorem of calculus tells us that P(x) = dD(x)/dx.
1.8 The characteristic function

The characteristic function is defined as the Fourier transform of the probability density:

χ(s) = ∫ P(x) exp(isx) dx.

The inverse transform gives:

P(x) = (1/2π) ∫ χ(s) exp(−isx) ds.

The Fourier transform of the convolution of two functions, P(x) and Q(x), is the product of their Fourier transforms, χ_P(s) and χ_Q(s). For discrete random variables, the characteristic function is a sum. In general (for both discrete and continuous r.v.'s), we have:

χ_P(s) = ⟨exp(isX)⟩_{P(X)}.
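The product property can be verified directly for a discrete example (an added sketch): since the sum of two independent dice has the triangular distribution P_Z(z) = (6 − |z − 7|)/36, its characteristic function should equal the square of a single die's:

```python
import cmath

# chi(s) = <exp(isX)> as a sum, for one fair die.
def chi_die(s):
    return sum(cmath.exp(1j * s * x) for x in range(1, 7)) / 6

# Characteristic function of the sum of two dice, using the known
# triangular pmf P_Z(z) = (6 - |z - 7|) / 36 for z = 2..12.
def chi_two_dice(s):
    return sum((6 - abs(z - 7)) / 36 * cmath.exp(1j * s * z)
               for z in range(2, 13))

s = 0.37  # an arbitrary test point
print(abs(chi_two_dice(s) - chi_die(s) ** 2))  # ~ 0 (machine precision)
```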
1.9 Moments and cumulants

Moment generating function (departure from the book). The moment generating function is defined as M(t) = ⟨exp(tX)⟩, where X is a random variable, and the expectation is with respect to some density P(X), so that

M(t) = ∫ exp(tx) P(x) dx
     = ∫ (1 + tx + t²x²/2! + ...) P(x) dx
     = 1 + t m₁ + t² m₂/2! + ... + tʳ mᵣ/r! + ...

where mᵣ = ⟨Xʳ⟩ is the r-th (raw) moment.
1.9 Moments and cumulants

Moment generating function (continued).

M(t) = 1 + t m₁ + t² m₂/2! + ... + tʳ mᵣ/r! + ...

It follows from the above expansion that:

M(0) = 1, M′(0) = m₁, M″(0) = m₂, ..., M⁽ʳ⁾(0) = mᵣ.
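The relation M⁽ʳ⁾(0) = mᵣ can be checked numerically (an added sketch). For a standard normal, M(t) = exp(t²/2) is known in closed form, so central finite differences at t = 0 should recover m₁ = 0 and m₂ = 1:

```python
import math

# Moment generating function of a standard normal: M(t) = exp(t^2 / 2).
def M(t):
    return math.exp(t * t / 2)

h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)             # central first derivative at 0
m2 = (M(h) - 2 * M(0) + M(-h)) / (h * h)  # central second derivative at 0

print(m1, m2)  # ~ 0.0 and ~ 1.0: the first two moments
```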
1.9 Moments and cumulants

Cumulant generating function (continued departure from the book). The log of the moment generating function is called the cumulant generating function, R(t) = ln(M(t)). By the chain rule of differentiation, we can write down the derivatives of R(t) in terms of the derivatives of M(t), e.g.

R′(t) = M′(t) / M(t),
R″(t) = [M(t) M″(t) − (M′(t))²] / (M(t))².

Note that R(0) = ln(M(0)) = 0, R′(0) = M′(0) = m₁ = µ, R″(0) = M″(0) − (M′(0))² = m₂ − m₁² = σ², ... These derivatives at zero are the cumulants.
1.9 Moments and cumulants

The moments can be calculated from the derivatives of the characteristic function, evaluated at s = 0. We can see this by expanding the characteristic function as a Taylor series:

χ(s) = Σ_{n=0}^∞ χ⁽ⁿ⁾(0) sⁿ / n!,

where χ⁽ⁿ⁾(s) is the n-th derivative of χ(s). But we also have:

χ(s) = ⟨e^{isX}⟩ = Σ_{n=0}^∞ ⟨(isX)ⁿ⟩ / n! = Σ_{n=0}^∞ iⁿ ⟨Xⁿ⟩ sⁿ / n!.

Equating the two expressions, we get:

⟨Xⁿ⟩ = χ⁽ⁿ⁾(0) / iⁿ.
1.9 Moments and cumulants

Cumulants. The n-th order cumulant of X is the n-th derivative of the log of the characteristic function, evaluated at s = 0 (up to the factor iⁿ). For independent random variables, X and Y, if Z = X + Y then the n-th cumulant of Z is the sum of the n-th cumulants of X and Y. The Gaussian distribution is also the only absolutely continuous distribution all of whose cumulants beyond the first two (i.e. other than the mean and variance) are zero.
1.10 The multivariate Gaussian

Let x = [x₁, ..., x_N]. Then the general form of the Gaussian pdf is:

P(x) = 1/√((2π)ᴺ det(Σ)) exp(−½ (x − µ)ᵀ Σ⁻¹ (x − µ)),

where µ is the mean vector and Σ is the covariance matrix. All higher moments of a Gaussian can be written in terms of the means and covariances. Defining X̃ ≡ X − ⟨X⟩, for a 1-dimensional Gaussian we have:

⟨X̃^{2n−1}⟩ = 0,
⟨X̃^{2n}⟩ = (2n − 1)! (V[X])ⁿ / (2ⁿ⁻¹ (n − 1)!).
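For n = 1 and n = 2 the moment formula gives ⟨X̃²⟩ = V[X] and ⟨X̃⁴⟩ = 3 (V[X])², which is easy to confirm by sampling (an added sketch; σ = 2 is an arbitrary choice for the example):

```python
import random

random.seed(7)

# Sample a zero-mean 1-d Gaussian with variance V = sigma^2 = 4 and
# estimate the second and fourth central moments; the formula predicts
# <Xt^2> = V = 4 and <Xt^4> = 3 V^2 = 48.
sigma = 2.0
N = 500_000
xs = [random.gauss(0, sigma) for _ in range(N)]

m2 = sum(x ** 2 for x in xs) / N
m4 = sum(x ** 4 for x in xs) / N

print(m2, m4)  # ~ 4 and ~ 48
```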
More informationFundamentals of Digital Commun. Ch. 4: Random Variables and Random Processes
Fundamentals of Digital Commun. Ch. 4: Random Variables and Random Processes Klaus Witrisal witrisal@tugraz.at Signal Processing and Speech Communication Laboratory www.spsc.tugraz.at Graz University of
More information2. Variance and Covariance: We will now derive some classic properties of variance and covariance. Assume real-valued random variables X and Y.
CS450 Final Review Problems Fall 08 Solutions or worked answers provided Problems -6 are based on the midterm review Identical problems are marked recap] Please consult previous recitations and textbook
More informationAlgorithms for Uncertainty Quantification
Algorithms for Uncertainty Quantification Tobias Neckel, Ionuț-Gabriel Farcaș Lehrstuhl Informatik V Summer Semester 2017 Lecture 2: Repetition of probability theory and statistics Example: coin flip Example
More informationMultivariate random variables
Multivariate random variables DS GA 1002 Statistical and Mathematical Models http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall16 Carlos Fernandez-Granda Joint distributions Tool to characterize several
More informationChapter 2. Some Basic Probability Concepts. 2.1 Experiments, Outcomes and Random Variables
Chapter 2 Some Basic Probability Concepts 2.1 Experiments, Outcomes and Random Variables A random variable is a variable whose value is unknown until it is observed. The value of a random variable results
More informationFourier and Stats / Astro Stats and Measurement : Stats Notes
Fourier and Stats / Astro Stats and Measurement : Stats Notes Andy Lawrence, University of Edinburgh Autumn 2013 1 Probabilities, distributions, and errors Laplace once said Probability theory is nothing
More information1 Solution to Problem 2.1
Solution to Problem 2. I incorrectly worked this exercise instead of 2.2, so I decided to include the solution anyway. a) We have X Y /3, which is a - function. It maps the interval, ) where X lives) onto
More information116 Chapter 3 Convolution
116 Chapter 3 Convolution This is a great combination of many of the things we ve developed to this point, and it will come up again. 9 Consider the left hand side as a function of x, say h(x) n e (x n)2
More informationProbability and Statistics
Probability and Statistics Jane Bae Stanford University hjbae@stanford.edu September 16, 2014 Jane Bae (Stanford) Probability and Statistics September 16, 2014 1 / 35 Overview 1 Probability Concepts Probability
More informationChapter 4. Multivariate Distributions. Obviously, the marginal distributions may be obtained easily from the joint distribution:
4.1 Bivariate Distributions. Chapter 4. Multivariate Distributions For a pair r.v.s (X,Y ), the Joint CDF is defined as F X,Y (x, y ) = P (X x,y y ). Obviously, the marginal distributions may be obtained
More informationContinuous Random Variables
1 Continuous Random Variables Example 1 Roll a fair die. Denote by X the random variable taking the value shown by the die, X {1, 2, 3, 4, 5, 6}. Obviously the probability mass function is given by (since
More informationMachine Learning Lecture Notes
Machine Learning Lecture Notes Predrag Radivojac January 3, 25 Random Variables Until now we operated on relatively simple sample spaces and produced measure functions over sets of outcomes. In many situations,
More informationIntermediate Lab PHYS 3870
Intermediate Lab PHYS 3870 Lecture 3 Distribution Functions References: Taylor Ch. 5 (and Chs. 10 and 11 for Reference) Taylor Ch. 6 and 7 Also refer to Glossary of Important Terms in Error Analysis Probability
More informationSTA 256: Statistics and Probability I
Al Nosedal. University of Toronto. Fall 2017 My momma always said: Life was like a box of chocolates. You never know what you re gonna get. Forrest Gump. There are situations where one might be interested
More informationChapter 4. Continuous Random Variables 4.1 PDF
Chapter 4 Continuous Random Variables In this chapter we study continuous random variables. The linkage between continuous and discrete random variables is the cumulative distribution (CDF) which we will
More informationExam P Review Sheet. for a > 0. ln(a) i=0 ari = a. (1 r) 2. (Note that the A i s form a partition)
Exam P Review Sheet log b (b x ) = x log b (y k ) = k log b (y) log b (y) = ln(y) ln(b) log b (yz) = log b (y) + log b (z) log b (y/z) = log b (y) log b (z) ln(e x ) = x e ln(y) = y for y > 0. d dx ax
More information8 - Continuous random vectors
8-1 Continuous random vectors S. Lall, Stanford 2011.01.25.01 8 - Continuous random vectors Mean-square deviation Mean-variance decomposition Gaussian random vectors The Gamma function The χ 2 distribution
More informationIntroduction to Probability and Stocastic Processes - Part I
Introduction to Probability and Stocastic Processes - Part I Lecture 2 Henrik Vie Christensen vie@control.auc.dk Department of Control Engineering Institute of Electronic Systems Aalborg University Denmark
More information[POLS 8500] Review of Linear Algebra, Probability and Information Theory
[POLS 8500] Review of Linear Algebra, Probability and Information Theory Professor Jason Anastasopoulos ljanastas@uga.edu January 12, 2017 For today... Basic linear algebra. Basic probability. Programming
More information