1 Presessional Probability
Probability theory is essential for the development of mathematical models in finance, because of the random nature of price fluctuations in the markets. This presessional material summarises some basic facts, most of which you have seen before; they are given here for future reference.

1.1 Finite Probability Spaces

Consider a random experiment with a finite set of possible outcomes. This set is called the sample space and denoted Ω. A probability measure on Ω is given by a function that assigns to each ω ∈ Ω a number P(ω) ∈ [0, 1], so that

    Σ_{ω∈Ω} P(ω) = 1.

The pair (Ω, P) is a finite probability space. The subsets of Ω are called events. The probability of an event A is defined as

    P(A) = Σ_{ω∈A} P(ω).

The addition rule of probabilities holds: if A ∩ B = ∅ (events A and B are disjoint, or incompatible), then

    P(A ∪ B) = P(A) + P(B).

In general, we have the inclusion-exclusion formula

    P(A ∪ B) = P(A) + P(B) − P(A ∩ B).

The event complement to A is denoted A^c = Ω \ A. Note that P(A^c) = 1 − P(A).

Example 1.1. Flipping a coin three times. The sample space is

    Ω = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}.

An individual element of Ω is a sequence of three tosses ω = ω_1 ω_2 ω_3 (a shorthand notation for (ω_1, ω_2, ω_3)). We assume that p is the probability of a head, q = 1 − p the probability of a tail, and that the tosses are independent. Then P(HHH) = p³, P(HHT) = p²q, ..., P(TTT) = q³. Let A be the event "first toss is a head". We check from the definitions that the probability of A is p:

    P(A) = Σ_{ω∈A} P(ω) = P(HHH) + P(HHT) + P(HTH) + P(HTT)
         = p³ + p²q + p²q + pq² = p²(p + q) + pq(p + q) = p² + pq = p(p + q) = p.

The complement event A^c is "first toss is a tail". We have P(A^c) = 1 − p = q.
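The notes contain no code, but the finite-space definitions above are easy to check mechanically. The following Python sketch (an illustration, not part of the original notes) enumerates the eight-outcome three-toss space and verifies that P("first toss is a head") = p:

```python
from itertools import product

def toss_space(p):
    """Assign P(ω) = p^{#H} q^{#T} to each of the 8 three-toss outcomes."""
    q = 1 - p
    return {"".join(t): p ** "".join(t).count("H") * q ** "".join(t).count("T")
            for t in product("HT", repeat=3)}

p = 0.6
space = toss_space(p)
# the probabilities of elementary outcomes sum to 1
assert abs(sum(space.values()) - 1.0) < 1e-12
# P(A) for A = "first toss is a head" equals p, as derived in Example 1.1
prob_A = sum(pr for w, pr in space.items() if w[0] == "H")
assert abs(prob_A - p) < 1e-12
```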
Example 1.2. Let a, b, c be three companies. Let (i, j, k) be the outcome that in 2014 company i makes more profit than company j and company j makes more profit than company k. Then the sample space is

    Ω = {(a,b,c), (a,c,b), (b,a,c), (b,c,a), (c,a,b), (c,b,a)}.

Define a probability measure on Ω by letting P(ω) = 1/6 for every ω ∈ Ω. Let A be the event that a makes the maximum profit in 2014. Then A = {(a,b,c), (a,c,b)} and P(A) = 1/6 + 1/6 = 1/3.

Example 1.3. Let the probability that the FTSE100 increases today be 0.52 and the probability that it increases tomorrow be 0.52 as well. Suppose that the probability that it increases both today and tomorrow is 0.28. What is the probability that the FTSE100 increases neither today nor tomorrow?

Solution. Let A be the event that the FTSE100 increases today and let B be the event that it increases tomorrow. We are given that P(A) = P(B) = 0.52 and P(A ∩ B) = 0.28, and we want to find P((A ∪ B)^c). Now P(A ∪ B) = P(A) + P(B) − P(A ∩ B) = 0.76, so P((A ∪ B)^c) = 1 − 0.76 = 0.24 is the sought answer.

Two events A and B are called independent if P(A ∩ B) = P(A)P(B). Note: independent does not mean disjoint. Disjoint (incompatible) events distinct from ∅, Ω are not independent! If P(B) > 0, the conditional probability of A given B is

    P(A | B) = P(A ∩ B) / P(B).

If A and B are events with P(B) > 0, then P(A) = P(A | B) is the same as saying that A and B are independent.

Exercise 1.4. Two dice are rolled.
1. Let A be the event "the first die turns up an odd number", B "the total is even". Are the events A and B independent? Give a detailed answer by calculating the probabilities.
2. Let C be "the total is seven", D "the first die turns up three". Calculate P(C | D), P(D | C).

More generally, two or more events A_1, A_2, ..., A_k are said to be (mutually) independent if

    P(∩_{j=1}^k B_j) = ∏_{j=1}^k P(B_j),

where each B_j is either A_j or A_j^c. Equivalently, if for every possible selection A_{i_1}, ..., A_{i_m} from the list A_1, ..., A_k it holds that

    P(A_{i_1} ∩ ... ∩ A_{i_m}) = P(A_{i_1}) ··· P(A_{i_m}).
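One way to approach Exercise 1.4 is by brute-force enumeration of the 36 equally likely outcomes. The Python sketch below (an illustration, not part of the original notes) does exactly that with exact rational arithmetic:

```python
from fractions import Fraction
from itertools import product

# all 36 equally likely outcomes of rolling two dice
outcomes = list(product(range(1, 7), repeat=2))
P = lambda ev: Fraction(sum(1 for o in outcomes if ev(o)), 36)

A = lambda o: o[0] % 2 == 1          # first die odd
B = lambda o: (o[0] + o[1]) % 2 == 0  # total even
C = lambda o: o[0] + o[1] == 7        # total is seven
D = lambda o: o[0] == 3               # first die is three

# part 1: P(A ∩ B) = P(A) P(B), so A and B are independent
assert P(lambda o: A(o) and B(o)) == P(A) * P(B)

# part 2: conditional probabilities P(C|D) and P(D|C)
P_C_given_D = P(lambda o: C(o) and D(o)) / P(D)
P_D_given_C = P(lambda o: C(o) and D(o)) / P(C)
assert P_C_given_D == Fraction(1, 6)
assert P_D_given_C == Fraction(1, 6)
```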
It is not enough for mutual independence that the events are pairwise independent. It might happen that P(A_i ∩ A_j) = P(A_i)P(A_j) for every pair i ≠ j, and yet the mutual independence fails. In particular, three events A, B, C are independent if the following four equalities hold:

    P(A ∩ B ∩ C) = P(A)P(B)P(C),
    P(A ∩ B) = P(A)P(B), P(A ∩ C) = P(A)P(C), P(B ∩ C) = P(B)P(C).

1.2 General probability spaces

Finite probability spaces are by far not sufficient to describe many real-life phenomena. Sometimes it is enough to adopt a countably infinite probability space, e.g. for the experiment of tossing a coin until the first head. More often, however, we need to model continuous quantities like velocity, stock prices, distance, etc., for which a larger probability space is required.

A general probability space is a triple (Ω, F, P), where Ω is a sample space of all possible outcomes, P is a probability measure, and F is a collection of events A ⊂ Ω for which the probability P(A) is defined. The simple rule of addition of probabilities needs to be replaced by a stronger countable-additivity rule

    P(∪_j A_j) = Σ_j P(A_j),

required to hold for pairwise disjoint events A_1, A_2, ... (with A_j ∈ F).

Example 1.5. For an infinite series of tosses of a coin, a natural sample space is

    Ω = {ω_1 ω_2 ... : ω_j ∈ {H, T}}.

An elementary outcome is an infinite sequence of heads and tails, which might start like THHHTTH.... Assuming that the coin has probability p of a head, we can calculate probabilities of events like

    A = {ω = ω_1 ω_2 ... : ω_1 = T, ω_2 = H, ω_3 = H}

(these are elements ω_1 ω_2 ... of Ω with a few fixed initial coordinates). Probabilities of more complex events are determined from these using the laws of probability (the addition rule and the complement rule).

Example 1.6. The experiment "random point in a circle" is described by a circle of radius 1, taken for Ω, the probability measure

    P(A) = (area of A) / π,

and F the system of sets to which an area can be assigned. We need a more complex probability space to model stock market data in time.
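The "random point in a circle" measure of Example 1.6 can be checked by Monte Carlo simulation. The sketch below (an illustration, not part of the original notes) samples uniform points in the unit disc by rejection and estimates the probability of landing within radius 1/2, whose exact value is (π/4)/π = 1/4:

```python
import random

random.seed(0)
n = 200_000
hits = inside = 0
# rejection sampling: draw from the square [-1,1]^2, keep points in the disc
while inside < n:
    x, y = random.uniform(-1, 1), random.uniform(-1, 1)
    r2 = x * x + y * y
    if r2 <= 1.0:
        inside += 1
        if r2 <= 0.25:      # event A: the point lies within radius 1/2
            hits += 1

estimate = hits / n
# P(A) = area(A)/π = (π · (1/2)²)/π = 1/4; Monte Carlo error is ~0.001 here
assert abs(estimate - 0.25) < 0.01
```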
1.3 Random Variables

A random variable is a numerical quantity X associated with a random experiment. Formally, X is an R-valued function defined on the sample space. (Sometimes it is useful to also permit the values ±∞.) A random variable is characterised by a probability distribution, which is described differently for discrete and continuous random variables.

We speak of a discrete random variable if X has a finite (or countably infinite) set of possible values {x_i}. The probability distribution in this case is specified by the probabilities of individual values, often denoted p_i = P(X = x_i) (another possible notation: p_X(i)). By P(X = x_i) we mean the probability of the event that X assumes the value x_i, so we may also write p_i = P({ω ∈ Ω : X(ω) = x_i}), but the argument ω in X(ω) is usually omitted. Sometimes the collection of probabilities p_i is called the probability function or probability mass function. For named discrete distributions the set {x_i} is some collection of integer numbers.

Example 1.7. Let X be the number of heads in a series of n coin tosses, with probability p for a head in each toss. Possible values of X are i = 0, 1, ..., n. For instance, for n = 3, X(ω) = 2 if ω ∈ {HHT, HTH, THH}, so P(X = 2) = 3p²q. The general formula is

    p_i = P(X = i) = C(n, i) p^i q^{n−i} for i = 0, ..., n,

where C(n, i) is the binomial coefficient. Due to its intrinsic connection to the binomial formula, this probability mass function is called the binomial distribution, denoted Binomial(n, p). In the case n = 1 this is the Bernoulli(p) distribution, with P(X = 1) = p and P(X = 0) = 1 − p.

Example 1.8. Let X be the number of tails before the first head. Possible values of X are i = 0, 1, ..., and

    p_i = P(X = i) = p q^i.

This is known as the geometric distribution (denoted Geometric(p)), due to the connection with the infinite geometric series

    Σ_{i=0}^∞ q^i = 1/(1 − q).

Note: sometimes the range of the geometric distribution is taken to be i = 1, 2, ..., interpreted as the possible number of trials to get the first head (including the trial with the head).

Example 1.9.
The Poisson distribution (denoted Poisson(λ)) with parameter λ > 0 is defined by the formula

    p_i = e^{−λ} λ^i / i!,  i = 0, 1, ....
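The named probability mass functions above, and the Poisson approximation to the binomial mentioned next, can be verified numerically. A Python sketch (an illustration, not part of the original notes):

```python
import math

def binom_pmf(n, p, i):
    """Binomial(n, p): P(X = i) = C(n, i) p^i q^(n-i)."""
    return math.comb(n, i) * p ** i * (1 - p) ** (n - i)

def poisson_pmf(lam, i):
    """Poisson(λ): P(X = i) = e^{-λ} λ^i / i!."""
    return math.exp(-lam) * lam ** i / math.factorial(i)

# Example 1.7 for n = 3: P(X = 2) = 3 p² q
p = 0.4
assert abs(binom_pmf(3, p, 2) - 3 * p**2 * (1 - p)) < 1e-12

# Poisson approximation: Binomial(n, λ/n) ≈ Poisson(λ) for large n
lam, n = 2.0, 10_000
for i in range(6):
    assert abs(binom_pmf(n, lam / n, i) - poisson_pmf(lam, i)) < 1e-3
```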
This distribution appears as an approximation to the binomial distribution Binomial(n, p) for n large and np ≈ λ.

For a discrete random variable X with P(X = x_i) = p_i, its expectation (aka expected value) is given by

    EX = Σ_i x_i p_i = Σ_i x_i P(X = x_i).

If this is an infinite series, the expectation is defined if Σ_i |x_i| p_i < ∞ (otherwise undefined). There is an equivalent formula in terms of the probabilities of elementary outcomes:

    EX = Σ_{ω∈Ω} X(ω) P(ω).

Example 1.10. Suppose that a certain company makes 1,000,000 with probability 1/4; loses 500,000 with probability 1/4; and makes 2,000,000 with probability 1/2. If X is the profit of the company, then X is a random variable with

    x_1 = 1,000,000, p_1 = 1/4;  x_2 = −500,000, p_2 = 1/4;  x_3 = 2,000,000, p_3 = 1/2.

In particular, the expected profit of the company in pounds is given by

    EX = x_1 p_1 + x_2 p_2 + x_3 p_3 = 250,000 − 125,000 + 1,000,000 = 1,125,000.

Using the notation ∼ for "distributed as", we have EX = np for X ∼ Binomial(n, p), EX = λ for X ∼ Poisson(λ), and EX = (1 − p)/p for X ∼ Geometric(p).

We speak of a continuous random variable when X takes values in R, R₊ (the set of nonnegative real numbers), or in some interval, and the probability of every individual value is zero. In this course, continuous rv's will only be considered with a probability density function (p.d.f., or just density) f_X, so that

    P(X ∈ [a, b]) = ∫_a^b f_X(x) dx.

The density is nonnegative, f_X(x) ≥ 0, and has total integral equal to 1:

    ∫_{−∞}^∞ f_X(x) dx = 1.
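The three expectation formulas just quoted can be confirmed by summing the pmf's directly (truncating the infinite supports far in the tail). A Python sketch, not part of the original notes:

```python
import math

def expect(pmf_pairs):
    """Expectation of a discrete rv given (value, probability) pairs."""
    return sum(x * p for x, p in pmf_pairs)

n, p, lam = 12, 0.3, 2.5

binom = [(i, math.comb(n, i) * p**i * (1 - p)**(n - i)) for i in range(n + 1)]
assert abs(expect(binom) - n * p) < 1e-12             # EX = np

# geometric support is infinite; q^2000 is negligible for p = 0.3
geom = [(i, p * (1 - p) ** i) for i in range(2000)]
assert abs(expect(geom) - (1 - p) / p) < 1e-9         # EX = (1-p)/p

poisson = [(i, math.exp(-lam) * lam**i / math.factorial(i)) for i in range(60)]
assert abs(expect(poisson) - lam) < 1e-9              # EX = λ
```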
The function

    F_X(x) = P(X ≤ x) = ∫_{−∞}^x f_X(y) dy

is called the (cumulative) distribution function; it is related to the density via the standard formulas of calculus:

    d/dx F_X(x) = f_X(x),    ∫_a^b f_X(x) dx = F_X(b) − F_X(a).

For a continuous rv the expected value is

    EX = ∫_{−∞}^∞ x f_X(x) dx

(it is well-defined if ∫ |x| f_X(x) dx < ∞).

Example 1.11. For A > 0 a random variable X is said to have a Cauchy distribution if its density is

    f_X(x) = A / (π(A² + x²)).

It turns out that the expectation for a Cauchy-distributed rv does not exist, because for x → ∞ it holds that |x| f_X(x) ∼ c/|x| (c a constant), hence

    E|X| = ∫_{−∞}^∞ |x| f_X(x) dx = 2 ∫_0^∞ A x / (π(A² + x²)) dx = ∞.

An important property of the expectation is linearity:

Proposition 1.12. If X_1, ..., X_m are random variables and α_1, ..., α_m are constants, then

    E(Σ_{j=1}^m α_j X_j) = Σ_{j=1}^m α_j EX_j.

Definition 1.13. The variance of a random variable X is given by

    Var(X) = E(X − EX)²,

or by the equivalent alternative formula

    Var(X) = EX² − (EX)².

The variance is well defined if EX² < ∞ (we sometimes say that the second moment EX² exists). The standard deviation is given by σ_X = √Var(X).
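The divergence of the Cauchy absolute moment can be seen numerically: the truncated integral of |x| f_X(x) over [−M, M] grows like (2A/π) log M and never settles down. A Python sketch (an illustration, not part of the original notes):

```python
import math

def truncated_abs_moment(A, M, steps=200_000):
    """Midpoint-rule integral of |x| · A / (π (A² + x²)) over [-M, M]."""
    h = 2 * M / steps
    total = 0.0
    for k in range(steps):
        x = -M + (k + 0.5) * h
        total += abs(x) * A / (math.pi * (A * A + x * x)) * h
    return total

# the truncated integrals grow without bound: ≈ (2/π) log 10 ≈ 1.47 per decade
vals = [truncated_abs_moment(1.0, M) for M in (10, 100, 1000)]
assert vals[0] < vals[1] < vals[2]
assert vals[2] - vals[1] > 1.0   # still accumulating mass, so E|X| = ∞
```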
Example 1.14. We say X is uniform on [0, 1], written X ∼ Uniform(0, 1), if

    f_X(x) = 1 if 0 ≤ x ≤ 1, and 0 otherwise.

The moments of X are

    E[X^k] = ∫ x^k f_X(x) dx = ∫_0^1 x^k dx = [x^{k+1}/(k + 1)]_{x=0}^{x=1} = 1/(k + 1).

From this, Var(X) = 1/3 − (1/2)² = 1/12.

Unlike the expectation, the variance is not linear, as the following result shows.

Lemma 1.15. If X is a random variable and a and b are constants, then

    Var(aX + b) = a² Var(X).

Example 1.16. A random variable X is said to be normal (aka normally distributed, or Gaussian) if X has pdf

    f_X(x) = 1/(√(2π) σ) exp(−(x − µ)² / (2σ²)).

The parameters µ ∈ R and σ² > 0 are the mean (expectation) and the variance of X, respectively. We write X ∼ N(µ, σ²). If µ = 0 and σ = 1, we say that X is standard normal, in which case the density is denoted

    φ(x) = 1/√(2π) exp(−x²/2).

The distribution function of a standard normal random variable is denoted

    Φ(x) = ∫_{−∞}^x φ(y) dy.

Although there is no elementary formula for Φ, the values can be found in statistical tables or via standard software. Due to the symmetry of the bell-shaped curve about zero (φ(x) = φ(−x)) we have

    Φ(x) = P(X ≤ x) = P(X ≥ −x) = 1 − P(X ≤ −x) = 1 − Φ(−x).

Moreover, for a ≤ b,

    P(a ≤ X ≤ b) = P(X ≤ b) − P(X ≤ a) = Φ(b) − Φ(a).

Every normal rv can be transformed to a standard normal rv as follows:

Lemma 1.17. If X ∼ N(µ, σ²) then (X − µ)/σ ∼ N(0, 1).
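In software, Φ is readily available through the error function: Φ(x) = (1 + erf(x/√2))/2. The Python sketch below (not part of the original notes) uses this to check the symmetry identity and the standardisation of Lemma 1.17:

```python
import math

def Phi(x):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# symmetry: Φ(-x) = 1 - Φ(x)
for x in (0.0, 0.5, 1.3, 2.0):
    assert abs(Phi(-x) - (1.0 - Phi(x))) < 1e-12

# P(a ≤ X ≤ b) for X ~ N(µ, σ²) by standardising: Φ((b-µ)/σ) - Φ((a-µ)/σ)
mu, sigma, a, b = 1.0, 2.0, 0.0, 3.0
prob = Phi((b - mu) / sigma) - Phi((a - mu) / sigma)
assert abs(prob - (Phi(1.0) - Phi(-0.5))) < 1e-12
```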
It is often necessary to derive the density of a rv Y = g(X) from the density of X, where g is some function.

Theorem 1.18 (Transformation Formula). Let X be a continuous rv and let Y = g(X), where g is a differentiable function which is (i) either strictly monotonically increasing (so g′(x) > 0 for all x ∈ R) (ii) or strictly monotonically decreasing (so g′(x) < 0 for all x ∈ R). Then

    f_Y(y) = f_X(g⁻¹(y)) |d/dy g⁻¹(y)| for all y for which g⁻¹(y) exists,
    f_Y(y) = 0 for all other y.

The lognormal distribution introduced in the next example is particularly important in finance.

Example 1.19. A rv Y is said to be lognormal with parameters µ and σ², where µ ∈ R and σ² > 0, if log Y ∼ N(µ, σ²). We write Y ∼ LogNormal(µ, σ²). Equivalently, Y satisfies Y = exp(X), where X ∼ N(µ, σ²). The expectation and the variance are

    EY = exp(µ + σ²/2) and Var(Y) = exp(2µ + σ²)(e^{σ²} − 1).

We now use the Transformation Formula to determine the density of the lognormal distribution:

    f_Y(y) = 1/(√(2π) σ y) exp(−(log y − µ)² / (2σ²)), y > 0

(f_Y(y) = 0 for y ≤ 0).

Proof. Write g(x) = e^x. Then Y = g(X). Now, if y = e^x, then x = log y, so

    g⁻¹(y) = log y for y > 0, and d/dy g⁻¹(y) = 1/y for y > 0.

Moreover, since X ∼ N(µ, σ²),

    f_X(x) = 1/(√(2π) σ) exp(−(x − µ)² / (2σ²)).

Using the Transformation Formula we see that if y > 0 then

    f_Y(y) = f_X(log y) · 1/y = 1/(√(2π) σ y) exp(−(log y − µ)² / (2σ²)),

while f_Y(y) = 0 if y ≤ 0.
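As a numerical sanity check of the derived density (a sketch, not part of the original notes), we can integrate f_Y on a grid and confirm that the total mass is 1 and the mean matches EY = exp(µ + σ²/2):

```python
import math

def lognormal_pdf(y, mu, sigma):
    """Density of LogNormal(µ, σ²) obtained via the Transformation Formula."""
    if y <= 0:
        return 0.0
    return math.exp(-(math.log(y) - mu) ** 2 / (2 * sigma ** 2)) / (
        math.sqrt(2 * math.pi) * sigma * y)

mu, sigma = 0.2, 0.5
h, M = 1e-3, 60.0   # midpoint rule on (0, M]; the tail beyond M is negligible
ys = [h * (k + 0.5) for k in range(int(M / h))]
mass = sum(lognormal_pdf(y, mu, sigma) * h for y in ys)
mean = sum(y * lognormal_pdf(y, mu, sigma) * h for y in ys)

assert abs(mass - 1.0) < 1e-3                              # density integrates to 1
assert abs(mean - math.exp(mu + sigma**2 / 2)) < 1e-3      # EY = exp(µ + σ²/2)
```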
We can make calculations with the lognormal distribution function using the tables for the normal Φ.

Example 1.20. Suppose Y ∼ LogNormal(µ, σ²) with µ = 0.20 and a given σ. Determine y such that P(Y ≤ y) = 0.95.

Solution. Note that P(Y ≤ y) = P(log Y ≤ log y), where log Y ∼ N(µ, σ²). Thus

    0.95 = P(Y ≤ y) = P(log Y ≤ log y) = P((log Y − µ)/σ ≤ (log y − µ)/σ) = Φ((log y − µ)/σ).

From the table for Φ we find

    (log y − µ)/σ = 1.645,

so y = exp(µ + 1.645σ).

1.4 Independence, covariance and correlation

We start with a general definition/theorem.

Definition 1.21. Two random variables X and Y are said to be independent if the events {X ≤ x} and {Y ≤ y} are independent for all x, y, that is,

    P(X ≤ x, Y ≤ y) = P(X ≤ x) P(Y ≤ y).

Equivalently, if for all functions g_1, g_2

    E[g_1(X) g_2(Y)] = E[g_1(X)] E[g_2(Y)]

(provided the expectations are defined). Furthermore, random variables X_1, X_2, ..., X_k are said to be independent if the events {X_1 ≤ x_1}, ..., {X_k ≤ x_k} are independent for all x_1, ..., x_k; equivalently,

    E[g_1(X_1) ··· g_k(X_k)] = E[g_1(X_1)] ··· E[g_k(X_k)]

for any functions g_1, ..., g_k.

In more practical terms, independence means: for discrete X, Y, that for all possible values x_i, y_j of X, Y

    P(X = x_i, Y = y_j) = P(X = x_i) P(Y = y_j),

and for continuous X, Y, that for all x, y

    f_{X,Y}(x, y) = f_X(x) f_Y(y),
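The factorisation criterion for discrete rv's is easy to test on small examples. The Python sketch below (an illustration, not part of the original notes) checks that the joint pmf of two independent fair dice factorises into its marginals, while that of a fully dependent pair does not:

```python
from fractions import Fraction
from itertools import product

# joint pmf of two independent fair dice
joint = {(i, j): Fraction(1, 36) for i, j in product(range(1, 7), repeat=2)}
px = {i: sum(p for (a, _), p in joint.items() if a == i) for i in range(1, 7)}
py = {j: sum(p for (_, b), p in joint.items() if b == j) for j in range(1, 7)}
# P(X = i, Y = j) = P(X = i) P(Y = j) for every pair: independent
assert all(joint[i, j] == px[i] * py[j] for i, j in joint)

# a dependent pair: Y = X; the joint pmf does NOT factorise
joint2 = {(i, i): Fraction(1, 6) for i in range(1, 7)}
qx = {i: Fraction(1, 6) for i in range(1, 7)}
assert joint2[1, 1] != qx[1] * qx[1]
```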
where the joint density f_{X,Y} is defined by

    P(X ∈ [a, b], Y ∈ [c, d]) = ∫_a^b ∫_c^d f_{X,Y}(x, y) dy dx.

Likewise, for independent X_1, ..., X_k the joint probability mass function (respectively, joint density function) factorises into marginal probability mass functions (respectively, density functions) in the discrete (respectively, continuous) case.

The following addition rules are useful:

Lemma 1.22. Let X and Y be independent rv's.
1. If X ∼ Binomial(n, p), Y ∼ Binomial(m, p) then X + Y ∼ Binomial(n + m, p).
2. If X ∼ Poisson(λ), Y ∼ Poisson(µ) then X + Y ∼ Poisson(λ + µ).
3. If X ∼ N(µ_1, σ_1²), Y ∼ N(µ_2, σ_2²) then X + Y ∼ N(µ_1 + µ_2, σ_1² + σ_2²).

A standard way to check the addition rules is to appeal to the characteristic function ϕ_X(t) = E e^{itX} (where i = √−1). If X and Y are independent, the product formula ϕ_{X+Y}(t) = ϕ_X(t) ϕ_Y(t) holds (the converse implication is not true in general).

Recall that the covariance of two rv's X and Y is defined by

    cov(X, Y) = E[(X − EX)(Y − EY)].

Note that cov(X, Y) = cov(Y, X) and that cov(X, X) = Var(X). The following is a useful reformulation:

    cov(X, Y) = E[XY] − EX · EY.

Covariance turns out to be linear in each of its arguments. We shall state a special case first.

Lemma 1.23. Let X_1, X_2 and Y be three rv's. Then

    cov(X_1 + X_2, Y) = cov(X_1, Y) + cov(X_2, Y).

Repeated application of the previous lemma, together with the fact that cov(X, Y) = cov(Y, X), yields the following result.

Proposition 1.24. Let X_i, i = 1, 2, ..., m and Y_j, j = 1, 2, ..., n be two sequences of random variables. Then

    cov(Σ_{i=1}^m X_i, Σ_{j=1}^n Y_j) = Σ_{i=1}^m Σ_{j=1}^n cov(X_i, Y_j).
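The bilinearity of covariance (Lemma 1.23) can be verified exactly on a small sample space. A Python sketch (not part of the original notes) using the eight-point space Ω = {−1, 1}³ with uniform measure:

```python
from itertools import product
from fractions import Fraction

# Ω = {-1, 1}^3 with the uniform measure P(ω) = 1/8
omega = list(product((-1, 1), repeat=3))
P = Fraction(1, 8)

def E(f):
    """Expectation via EX = Σ_ω X(ω) P(ω)."""
    return sum(f(w) * P for w in omega)

def cov(f, g):
    """cov(X, Y) = E[XY] - EX · EY."""
    return E(lambda w: f(w) * g(w)) - E(f) * E(g)

X1 = lambda w: w[0]
X2 = lambda w: w[0] + w[1]     # deliberately correlated with X1
Y  = lambda w: w[0] - w[2]

# Lemma 1.23: cov(X1 + X2, Y) = cov(X1, Y) + cov(X2, Y)
lhs = cov(lambda w: X1(w) + X2(w), Y)
assert lhs == cov(X1, Y) + cov(X2, Y)
```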
The correlation coefficient of two rv's X and Y is given by

    ρ_{X,Y} = cov(X, Y) / (σ(X) σ(Y)).

It is always the case that

    −1 ≤ ρ_{X,Y} ≤ 1

(provided the variances, hence the correlation coefficient, are defined). We now recall that two random variables X and Y are said to be uncorrelated if cov(X, Y) = 0.

Lemma 1.25. If two rv's X and Y are independent, then they are uncorrelated.

Proof. By independence, E[XY] = EX · EY.

Note: the converse of the lemma is false; that is, X and Y uncorrelated does not imply that X and Y are independent.

Proposition 1.26. For rv's X_1, X_2, ..., X_n we have

    Var(Σ_{i=1}^n X_i) = Σ_{i=1}^n Σ_{j=1}^n cov(X_i, X_j).

If the random variables X_1, X_2, ..., X_n are pairwise uncorrelated (that is, X_i and X_j are uncorrelated whenever i ≠ j), then

    Var(Σ_{i=1}^n X_i) = Σ_{i=1}^n Var(X_i).

Proposition 1.27. For pairwise independent rv's X_1, X_2, ..., X_m (i.e. such that X_i and X_j are independent for i ≠ j),

    Var(Σ_{j=1}^m X_j) = Σ_{j=1}^m Var(X_j).

1.5 The Law of Large Numbers and the Central Limit Theorem

Theorem 1.28. Suppose X_1, X_2, ... are independent identically distributed rv's with EX_i = µ. Let S_n = Σ_{j=1}^n X_j. Then with probability one

    lim_{n→∞} S_n / n = µ.

The normal distribution appears in applications due to the Central Limit Theorem (CLT), one of the main results of Probability Theory. The CLT quantifies how fast S_n/n converges to µ.
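A classical witness that the converse of Lemma 1.25 fails: take X uniform on {−1, 0, 1} and Y = X². They are uncorrelated yet clearly dependent. A Python sketch (not part of the original notes):

```python
from fractions import Fraction

# X uniform on {-1, 0, 1}; Y = X² is a function of X, hence dependent on it
support = [(-1, Fraction(1, 3)), (0, Fraction(1, 3)), (1, Fraction(1, 3))]

EX  = sum(x * p for x, p in support)             # = 0
EY  = sum(x * x * p for x, p in support)         # = 2/3
EXY = sum(x * (x * x) * p for x, p in support)   # E[XY] = E[X³] = 0

assert EXY - EX * EY == 0   # cov(X, Y) = 0: uncorrelated
# but P(X = 0, Y = 0) = 1/3, while P(X = 0) P(Y = 0) = (1/3)(1/3): dependent
assert Fraction(1, 3) != Fraction(1, 3) * Fraction(1, 3)
```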
Theorem 1.29 (Central Limit Theorem). Let X_1, X_2, ... be independent identically distributed rv's with mean EX_i = µ and variance Var(X_i) = σ², where µ ∈ R and σ² > 0. Define S_n = Σ_{i=1}^n X_i. Then

    lim_{n→∞} P((S_n − nµ)/(σ√n) ≤ x) = Φ(x) for all x ∈ R.

Example 1.30. Historically, the CLT was first shown for Bernoulli trials, as an approximation to the binomial distribution. In the n-times coin-tossing experiment set X_j = 1(ω_j = H), where ω_1...ω_n ∈ {H, T}^n (here 1(·) is 1 when the condition is true and 0 otherwise). Then S_n = X_1 + ··· + X_n is the number of heads in n tosses, and S_n ∼ Binomial(n, p). The CLT in this case says that (S_n − np)/√(npq) is approximately normally distributed for n large. The LLN only says that S_n/n is approximately p.

Example 1.31. The exact distribution of S_n is typically complicated. However, suppose X_1, X_2, ..., X_n are independent N(µ, σ²)-distributed rv's. Then, by the addition theorem for the normal distribution, S_n ∼ N(nµ, nσ²), and so in this case

    (S_n − nµ)/(σ√n) ∼ N(0, 1).

Example 1.32. Let X_j be Exponential(λ), i.e. with density f_{X_j}(x) = λ e^{−λx} for x > 0. The moments are EX_j = 1/λ, Var(X_j) = 1/λ². The sum S_n has a Gamma distribution with density

    f_{S_n}(x) = λ^n x^{n−1} e^{−λx} / (n − 1)!,  x > 0.

The CLT in this case says that the distribution of (λ S_n − n)/√n is approximately N(0, 1).

Gambler's ruin problem

The simple random walk provides a model of the wealth obtained by playing a head-or-tail game with unit bets. Let X_1, X_2, ... be iid (independent, identically distributed) rv's with P(X_j = 1) = P(X_j = −1) = 1/2. With some (integer) initial capital S_0, the random walk S_n = S_0 + X_1 + ··· + X_n models the fortune of a gambler in n rounds (with S_n − S_0 being the net winnings). If two players start with initial capital of A and B pounds, respectively, each betting a pound, what is the probability that either of them goes broke (so the other wins)? In terms of the random walk the question is: what is the probability that the random walk reaches level A before level −B? Consider the random time

    τ = min{n ≥ 0 : S_n = A or S_n = −B},
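The Bernoulli case of the CLT (Example 1.30) can be tested directly, since the exact binomial cdf is computable: for large n, P(S_n ≤ k) should be close to Φ((k − np)/√(npq)). A Python sketch, not part of the original notes:

```python
import math

def Phi(x):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

n, p = 1000, 0.5
q = 1 - p

def binom_cdf(k):
    """Exact P(S_n ≤ k) for S_n ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * q**(n - i) for i in range(k + 1))

# the normal approximation is accurate to ~0.01 at this n (no continuity correction)
for k in (480, 500, 520):
    z = (k - n * p) / math.sqrt(n * p * q)
    assert abs(binom_cdf(k) - Phi(z)) < 0.02
```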
and define

    S_τ = Σ_{n=0}^∞ S_n 1(τ = n)

to be the value of the random walk at time τ. Clearly, S_τ is either A or −B, and we want to determine the conditional probability

    P(S_τ = A | S_0 = 0).

Consider the more general quantity

    π(k) = P(S_τ = A | S_0 = k), −B ≤ k ≤ A,

the probability to reach A before −B when starting at S_0 = k. Looking at what happens at the first step, we get the recursion

    π(k) = ½ π(k − 1) + ½ π(k + 1), −B < k < A,

with the boundary conditions π(A) = 1, π(−B) = 0. From this equation, setting π(−B + 1) = x, we get π(−B + 2) = 2x, π(−B + 3) = 3x, ..., π(−B + (B + A)) = (B + A)x. From the boundary condition, x = 1/(A + B). Finally,

    π(0) = π(−B + B) = B / (A + B)

is the probability to reach A before −B, equal to the probability to get ruined when gambling with A pounds against a player who started with B pounds.
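The solution π(k) = (k + B)/(A + B) can be checked both exactly (it satisfies the recursion and the boundary conditions) and by a seeded Monte Carlo simulation of the walk. A Python sketch, not part of the original notes:

```python
import random
from fractions import Fraction

A, B = 3, 5
# exact solution of the recursion: π(k) = (k + B) / (A + B)
pi = {k: Fraction(k + B, A + B) for k in range(-B, A + 1)}
assert pi[A] == 1 and pi[-B] == 0                          # boundary conditions
assert all(pi[k] == (pi[k - 1] + pi[k + 1]) / 2            # first-step recursion
           for k in range(-B + 1, A))
assert pi[0] == Fraction(B, A + B)                         # the answer B/(A+B)

# Monte Carlo check of π(0) by simulating the walk until it hits A or -B
random.seed(1)
trials, wins = 50_000, 0
for _ in range(trials):
    s = 0
    while -B < s < A:
        s += random.choice((-1, 1))
    wins += (s == A)
assert abs(wins / trials - B / (A + B)) < 0.015
```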
More informationRecap of Basic Probability Theory
02407 Stochastic Processes? Recap of Basic Probability Theory Uffe Høgsbro Thygesen Informatics and Mathematical Modelling Technical University of Denmark 2800 Kgs. Lyngby Denmark Email: uht@imm.dtu.dk
More informationRandom Models. Tusheng Zhang. February 14, 2013
Random Models Tusheng Zhang February 14, 013 1 Introduction In this module, we will introduce some random models which have many real life applications. The course consists of four parts. 1. A brief review
More informationWeek 2. Review of Probability, Random Variables and Univariate Distributions
Week 2 Review of Probability, Random Variables and Univariate Distributions Probability Probability Probability Motivation What use is Probability Theory? Probability models Basis for statistical inference
More informationUniversity of Regina. Lecture Notes. Michael Kozdron
University of Regina Statistics 252 Mathematical Statistics Lecture Notes Winter 2005 Michael Kozdron kozdron@math.uregina.ca www.math.uregina.ca/ kozdron Contents 1 The Basic Idea of Statistics: Estimating
More informationChapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University
Chapter 3, 4 Random Variables ENCS6161 - Probability and Stochastic Processes Concordia University ENCS6161 p.1/47 The Notion of a Random Variable A random variable X is a function that assigns a real
More information2.1 Elementary probability; random sampling
Chapter 2 Probability Theory Chapter 2 outlines the probability theory necessary to understand this text. It is meant as a refresher for students who need review and as a reference for concepts and theorems
More informationDefinition: A random variable X is a real valued function that maps a sample space S into the space of real numbers R. X : S R
Random Variables Definition: A random variable X is a real valued function that maps a sample space S into the space of real numbers R. X : S R As such, a random variable summarizes the outcome of an experiment
More informationDeep Learning for Computer Vision
Deep Learning for Computer Vision Lecture 3: Probability, Bayes Theorem, and Bayes Classification Peter Belhumeur Computer Science Columbia University Probability Should you play this game? Game: A fair
More informationPart (A): Review of Probability [Statistics I revision]
Part (A): Review of Probability [Statistics I revision] 1 Definition of Probability 1.1 Experiment An experiment is any procedure whose outcome is uncertain ffl toss a coin ffl throw a die ffl buy a lottery
More information1 INFO Sep 05
Events A 1,...A n are said to be mutually independent if for all subsets S {1,..., n}, p( i S A i ) = p(a i ). (For example, flip a coin N times, then the events {A i = i th flip is heads} are mutually
More information1 Basic continuous random variable problems
Name M362K Final Here are problems concerning material from Chapters 5 and 6. To review the other chapters, look over previous practice sheets for the two exams, previous quizzes, previous homeworks and
More informationContinuous Random Variables and Continuous Distributions
Continuous Random Variables and Continuous Distributions Continuous Random Variables and Continuous Distributions Expectation & Variance of Continuous Random Variables ( 5.2) The Uniform Random Variable
More informationLecture 11. Probability Theory: an Overveiw
Math 408 - Mathematical Statistics Lecture 11. Probability Theory: an Overveiw February 11, 2013 Konstantin Zuev (USC) Math 408, Lecture 11 February 11, 2013 1 / 24 The starting point in developing the
More informationLecture notes for Part A Probability
Lecture notes for Part A Probability Notes written by James Martin, updated by Matthias Winkel Oxford, Michaelmas Term 017 winkel@stats.ox.ac.uk Version of 5 September 017 1 Review: probability spaces,
More informationMotivation and Applications: Why Should I Study Probability?
Motivation and Applications: Why Should I Study Probability? As stated by Laplace, Probability is common sense reduced to calculation. You need to first learn the theory required to correctly do these
More informationLecture 10: Probability distributions TUESDAY, FEBRUARY 19, 2019
Lecture 10: Probability distributions DANIEL WELLER TUESDAY, FEBRUARY 19, 2019 Agenda What is probability? (again) Describing probabilities (distributions) Understanding probabilities (expectation) Partial
More informationLecture Notes 2 Random Variables. Discrete Random Variables: Probability mass function (pmf)
Lecture Notes 2 Random Variables Definition Discrete Random Variables: Probability mass function (pmf) Continuous Random Variables: Probability density function (pdf) Mean and Variance Cumulative Distribution
More informationX 1 ((, a]) = {ω Ω : X(ω) a} F, which leads us to the following definition:
nna Janicka Probability Calculus 08/09 Lecture 4. Real-valued Random Variables We already know how to describe the results of a random experiment in terms of a formal mathematical construction, i.e. the
More informationM378K In-Class Assignment #1
The following problems are a review of M6K. M7K In-Class Assignment # Problem.. Complete the definition of mutual exclusivity of events below: Events A, B Ω are said to be mutually exclusive if A B =.
More informationProbability Models. 4. What is the definition of the expectation of a discrete random variable?
1 Probability Models The list of questions below is provided in order to help you to prepare for the test and exam. It reflects only the theoretical part of the course. You should expect the questions
More information6.1 Moment Generating and Characteristic Functions
Chapter 6 Limit Theorems The power statistics can mostly be seen when there is a large collection of data points and we are interested in understanding the macro state of the system, e.g., the average,
More informationSTT 441 Final Exam Fall 2013
STT 441 Final Exam Fall 2013 (12:45-2:45pm, Thursday, Dec. 12, 2013) NAME: ID: 1. No textbooks or class notes are allowed in this exam. 2. Be sure to show all of your work to receive credit. Credits are
More informationRandom Variables. Cumulative Distribution Function (CDF) Amappingthattransformstheeventstotherealline.
Random Variables Amappingthattransformstheeventstotherealline. Example 1. Toss a fair coin. Define a random variable X where X is 1 if head appears and X is if tail appears. P (X =)=1/2 P (X =1)=1/2 Example
More information2 (Statistics) Random variables
2 (Statistics) Random variables References: DeGroot and Schervish, chapters 3, 4 and 5; Stirzaker, chapters 4, 5 and 6 We will now study the main tools use for modeling experiments with unknown outcomes
More information18.440: Lecture 28 Lectures Review
18.440: Lecture 28 Lectures 17-27 Review Scott Sheffield MIT 1 Outline Continuous random variables Problems motivated by coin tossing Random variable properties 2 Outline Continuous random variables Problems
More informationLaws of Probability, Bayes theorem, and the Central Limit Theorem
Laws of, Bayes theorem, and the Central Limit Theorem 8th Penn State Astrostatistics School David Hunter Department of Statistics Penn State University Adapted from notes prepared by Rahul Roy and RL Karandikar,
More informationProbability Theory and Statistics. Peter Jochumzen
Probability Theory and Statistics Peter Jochumzen April 18, 2016 Contents 1 Probability Theory And Statistics 3 1.1 Experiment, Outcome and Event................................ 3 1.2 Probability............................................
More information1 Review of Probability
1 Review of Probability Random variables are denoted by X, Y, Z, etc. The cumulative distribution function (c.d.f.) of a random variable X is denoted by F (x) = P (X x), < x
More informationProbability Theory for Machine Learning. Chris Cremer September 2015
Probability Theory for Machine Learning Chris Cremer September 2015 Outline Motivation Probability Definitions and Rules Probability Distributions MLE for Gaussian Parameter Estimation MLE and Least Squares
More informationLecture 4: Probability and Discrete Random Variables
Error Correcting Codes: Combinatorics, Algorithms and Applications (Fall 2007) Lecture 4: Probability and Discrete Random Variables Wednesday, January 21, 2009 Lecturer: Atri Rudra Scribe: Anonymous 1
More information3 Multiple Discrete Random Variables
3 Multiple Discrete Random Variables 3.1 Joint densities Suppose we have a probability space (Ω, F,P) and now we have two discrete random variables X and Y on it. They have probability mass functions f
More information3. DISCRETE RANDOM VARIABLES
IA Probability Lent Term 3 DISCRETE RANDOM VARIABLES 31 Introduction When an experiment is conducted there may be a number of quantities associated with the outcome ω Ω that may be of interest Suppose
More informationBandits, Experts, and Games
Bandits, Experts, and Games CMSC 858G Fall 2016 University of Maryland Intro to Probability* Alex Slivkins Microsoft Research NYC * Many of the slides adopted from Ron Jin and Mohammad Hajiaghayi Outline
More informationBivariate distributions
Bivariate distributions 3 th October 017 lecture based on Hogg Tanis Zimmerman: Probability and Statistical Inference (9th ed.) Bivariate Distributions of the Discrete Type The Correlation Coefficient
More informationEE4601 Communication Systems
EE4601 Communication Systems Week 2 Review of Probability, Important Distributions 0 c 2011, Georgia Institute of Technology (lect2 1) Conditional Probability Consider a sample space that consists of two
More informationLaws of Probability, Bayes theorem, and the Central Limit Theorem
Laws of Probability, Bayes theorem, and the Central Limit Theorem 2016 Penn State Astrostatistics Summer School David Hunter Department of Statistics Penn State University Adapted from notes prepared by
More informationContinuous Random Variables
Continuous Random Variables Recall: For discrete random variables, only a finite or countably infinite number of possible values with positive probability. Often, there is interest in random variables
More informationCS37300 Class Notes. Jennifer Neville, Sebastian Moreno, Bruno Ribeiro
CS37300 Class Notes Jennifer Neville, Sebastian Moreno, Bruno Ribeiro 2 Background on Probability and Statistics These are basic definitions, concepts, and equations that should have been covered in your
More informationTom Salisbury
MATH 2030 3.00MW Elementary Probability Course Notes Part V: Independence of Random Variables, Law of Large Numbers, Central Limit Theorem, Poisson distribution Geometric & Exponential distributions Tom
More informationReview of Probability. CS1538: Introduction to Simulations
Review of Probability CS1538: Introduction to Simulations Probability and Statistics in Simulation Why do we need probability and statistics in simulation? Needed to validate the simulation model Needed
More informationLaws of Probability, Bayes theorem, and the Central Limit Theorem
Laws of, Bayes theorem, and the Central Limit Theorem 7th Penn State Astrostatistics School David Hunter Department of Statistics Penn State University Adapted from notes prepared by Rahul Roy and RL Karandikar,
More informationChapter 3: Random Variables 1
Chapter 3: Random Variables 1 Yunghsiang S. Han Graduate Institute of Communication Engineering, National Taipei University Taiwan E-mail: yshan@mail.ntpu.edu.tw 1 Modified from the lecture notes by Prof.
More informationDiscrete Random Variables
CPSC 53 Systems Modeling and Simulation Discrete Random Variables Dr. Anirban Mahanti Department of Computer Science University of Calgary mahanti@cpsc.ucalgary.ca Random Variables A random variable is
More informationLaws of Probability, Bayes theorem, and the Central Limit Theorem
Laws of, Bayes theorem, and the Central Limit Theorem 6th Penn State Astrostatistics School David Hunter Department of Statistics Penn State University Adapted from notes prepared by Rahul Roy and RL Karandikar,
More informationBrief Review of Probability
Brief Review of Probability Nuno Vasconcelos (Ken Kreutz-Delgado) ECE Department, UCSD Probability Probability theory is a mathematical language to deal with processes or experiments that are non-deterministic
More informationProbability and Statistics. Vittoria Silvestri
Probability and Statistics Vittoria Silvestri Statslab, Centre for Mathematical Sciences, University of Cambridge, Wilberforce Road, Cambridge CB3 0WB, UK Contents Preface 5 Chapter 1. Discrete probability
More informationBrief Review of Probability
Maura Department of Economics and Finance Università Tor Vergata Outline 1 Distribution Functions Quantiles and Modes of a Distribution 2 Example 3 Example 4 Distributions Outline Distribution Functions
More information7 Random samples and sampling distributions
7 Random samples and sampling distributions 7.1 Introduction - random samples We will use the term experiment in a very general way to refer to some process, procedure or natural phenomena that produces
More information