TMA4265: Stochastic Processes


1 General information

You find all important information on the course webpage: TMA4265: Stochastic Processes. Please check this website regularly!

Lecturers:
Lecture: Andrea Riebler, room 1242
Exercises: Petter Arnesen, room 1026

August 18th, 2014

Time and place:
Lecture: Monday 10:15-12:00 R9, Wednesday 10:15-12:00 R9
Exercises: Monday 9:15-10:00 S1

Course outline

Reference book: Ross, S. M., 2014, Introduction to Probability Models, 11th edition, Academic Press.

Rough course content:
Review of probability (ch. 1-3)
Markov chains (ch. 4)
Poisson processes (ch. 5)
Continuous-time Markov chains (ch. 6)
Queueing theory (ch. 8)
Brownian motion (ch. 10)
(Procedures for simulation of stochastic processes (ch. 11))

Exercises and project

There will be weekly exercises and one obligatory project.

Exercise sheets: Exercise problems will be of mixed type: theoretical questions to be solved with paper and pencil will be complemented with tasks that require programming. Programming can be done in either Matlab or R; in the lectures R will be used.

Exercise classes on Monday 9:15-10:00 in S1, given by Petter. Strongly recommended. Questions on the current exercise sheet will be answered, and new exercises will be presented on the blackboard.

2 Exercises and project

Exam: The exam will be at 09:00.

Project: The project is obligatory and has to be done in groups of two persons. You will have two weeks, of which one week is lecture-free. A time plan will be available soon. The project counts 20% of the final mark.

Examination aids: C
Calculator HP30S, CITIZEN SR-270X, CITIZEN SR-270X College, or Casio fx-82ES PLUS with empty memory.
Tabeller og formler i statistikk, Tapir forlag.
K. Rottmann: Matematisk formelsamling.
One yellow, stamped A5 sheet with your own handwritten formulas and notes.

Former exams and solutions can be found on:

Quality assurance: Reference group
At least three students, representing the major study programmes in the course, should participate. The reference group should:
have an ongoing dialogue with other students throughout the semester,
meet three times during the semester with the lecturer,
evaluate the course and write a report, which will be published at the end.
More information: groups+-+quality+assurance+of+education

Statistical software R
R is available for free download at The Comprehensive R Archive Network (CRAN) for Windows, Linux and Mac. RStudio is an integrated development environment (a system where you have your script, a run window, an overview of objects and a plotting window). A nice introduction to R is the book P. Dalgaard: Introductory Statistics with R, 2nd edition, Springer, which is also freely available to NTNU students as an ebook.

3 Probability and random variables - Review (Chapter 1 & 2)

Introduction: Sample space and events

One fundamental term in stochastics is the probability, such as the probability P(E) for the occurrence of an event E:
P(E) = 1: E certainly occurs
P(E) = 0: E certainly does NOT occur
P(E) = p ∈ (0, 1): E occurs with probability p

Suppose we perform an experiment whose outcome is not predictable in advance. However, let us suppose that the set of possible outcomes is known. This set is called the sample space S. An event E ⊆ S occurs when the outcome of the experiment lies in E.

Some notation:
E ∪ F: union of E and F
EF or E ∩ F: intersection of E and F
∅: null event. If E ∩ F = ∅, then E and F are mutually exclusive.
E^c: complement of E

Example: Rolling a die with six sides
Sample space: S = {1, 2, 3, 4, 5, 6}
Probabilities: P(1) = P(2) = ... = P(6) = 1/6
E = even number = {2, 4, 6} ⊆ S, so P(E) = P(2) + P(4) + P(6) = 1/2
F = divisible by 3 = {3, 6} ⊆ S, so P(F) = P(3) + P(6) = 1/3
E and F are independent, as P(E ∩ F) = P(6) = 1/6 = P(E) · P(F).
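The die example can also be checked empirically by simulation in R (the software used in the lectures); this is an illustrative sketch, not part of the slides:

```r
# Simulate rolls of a fair six-sided die and estimate P(E), P(F), P(E and F)
set.seed(1)
rolls <- sample(1:6, 1e5, replace = TRUE)

p_E  <- mean(rolls %% 2 == 0)   # E = even number, true value 1/2
p_F  <- mean(rolls %% 3 == 0)   # F = divisible by 3, true value 1/3
p_EF <- mean(rolls == 6)        # E intersect F = {6}, true value 1/6

# Independence: P(EF) should be close to P(E) * P(F)
c(p_EF, p_E * p_F)
```

With 100,000 rolls the two estimates agree to about two decimal places, matching the exact calculation above.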

4 Probabilities defined on events

For each event E ⊆ S you can define a number P(E) satisfying
(i) 0 ≤ P(E) ≤ 1
(ii) P(S) = 1
(iii) for E_1, E_2, ... that are mutually exclusive,
P(⋃_{n=1}^∞ E_n) = Σ_{n=1}^∞ P(E_n).
P(E) is called the probability of the event E.

Probability of the union of two events

We have
P(E) + P(F) = P(E ∪ F) + P(E ∩ F),
or equivalently
P(E ∪ F) = P(E) + P(F) − P(E ∩ F).
If E and F are mutually exclusive, then P(E ∪ F) = P(E) + P(F) (see Example 1.3). This can also be generalized to more than two events.

Sylvester-Poincaré formula

For arbitrary n ∈ ℕ and events E_1, E_2, ..., E_n ⊆ S:
P(E_1 ∪ E_2 ∪ ... ∪ E_n) = Σ_i P(E_i) − Σ_{i<j} P(E_i E_j) + Σ_{i<j<k} P(E_i E_j E_k) ∓ ... + (−1)^{n+1} P(E_1 E_2 ... E_n)
In particular, for E, F, G ⊆ S:
P(E ∪ F ∪ G) = P(E) + P(F) + P(G) − P(EF) − P(EG) − P(FG) + P(EFG)
[see illustration on the blackboard]

Conditional probabilities

Suppose we toss two fair dice, and the first die is a four. Given this information, what is the probability that the sum of the two dice equals six? The conditional probability P(E|F) is defined as the probability that event E occurs given that F has occurred. We can calculate it as
P(E|F) = P(EF) / P(F),  for P(F) > 0.
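The two-dice question above can be answered by enumerating the 36 equally likely outcomes; a small sketch in R (not from the slides):

```r
# All 36 equally likely outcomes of two fair dice
dice <- expand.grid(d1 = 1:6, d2 = 1:6)

F_event <- dice$d1 == 4               # first die shows a four
E_event <- dice$d1 + dice$d2 == 6     # sum of the two dice equals six

# P(E|F) = P(EF) / P(F): only the outcome (4, 2) lies in both events
p_cond <- sum(E_event & F_event) / sum(F_event)
p_cond   # 1/6
```

The answer is 1/6, since given the first die is a four, exactly one of the six equally likely values of the second die gives a sum of six.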

5 Multiplication theorem

For arbitrary events E_1, E_2, ..., E_n with P(E_1 E_2 ... E_n) > 0 we have
P(E_1 E_2 ... E_n) = P(E_1) · P(E_2|E_1) · P(E_3|E_1 E_2) ··· P(E_n|E_1 ... E_{n−1}),
where the right side can obviously be factorized in any other possible order. In particular, it follows that
P(E_1 E_2) = P(E_1) P(E_2|E_1)
P(E_1 E_2 E_3) = P(E_1) P(E_2|E_1) P(E_3|E_1 E_2)

Independent events

Two events E and F are said to be independent if
P(EF) = P(E) P(F).
That means P(E|F) = P(E): the knowledge that F has occurred does not affect the probability that E occurs. [See blackboard, Example 1.9]
Comment: It can be shown that then also E and F^c, E^c and F, and E^c and F^c are independent.

Law of total probability

Suppose F_1, F_2, ..., F_n are mutually exclusive such that ⋃_{i=1}^n F_i = S. Then for each E ⊆ S we have
P(E) = Σ_{i=1}^n P(E F_i) = Σ_{i=1}^n P(F_i) P(E|F_i).
Especially,
P(E) = P(F) P(E|F) + P(F^c) P(E|F^c),
because F and F^c build a mutually exclusive decomposition of S.

Bayes' formula

Bayes' formula relies on the asymmetry in the definition of conditional probabilities:
P(E|F) = P(EF)/P(F)  so that  P(EF) = P(E|F) P(F)
P(F|E) = P(EF)/P(E)  so that  P(EF) = P(F|E) P(E)
Hence
P(F|E) = P(E|F) P(F) / P(E) = P(E|F) P(F) / [P(E|F) P(F) + P(E|F^c) P(F^c)].
In general,
P(F_j|E) = P(E|F_j) P(F_j) / Σ_{i=1}^n P(E|F_i) P(F_i).
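Bayes' formula is what drives the diagnostic-test example treated on the blackboard. As an illustration with made-up numbers (the prevalence, sensitivity and false-positive rate below are assumptions, not values from the slides):

```r
# Hypothetical diagnostic test: F = "diseased", E = "test positive"
p_F          <- 0.01   # prevalence P(F)                 (assumed)
p_E_given_F  <- 0.95   # sensitivity P(E|F)              (assumed)
p_E_given_Fc <- 0.05   # false-positive rate P(E|F^c)    (assumed)

# Law of total probability: P(E) = P(E|F)P(F) + P(E|F^c)P(F^c)
p_E <- p_E_given_F * p_F + p_E_given_Fc * (1 - p_F)

# Bayes' formula: P(F|E)
p_F_given_E <- p_E_given_F * p_F / p_E
p_F_given_E   # about 0.16: with a rare disease, most positives are false
```

Even with a quite accurate test, the low prevalence makes P(F|E) surprisingly small, which is the usual point of this example.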

6 Random variables

Example: Diagnostic tests [see blackboard]

We shall not always be interested in an experiment itself, but rather in some consequence of its random outcome.

Example: Rolling two dice. Let X := sum of the two dice. [See blackboard]

Such consequences, when real-valued, may be thought of as functions which map S into the real line ℝ, and are called random variables.

Notation: We shall always use upper-case letters, such as X, Y, and Z, to represent generic random variables, and lower-case letters, such as x, y, and z, to represent possible numerical values of these variables.

Discrete random variables

A random variable X is discrete if it can take only a finite or countable number of values. We define the probability mass function p(a) of X by
p(a) = P(X = a).
The following properties have to be fulfilled:
Σ_i p(x_i) = 1,  p(x_i) ≥ 0.
The cumulative distribution function F can be expressed in terms of p(a) by
F(a) = P(X ≤ a) = Σ_{i: x_i ≤ a} p(x_i).

7 Properties of the cumulative distribution function (CDF)

F(x) is monotone increasing (a step function for discrete X).
F(x) is piecewise constant with jumps at the elements x_i where p(x_i) > 0.
lim_{x→∞} F(x) = 1.
lim_{x→−∞} F(x) = 0.

Example: Throwing a fair coin four times. Let X be the number of heads we observe. [see blackboard]

Discrete distributions

There are different common distribution functions to model different random processes. The simplest example is the Bernoulli distribution. A Bernoulli random variable can only take the values 0 and 1:
p(1) = P(X = 1) = p
p(0) = P(X = 0) = 1 − p,
where p ∈ [0, 1] is the parameter of the distribution. The CDF is given by
F(x) = 0 for x < 0;  1 − p for 0 ≤ x < 1;  1 for x ≥ 1.

The binomial random variable

Suppose that a sequence of n independent Bernoulli trials is performed. If X represents the number of successes that occur in the n trials, then X is said to be binomially distributed with parameters n ∈ ℕ and p ∈ [0, 1] (symbolically X ~ B(n, p)), with probability mass function
p(i) = (n choose i) p^i (1 − p)^{n−i},  i = 0, 1, ..., n,
where (n choose i) = n! / ((n − i)! i!).
In R: dbinom, pbinom, rbinom.
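As a small sketch of the R functions just named (the parameter values are illustrative, not from the slides):

```r
# pmf, CDF and random draws for X ~ B(n = 10, p = 0.3)
n <- 10; p <- 0.3

p3   <- dbinom(3, size = n, prob = p)        # P(X = 3)
F3   <- pbinom(3, size = n, prob = p)        # P(X <= 3)
F3b  <- sum(dbinom(0:3, size = n, prob = p)) # same value, from the CDF definition

set.seed(1)
m <- mean(rbinom(1e5, size = n, prob = p))   # simulated mean, close to n*p = 3
c(F3, F3b, m)
```

The agreement of F3 and F3b is exactly the relation F(a) = Σ_{i ≤ a} p(i) from the previous slide.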

8 Illustration: Binomial distribution

[Figure: pmf p(x) and CDF F(x) of the binomial distribution for n = 10 and n = 100, with p = 0.5 and p = 0.3]

The geometric random variable

An experiment with probability p ∈ [0, 1] of being a success is repeated independently until a success occurs. Let X denote the number of trials required until the first success; then X is said to be a geometric random variable with parameter p (symbolically X ~ G(p)). The probability mass function is given by
p(n) = P(X = n) = (1 − p)^{n−1} p,  n = 1, 2, ...
The CDF is given by
F(n) = P(X ≤ n) = 1 − P(X > n) = 1 − (1 − p)^n.
In R: dgeom, pgeom, rgeom.

The Poisson random variable

A random variable X taking on one of the values 0, 1, 2, ... is said to be a Poisson random variable with parameter λ (symbolically X ~ P(λ)) if, for some λ > 0,
p(i) = P(X = i) = (λ^i / i!) e^{−λ},  i = 0, 1, ...
The parameter λ reflects the rate or intensity with which the events in the underlying time interval occur.

Illustration: Poisson distribution

[Figure: pmf p(x) and CDF F(x) of the Poisson distribution for λ = 1 and λ = 3]
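One caveat when using the R functions named above: R's dgeom/pgeom parameterize the geometric distribution by the number of failures before the first success, whereas the slides count the number of trials X including the success, so an offset of one is needed. A quick sketch:

```r
p <- 0.3; n <- 5

# Slides: P(X = n) = (1 - p)^(n - 1) * p, with X = number of trials
p_manual <- (1 - p)^(n - 1) * p
# R's dgeom counts failures before the first success, hence n - 1
p_dgeom  <- dgeom(n - 1, prob = p)
c(p_manual, p_dgeom)                        # identical

# CDF: P(X <= n) = 1 - (1 - p)^n
c(1 - (1 - p)^n, pgeom(n - 1, prob = p))    # identical
```

Forgetting this offset is a common source of off-by-one errors in the programming exercises.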

9 Approximation of the binomial distribution

The binomial distribution B(n, p) with parameters n and p converges to the Poisson distribution with parameter λ if n → ∞ and p → 0 in such a way that λ = np remains constant. [proof: see blackboard]

Illustration (n = 10)
[Figure: binomial pmf versus Poisson pmf for n = 10 and p = 0.8, 0.5, 0.3, 0.1]

Illustration (n = 100)
[Figure: binomial pmf versus Poisson pmf for n = 100 and p = 0.8, 0.5, 0.3, 0.1]

Definition of continuous random variables

Idea: A random variable X is called continuous if, for arbitrary values a < b from the support of X, every intermediate value in the interval [a, b] is possible.
Problem: How to compute P(a ≤ X ≤ b) if there are uncountably many points in the interval [a, b]?
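The quality of the approximation that the figures illustrate can also be inspected numerically; a sketch in R with illustrative values of n and p:

```r
# Compare B(n, p) with P(lambda = n * p) for large n and small p
n <- 100; p <- 0.03; lambda <- n * p

x <- 0:10
binom_pmf   <- dbinom(x, size = n, prob = p)
poisson_pmf <- dpois(x, lambda = lambda)

# Maximum pointwise difference is small when n is large and p is small
max_diff <- max(abs(binom_pmf - poisson_pmf))
max_diff
```

Repeating this with n = 10 and p = 0.3 (same λ = 3) gives a noticeably larger difference, in line with the figures.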

10 Continuous distributions

A random variable X whose set of possible values is uncountable is called a continuous random variable. A random variable is called continuous if there exists a function f(x) ≥ 0 such that the cumulative distribution function F(x) can be written as
F(a) = P(X ≤ a) = ∫_{−∞}^{a} f(x) dx.
Some consequences:
P(X = a) = ∫_a^a f(x) dx = 0
P(X ∈ B) = ∫_B f(x) dx
∫_{−∞}^{∞} f(x) dx = 1

The CDF F(x) of continuous random variables

Properties:
F(a) is a nondecreasing function of a, i.e. if x < y then F(x) ≤ F(y)
lim_{a→−∞} F(a) = 0,  lim_{a→∞} F(a) = 1
(d/da) F(a) = f(a)
P(a < X ≤ b) = F(b) − F(a) = ∫_a^b f(x) dx
P(X > a) = 1 − F(a)

Normalizing constant

A normalizing constant c is a multiplicative term in f(x) which does not depend on x. The remaining term is called the core:
f(x) = c · g(x),  where g(x) is the core.
We often write f(x) ∝ g(x).

The uniform random variable

A random variable X is a uniform random variable on the interval [α, β] if
f(x) = 1/(β − α) if α < x < β, and 0 otherwise.
The CDF is given by
F(x) = 0 for x ≤ α;  (x − α)/(β − α) for α < x < β;  1 for x ≥ β.
In R: dunif(x, min=a, max=b), punif, qunif, runif.

11 Illustration of the uniform distribution

[Figure: density f(x) and CDF F(x) of the uniform distribution for a = 2 and b = 6]

Exponential random variables

A random variable X is an exponential random variable with parameter λ > 0 (symbolically X ~ E(λ)) if
f(x) = λ exp(−λx) if x ≥ 0, and 0 else.
The CDF is given by
F(x) = 1 − exp(−λx) if x ≥ 0, and 0 if x < 0.
In R: dexp(x, rate=lambda), pexp, qexp, rexp.

Illustration of the exponential distribution

[Figure: probability density function and CDF of the exponential distribution for λ = 0.9, 0.5, 0.3]

Property of the exponential distribution

The exponential distribution is closely related to the Poisson distribution: the number of events in an interval is Poisson distributed with parameter λ if the time intervals between subsequent events are independent and exponentially distributed with rate λ.
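This link between exponential waiting times and Poisson counts can be illustrated by simulation; a sketch in R, assuming a unit-length interval so that the count is Poisson(λ):

```r
# If inter-event times are iid Exp(lambda), the number of events in [0, 1]
# is Poisson(lambda). Simulate many unit intervals and compare moments.
set.seed(1)
lambda <- 4
n_sim  <- 1e4

count_events <- function() {
  t <- 0; k <- 0
  repeat {
    t <- t + rexp(1, rate = lambda)   # next exponential waiting time
    if (t > 1) return(k)              # passed the end of the interval
    k <- k + 1
  }
}
counts <- replicate(n_sim, count_events())

c(mean(counts), var(counts))   # both close to lambda = 4, as for Poisson
```

This construction is exactly the Poisson process of chapter 5, seen here only through the count in a single interval.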

12 Gamma random variables

A random variable X is a gamma random variable with shape parameter α > 0 and rate parameter β > 0 (symbolically X ~ G(α, β)) if its density is given by
f(x) = (β^α / Γ(α)) x^{α−1} exp(−βx) for x ≥ 0, and 0 else.
Here Γ(α) denotes the gamma function
Γ(α) = ∫_0^∞ x^{α−1} exp(−x) dx,
for which Γ(x + 1) = x! for x ∈ ℕ_0 and Γ(x + 1) = x Γ(x).
In R: dgamma(x, shape = α, rate = β), pgamma(...), rgamma(...).

Illustration of the gamma distribution

[Figure: probability density function and CDF of the gamma distribution for α = 2.0, 1.2 and β = 3, 6]

Properties of the gamma distribution

For α = 1 the gamma distribution is equal to the exponential distribution with parameter λ = β.
For α = d/2 with d ∈ ℕ and β = 1/2 the gamma distribution is equal to the χ²-distribution with d degrees of freedom.

Normal random variables

We say that X is a normal random variable with parameters µ ∈ ℝ and σ² ∈ ℝ₊ (symbolically X ~ N(µ, σ²)) if its density is
f(x) = (1 / (√(2π) σ)) exp(−(x − µ)² / (2σ²)) for x ∈ ℝ.
The density function is a bell-shaped curve, symmetric around µ. For µ = 0 and σ² = 1 the distribution is called the standard normal distribution.
If X ~ N(µ, σ²), then Y = αX + β ~ N(αµ + β, (ασ)²).
In R: dnorm(x, mean, sd), pnorm(...), rnorm(...).
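The two special cases listed above can be verified directly with the R density functions just mentioned; a small sketch:

```r
x <- seq(0.1, 5, by = 0.1)

# alpha = 1: gamma density equals the exponential density with lambda = beta
stopifnot(all.equal(dgamma(x, shape = 1, rate = 2), dexp(x, rate = 2)))

# alpha = d/2, beta = 1/2: gamma density equals the chi-squared density
# with d degrees of freedom
d <- 5
stopifnot(all.equal(dgamma(x, shape = d/2, rate = 1/2), dchisq(x, df = d)))
```

Both checks pass exactly, since the χ² and exponential densities are literally special cases of the gamma density above.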

13 Illustration of the normal distribution

[Figure: probability density function and CDF of the normal distribution for different values of µ and σ, including µ = 0, σ = 1 and µ = 2, σ = 1]

Expectation of a random variable

To describe a probability distribution one distinguishes between
location parameters (expected value, median, mode) and
scale parameters (variance, standard deviation).
The expected value E(X) of a discrete random variable X is defined as
E(X) = Σ_{x: p(x)>0} x p(x).
In the continuous case we have
E(X) = ∫_{−∞}^{∞} x f(x) dx.
[see blackboard for the Poisson case]

Examples

Discrete distributions:
Binomial: E(X) = np
Geometric: E(X) = 1/p
Poisson: E(X) = λ
Continuous distributions:
Uniform: E(X) = (β + α)/2
Exponential: E(X) = 1/λ
Normal: E(X) = µ

Transformations of random variables

We are often interested in transformations of random variables, where a transformation is any function of the random variables. Examples:
Changing units, e.g. changing temperature scales from Celsius to Fahrenheit.
Changing scales, e.g. from exp to log.
Computing ratios: X/Y.
Computing non-linear arithmetics.

14 Expectation of a function of a random variable

Let X be a discrete random variable and g a real-valued function; then
E[g(X)] = Σ_{x: p(x)>0} g(x) p(x).
Let X be a continuous random variable and g a real-valued function; then
E[g(X)] = ∫_{−∞}^{∞} g(x) f(x) dx.

Possible linear transformations:
adding a constant: X + b
multiplying by a constant: aX
both: aX + b
Example: [°F] = (9/5) [°C] + 32. In R: 9/5*x + 32.

Linear transformation of the mean

If a and b are constants, then E(aX + b) = a E(X) + b.
Proof (continuous case). Set g(X) = aX + b:
E(aX + b) = ∫ (ax + b) f(x) dx = a ∫ x f(x) dx + b ∫ f(x) dx = a E(X) + b · 1 = a E(X) + b.

Variance of a random variable

The variance of a random variable X, denoted by Var(X), is defined as
Var(X) = E[(X − E[X])²].
For simpler calculation you can use
Var(X) = E[X²] − E[X]².
Proof.
Var(X) = E[(X − E[X])²]
= E[X² − 2X E[X] + E[X]²]
= E[X²] − E[2X E[X]] + E[E[X]²]
= E[X²] − 2 E[X] E[X] + E[X]²
= E[X²] − E[X]².
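The variance shortcut can be checked numerically; a sketch in R with a small discrete distribution (the values and probabilities are made up for illustration):

```r
# A small discrete distribution: values x with probabilities p
x <- c(1, 2, 5, 10)
p <- c(0.4, 0.3, 0.2, 0.1)

EX  <- sum(x * p)      # E[X],   using E[g(X)] = sum g(x) p(x) with g(x) = x
EX2 <- sum(x^2 * p)    # E[X^2], same formula with g(x) = x^2

var_def      <- sum((x - EX)^2 * p)   # E[(X - E[X])^2], the definition
var_shortcut <- EX2 - EX^2            # E[X^2] - E[X]^2, the shortcut
c(var_def, var_shortcut)              # equal, as the proof shows
```

Both routes give 7.6 here; the shortcut only needs the first two moments.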

15 Property of the variance

Var(aX + b) = a² Var(X) for all a, b ∈ ℝ.
The variance is unchanged by addition/subtraction of a constant!

Jointly distributed random variables

The joint cumulative probability distribution function of two continuous random variables X and Y is given by
F(x, y) = P(X ≤ x, Y ≤ y),  −∞ < x, y < ∞.
Alternatively one can use the joint probability density function f(x, y) to get the joint cumulative probability distribution function as
F(x, y) = ∫_{u=−∞}^{x} ∫_{v=−∞}^{y} f(u, v) dv du.
Thus
∂²F(x, y) / (∂x ∂y) = f(x, y).

Jointly distributed random variables (II)

The marginal density function of X (or Y) can be obtained by
f_X(x) = ∫_{−∞}^{+∞} f(x, y) dy  or  f_Y(y) = ∫_{−∞}^{+∞} f(x, y) dx.
Further,
E[g(X, Y)] = ∫∫ g(x, y) f(x, y) dx dy.
For example,
E[aX + bY] = a E(X) + b E(Y),
or, more generally,
E[Σ_{i=1}^n a_i X_i] = Σ_{i=1}^n a_i E[X_i].

Example: Binomial distribution

The expected value of a binomially distributed random variable X must be E[X] = np, as X can be represented as a sum of n independent Bernoulli-distributed random variables X_i ~ B(p), with i = 1, ..., n. For each X_i we have E[X_i] = p, so that
E[Σ_{i=1}^n X_i] = Σ_{i=1}^n E[X_i] = np.

16 Independent random variables

The random variables X and Y are said to be independent if, for all a, b,
P(X ≤ a, Y ≤ b) = P(X ≤ a) P(Y ≤ b).
In terms of the joint distribution function F of X and Y:
F(a, b) = F_X(a) F_Y(b) for all a, b.
When X and Y are discrete we have
p(x, y) = p_X(x) p_Y(y),
or f(x, y) = f_X(x) f_Y(y) in the continuous case.

Expectation under independence

If X and Y are independent, then for any functions h and g
E[g(X) h(Y)] = E[g(X)] E[h(Y)].
Further, Var(X + Y) = Var(X) + Var(Y).

Covariance

A measure of the linear stochastic dependency of two random variables X and Y is the covariance
Cov(X, Y) = E[(X − E[X])(Y − E[Y])]
= E[XY − X E[Y] − Y E[X] + E[X] E[Y]]
= E[XY] − E[X] E[Y].
If X and Y are independent, it follows that Cov(X, Y) = 0. A positive value of Cov(X, Y) is an indication that Y tends to increase as X does, whereas a negative value indicates that Y tends to decrease as X increases.

Properties of the covariance

Cov(X, X) = Var(X)
Cov(X, Y) = Cov(Y, X)
Cov(a + bX, c + dY) = b d Cov(X, Y)
Cov(X, Y + Z) = Cov(X, Y) + Cov(X, Z)
Thus
Cov(Σ_{i=1}^n X_i, Σ_{j=1}^m Y_j) = Σ_{i=1}^n Σ_{j=1}^m Cov(X_i, Y_j).
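These covariance rules hold exactly for the sample covariance as well, which gives a quick sanity check in R (the data below are illustrative, not from the slides):

```r
set.seed(1)
x <- rnorm(10); y <- rnorm(10); z <- rnorm(10)

# Cov(X, X) = Var(X), and symmetry Cov(X, Y) = Cov(Y, X)
stopifnot(all.equal(cov(x, x), var(x)))
stopifnot(all.equal(cov(x, y), cov(y, x)))

# Cov(a + bX, c + dY) = b * d * Cov(X, Y): constants drop out
stopifnot(all.equal(cov(1 + 2 * x, 3 - 4 * y), 2 * (-4) * cov(x, y)))

# Bilinearity: Cov(X, Y + Z) = Cov(X, Y) + Cov(X, Z)
stopifnot(all.equal(cov(x, y + z), cov(x, y) + cov(x, z)))
```

All assertions pass, because the sample covariance is itself a bilinear, symmetric form in the data.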

17 Variance of the sum of random variables

Var(Σ_{i=1}^n X_i) = Cov(Σ_{i=1}^n X_i, Σ_{j=1}^n X_j)
= Σ_{i=1}^n Σ_{j=1}^n Cov(X_i, X_j)
= Σ_{i=1}^n Cov(X_i, X_i) + Σ_{i=1}^n Σ_{j≠i} Cov(X_i, X_j)
= Σ_{i=1}^n Var(X_i) + 2 Σ_{j<i} Cov(X_i, X_j).
If X_i, i = 1, ..., n, are independent, then
Var(Σ_{i=1}^n X_i) = Σ_{i=1}^n Var(X_i).

Convolution

Let X and Y be independent random variables with probability mass functions p_X(x) and p_Y(y), and let Z = X + Y. Then the probability mass function p_Z(z) is called the convolution of p_X(x) and p_Y(y):
P(X + Y = z) = Σ_x P(X = x, X + Y = z)
= Σ_x P(X = x, Y = z − x)
(by independence) = Σ_x P(X = x) P(Y = z − x)
= Σ_x p_X(x) p_Y(z − x),
the convolution of p_X and p_Y.

Limit theorems

(Proofs: see blackboard.)

Markov's inequality: If X is a random variable that takes only nonnegative values, then for any value a > 0,
P(X ≥ a) ≤ E(X)/a.  (1)

Chebyshev's inequality: If X is a random variable with mean µ and variance σ², then for any value k > 0,
P(|X − µ| ≥ k) ≤ σ²/k².  (2)

With (1) and (2), bounds on probabilities can be computed knowing only the mean, or the mean and variance, of the probability distribution.

Strong law of large numbers: Let X_1, X_2, ... be a sequence of independent random variables having a common distribution, and let E[X_i] = µ. Then, with probability 1,
(X_1 + X_2 + ... + X_n)/n → µ as n → ∞.
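The convolution sum can be carried out directly in R for two small pmfs. As a sketch, convolving two independent fair dice gives the distribution of their sum (a standard example, not taken from the slides):

```r
# pmfs of two independent fair dice on the values 1..6
p_X <- rep(1/6, 6)   # p_X[x] = P(X = x) for x = 1..6
p_Y <- p_X

# p_Z(z) = sum_x p_X(x) * p_Y(z - x), for z = 2, ..., 12
p_Z <- sapply(2:12, function(z) {
  x <- 1:6
  valid <- (z - x) >= 1 & (z - x) <= 6    # keep terms with p_Y(z - x) > 0
  sum(p_X[x[valid]] * p_Y[z - x[valid]])
})
names(p_Z) <- 2:12

round(p_Z, 4)   # triangular shape, peaking at z = 7 with probability 6/36
sum(p_Z)        # a valid pmf: sums to 1
```

This reproduces the familiar triangular distribution of the sum of two dice from the earlier two-dice example.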

18 Illustration

Arithmetic mean of 5000 standard normally distributed random variables:

par(cex.lab=1.4, cex.axis=1.3, cex.main=1.4, lwd=2)
x <- rnorm(5000)
plot(cumsum(x)/1:5000, type="l", xlab="n", ylab="arithmetic mean", ylim=c(-1,1))
abline(h=0, lty=2, col="red")

[Figure: the running arithmetic mean settles around 0 as n grows]

Central limit theorem

The central limit theorem says that the arithmetic mean, appropriately standardized, of arbitrary independent and identically distributed random variables converges to the standard normal distribution.

Standardization

A random variable X is called standardized if E(X) = µ = 0 and Var(X) = σ² = 1. Every random variable X with finite expected value E(X) = µ and finite variance Var(X) = σ² can be standardized using a linear transformation. Define
X̃ = (X − µ)/σ.
Then, obviously, E(X̃) = (E(X) − µ)/σ = 0 and Var(X̃) = Var(X)/σ² = 1.

Central limit theorem

Let X_1, X_2, ... be a sequence of independent, identically distributed random variables, each with mean µ and variance σ². Then the distribution of
(X_1 + X_2 + ... + X_n − nµ) / (σ √n)
tends to the standard normal as n → ∞. This holds for any distribution of the X_i's. Alternatively, X_1 + ... + X_n ≈ N(nµ, nσ²) as n → ∞.
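The convergence claimed by the central limit theorem can itself be illustrated by simulation; a sketch in R using exponential summands (the choice of distribution and the sample sizes are illustrative):

```r
# CLT: standardized sums of iid Exp(1) variables approach N(0, 1)
set.seed(1)
n <- 50; n_sim <- 1e4
mu <- 1; sigma <- 1          # mean and standard deviation of Exp(1)

sums <- replicate(n_sim, sum(rexp(n, rate = 1)))
z    <- (sums - n * mu) / (sigma * sqrt(n))   # standardized sums

c(mean(z), sd(z))    # close to 0 and 1
mean(z <= 1.96)      # roughly pnorm(1.96) = 0.975
```

Even though Exp(1) is strongly skewed, the standardized sums are already close to standard normal at n = 50, which is the point of the theorem.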

19 Stochastic processes

A stochastic process {X(t), t ∈ T} is a collection of random variables, i.e. for each t ∈ T, X(t) is a random variable. The index t is often the time, so that X(t) is referred to as the state at time t.
When T is a countable set, we have a discrete-time process.
When T is an interval of the real line, we have a continuous-time process.
The state space is defined as the set of all possible values that X(t) can assume. We consider processes with discrete random variables X(t). Different dependence relations among the random variables X(t) might be assumed.


More information

Part IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015

Part IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015 Part IA Probability Definitions Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.

More information

1 Presessional Probability

1 Presessional Probability 1 Presessional Probability Probability theory is essential for the development of mathematical models in finance, because of the randomness nature of price fluctuations in the markets. This presessional

More information

Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016

Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 8. For any two events E and F, P (E) = P (E F ) + P (E F c ). Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 Sample space. A sample space consists of a underlying

More information

Dept. of Linguistics, Indiana University Fall 2015

Dept. of Linguistics, Indiana University Fall 2015 L645 Dept. of Linguistics, Indiana University Fall 2015 1 / 34 To start out the course, we need to know something about statistics and This is only an introduction; for a fuller understanding, you would

More information

2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1).

2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1). Name M362K Final Exam Instructions: Show all of your work. You do not have to simplify your answers. No calculators allowed. There is a table of formulae on the last page. 1. Suppose X 1,..., X 1 are independent

More information

Analysis of Engineering and Scientific Data. Semester

Analysis of Engineering and Scientific Data. Semester Analysis of Engineering and Scientific Data Semester 1 2019 Sabrina Streipert s.streipert@uq.edu.au Example: Draw a random number from the interval of real numbers [1, 3]. Let X represent the number. Each

More information

Computer Science, Informatik 4 Communication and Distributed Systems. Simulation. Discrete-Event System Simulation. Dr.

Computer Science, Informatik 4 Communication and Distributed Systems. Simulation. Discrete-Event System Simulation. Dr. Simulation Discrete-Event System Simulation Chapter 4 Statistical Models in Simulation Purpose & Overview The world the model-builder sees is probabilistic rather than deterministic. Some statistical model

More information

Topic 3: The Expectation of a Random Variable

Topic 3: The Expectation of a Random Variable Topic 3: The Expectation of a Random Variable Course 003, 2017 Page 0 Expectation of a discrete random variable Definition (Expectation of a discrete r.v.): The expected value (also called the expectation

More information

R Functions for Probability Distributions

R Functions for Probability Distributions R Functions for Probability Distributions Young W. Lim 2018-03-22 Thr Young W. Lim R Functions for Probability Distributions 2018-03-22 Thr 1 / 15 Outline 1 R Functions for Probability Distributions Based

More information

IAM 530 ELEMENTS OF PROBABILITY AND STATISTICS LECTURE 3-RANDOM VARIABLES

IAM 530 ELEMENTS OF PROBABILITY AND STATISTICS LECTURE 3-RANDOM VARIABLES IAM 530 ELEMENTS OF PROBABILITY AND STATISTICS LECTURE 3-RANDOM VARIABLES VARIABLE Studying the behavior of random variables, and more importantly functions of random variables is essential for both the

More information

18.440: Lecture 28 Lectures Review

18.440: Lecture 28 Lectures Review 18.440: Lecture 28 Lectures 17-27 Review Scott Sheffield MIT 1 Outline Continuous random variables Problems motivated by coin tossing Random variable properties 2 Outline Continuous random variables Problems

More information

Lecture 10: Probability distributions TUESDAY, FEBRUARY 19, 2019

Lecture 10: Probability distributions TUESDAY, FEBRUARY 19, 2019 Lecture 10: Probability distributions DANIEL WELLER TUESDAY, FEBRUARY 19, 2019 Agenda What is probability? (again) Describing probabilities (distributions) Understanding probabilities (expectation) Partial

More information

Class 26: review for final exam 18.05, Spring 2014

Class 26: review for final exam 18.05, Spring 2014 Probability Class 26: review for final eam 8.05, Spring 204 Counting Sets Inclusion-eclusion principle Rule of product (multiplication rule) Permutation and combinations Basics Outcome, sample space, event

More information

3. Probability and Statistics

3. Probability and Statistics FE661 - Statistical Methods for Financial Engineering 3. Probability and Statistics Jitkomut Songsiri definitions, probability measures conditional expectations correlation and covariance some important

More information

Chapter 3 Single Random Variables and Probability Distributions (Part 1)

Chapter 3 Single Random Variables and Probability Distributions (Part 1) Chapter 3 Single Random Variables and Probability Distributions (Part 1) Contents What is a Random Variable? Probability Distribution Functions Cumulative Distribution Function Probability Density Function

More information

TMA4265 Stochastic processes ST2101 Stochastic simulation and modelling

TMA4265 Stochastic processes ST2101 Stochastic simulation and modelling Norwegian University of Science and Technology Department of Mathematical Sciences Page of 7 English Contact during examination: Øyvind Bakke Telephone: 73 9 8 26, 99 4 673 TMA426 Stochastic processes

More information

Probability Review. Gonzalo Mateos

Probability Review. Gonzalo Mateos Probability Review Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ September 11, 2018 Introduction

More information

Introduction to Machine Learning

Introduction to Machine Learning Introduction to Machine Learning Introduction to Probabilistic Methods Varun Chandola Computer Science & Engineering State University of New York at Buffalo Buffalo, NY, USA chandola@buffalo.edu Chandola@UB

More information

Quick Tour of Basic Probability Theory and Linear Algebra

Quick Tour of Basic Probability Theory and Linear Algebra Quick Tour of and Linear Algebra Quick Tour of and Linear Algebra CS224w: Social and Information Network Analysis Fall 2011 Quick Tour of and Linear Algebra Quick Tour of and Linear Algebra Outline Definitions

More information

EXAM IN COURSE TMA4265 STOCHASTIC PROCESSES Wednesday 7. August, 2013 Time: 9:00 13:00

EXAM IN COURSE TMA4265 STOCHASTIC PROCESSES Wednesday 7. August, 2013 Time: 9:00 13:00 Norges teknisk naturvitenskapelige universitet Institutt for matematiske fag Page 1 of 7 English Contact: Håkon Tjelmeland 48 22 18 96 EXAM IN COURSE TMA4265 STOCHASTIC PROCESSES Wednesday 7. August, 2013

More information

STAT2201. Analysis of Engineering & Scientific Data. Unit 3

STAT2201. Analysis of Engineering & Scientific Data. Unit 3 STAT2201 Analysis of Engineering & Scientific Data Unit 3 Slava Vaisman The University of Queensland School of Mathematics and Physics What we learned in Unit 2 (1) We defined a sample space of a random

More information

STT 441 Final Exam Fall 2013

STT 441 Final Exam Fall 2013 STT 441 Final Exam Fall 2013 (12:45-2:45pm, Thursday, Dec. 12, 2013) NAME: ID: 1. No textbooks or class notes are allowed in this exam. 2. Be sure to show all of your work to receive credit. Credits are

More information

1 Review of Probability and Distributions

1 Review of Probability and Distributions Random variables. A numerically valued function X of an outcome ω from a sample space Ω X : Ω R : ω X(ω) is called a random variable (r.v.), and usually determined by an experiment. We conventionally denote

More information

Chapter 4. Probability-The Study of Randomness

Chapter 4. Probability-The Study of Randomness Chapter 4. Probability-The Study of Randomness 4.1.Randomness Random: A phenomenon- individual outcomes are uncertain but there is nonetheless a regular distribution of outcomes in a large number of repetitions.

More information

BASICS OF PROBABILITY

BASICS OF PROBABILITY October 10, 2018 BASICS OF PROBABILITY Randomness, sample space and probability Probability is concerned with random experiments. That is, an experiment, the outcome of which cannot be predicted with certainty,

More information

Lecture 4: Proofs for Expectation, Variance, and Covariance Formula

Lecture 4: Proofs for Expectation, Variance, and Covariance Formula Lecture 4: Proofs for Expectation, Variance, and Covariance Formula by Hiro Kasahara Vancouver School of Economics University of British Columbia Hiro Kasahara (UBC) Econ 325 1 / 28 Discrete Random Variables:

More information

Chapter 2. Discrete Distributions

Chapter 2. Discrete Distributions Chapter. Discrete Distributions Objectives ˆ Basic Concepts & Epectations ˆ Binomial, Poisson, Geometric, Negative Binomial, and Hypergeometric Distributions ˆ Introduction to the Maimum Likelihood Estimation

More information

Random Variables Example:

Random Variables Example: Random Variables Example: We roll a fair die 6 times. Suppose we are interested in the number of 5 s in the 6 rolls. Let X = number of 5 s. Then X could be 0, 1, 2, 3, 4, 5, 6. X = 0 corresponds to the

More information

Probability Theory Review

Probability Theory Review Cogsci 118A: Natural Computation I Lecture 2 (01/07/10) Lecturer: Angela Yu Probability Theory Review Scribe: Joseph Schilz Lecture Summary 1. Set theory: terms and operators In this section, we provide

More information

System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models

System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models Fatih Cavdur fatihcavdur@uludag.edu.tr March 29, 2014 Introduction Introduction The world of the model-builder

More information

Chp 4. Expectation and Variance

Chp 4. Expectation and Variance Chp 4. Expectation and Variance 1 Expectation In this chapter, we will introduce two objectives to directly reflect the properties of a random variable or vector, which are the Expectation and Variance.

More information

Advanced Herd Management Probabilities and distributions

Advanced Herd Management Probabilities and distributions Advanced Herd Management Probabilities and distributions Anders Ringgaard Kristensen Slide 1 Outline Probabilities Conditional probabilities Bayes theorem Distributions Discrete Continuous Distribution

More information

SDS 321: Introduction to Probability and Statistics

SDS 321: Introduction to Probability and Statistics SDS 321: Introduction to Probability and Statistics Lecture 10: Expectation and Variance Purnamrita Sarkar Department of Statistics and Data Science The University of Texas at Austin www.cs.cmu.edu/ psarkar/teaching

More information

6 The normal distribution, the central limit theorem and random samples

6 The normal distribution, the central limit theorem and random samples 6 The normal distribution, the central limit theorem and random samples 6.1 The normal distribution We mentioned the normal (or Gaussian) distribution in Chapter 4. It has density f X (x) = 1 σ 1 2π e

More information

LIST OF FORMULAS FOR STK1100 AND STK1110

LIST OF FORMULAS FOR STK1100 AND STK1110 LIST OF FORMULAS FOR STK1100 AND STK1110 (Version of 11. November 2015) 1. Probability Let A, B, A 1, A 2,..., B 1, B 2,... be events, that is, subsets of a sample space Ω. a) Axioms: A probability function

More information

Discrete Random Variables

Discrete Random Variables CPSC 53 Systems Modeling and Simulation Discrete Random Variables Dr. Anirban Mahanti Department of Computer Science University of Calgary mahanti@cpsc.ucalgary.ca Random Variables A random variable is

More information

Brief Review of Probability

Brief Review of Probability Maura Department of Economics and Finance Università Tor Vergata Outline 1 Distribution Functions Quantiles and Modes of a Distribution 2 Example 3 Example 4 Distributions Outline Distribution Functions

More information

Overview. CSE 21 Day 5. Image/Coimage. Monotonic Lists. Functions Probabilistic analysis

Overview. CSE 21 Day 5. Image/Coimage. Monotonic Lists. Functions Probabilistic analysis Day 5 Functions/Probability Overview Functions Probabilistic analysis Neil Rhodes UC San Diego Image/Coimage The image of f is the set of values f actually takes on (a subset of the codomain) The inverse

More information

Introduction to Probability and Stocastic Processes - Part I

Introduction to Probability and Stocastic Processes - Part I Introduction to Probability and Stocastic Processes - Part I Lecture 1 Henrik Vie Christensen vie@control.auc.dk Department of Control Engineering Institute of Electronic Systems Aalborg University Denmark

More information

CSE 312 Final Review: Section AA

CSE 312 Final Review: Section AA CSE 312 TAs December 8, 2011 General Information General Information Comprehensive Midterm General Information Comprehensive Midterm Heavily weighted toward material after the midterm Pre-Midterm Material

More information

UNIT 4 MATHEMATICAL METHODS SAMPLE REFERENCE MATERIALS

UNIT 4 MATHEMATICAL METHODS SAMPLE REFERENCE MATERIALS UNIT 4 MATHEMATICAL METHODS SAMPLE REFERENCE MATERIALS EXTRACTS FROM THE ESSENTIALS EXAM REVISION LECTURES NOTES THAT ARE ISSUED TO STUDENTS Students attending our mathematics Essentials Year & Eam Revision

More information

Stochastic Processes. Review of Elementary Probability Lecture I. Hamid R. Rabiee Ali Jalali

Stochastic Processes. Review of Elementary Probability Lecture I. Hamid R. Rabiee Ali Jalali Stochastic Processes Review o Elementary Probability bili Lecture I Hamid R. Rabiee Ali Jalali Outline History/Philosophy Random Variables Density/Distribution Functions Joint/Conditional Distributions

More information

Review of Probability Theory

Review of Probability Theory Review of Probability Theory Arian Maleki and Tom Do Stanford University Probability theory is the study of uncertainty Through this class, we will be relying on concepts from probability theory for deriving

More information

Probability Theory. Introduction to Probability Theory. Principles of Counting Examples. Principles of Counting. Probability spaces.

Probability Theory. Introduction to Probability Theory. Principles of Counting Examples. Principles of Counting. Probability spaces. Probability Theory To start out the course, we need to know something about statistics and probability Introduction to Probability Theory L645 Advanced NLP Autumn 2009 This is only an introduction; for

More information

3 Multiple Discrete Random Variables

3 Multiple Discrete Random Variables 3 Multiple Discrete Random Variables 3.1 Joint densities Suppose we have a probability space (Ω, F,P) and now we have two discrete random variables X and Y on it. They have probability mass functions f

More information

Probability Theory and Statistics. Peter Jochumzen

Probability Theory and Statistics. Peter Jochumzen Probability Theory and Statistics Peter Jochumzen April 18, 2016 Contents 1 Probability Theory And Statistics 3 1.1 Experiment, Outcome and Event................................ 3 1.2 Probability............................................

More information

Chap 2.1 : Random Variables

Chap 2.1 : Random Variables Chap 2.1 : Random Variables Let Ω be sample space of a probability model, and X a function that maps every ξ Ω, toa unique point x R, the set of real numbers. Since the outcome ξ is not certain, so is

More information

1 Random Variable: Topics

1 Random Variable: Topics Note: Handouts DO NOT replace the book. In most cases, they only provide a guideline on topics and an intuitive feel. 1 Random Variable: Topics Chap 2, 2.1-2.4 and Chap 3, 3.1-3.3 What is a random variable?

More information

A Probability Primer. A random walk down a probabilistic path leading to some stochastic thoughts on chance events and uncertain outcomes.

A Probability Primer. A random walk down a probabilistic path leading to some stochastic thoughts on chance events and uncertain outcomes. A Probability Primer A random walk down a probabilistic path leading to some stochastic thoughts on chance events and uncertain outcomes. Are you holding all the cards?? Random Events A random event, E,

More information

Chapter 2. Some Basic Probability Concepts. 2.1 Experiments, Outcomes and Random Variables

Chapter 2. Some Basic Probability Concepts. 2.1 Experiments, Outcomes and Random Variables Chapter 2 Some Basic Probability Concepts 2.1 Experiments, Outcomes and Random Variables A random variable is a variable whose value is unknown until it is observed. The value of a random variable results

More information

PROBABILITY AND INFORMATION THEORY. Dr. Gjergji Kasneci Introduction to Information Retrieval WS

PROBABILITY AND INFORMATION THEORY. Dr. Gjergji Kasneci Introduction to Information Retrieval WS PROBABILITY AND INFORMATION THEORY Dr. Gjergji Kasneci Introduction to Information Retrieval WS 2012-13 1 Outline Intro Basics of probability and information theory Probability space Rules of probability

More information

Lecture 10: Bayes' Theorem, Expected Value and Variance Lecturer: Lale Özkahya

Lecture 10: Bayes' Theorem, Expected Value and Variance Lecturer: Lale Özkahya BBM 205 Discrete Mathematics Hacettepe University http://web.cs.hacettepe.edu.tr/ bbm205 Lecture 10: Bayes' Theorem, Expected Value and Variance Lecturer: Lale Özkahya Resources: Kenneth Rosen, Discrete

More information

MULTIVARIATE PROBABILITY DISTRIBUTIONS

MULTIVARIATE PROBABILITY DISTRIBUTIONS MULTIVARIATE PROBABILITY DISTRIBUTIONS. PRELIMINARIES.. Example. Consider an experiment that consists of tossing a die and a coin at the same time. We can consider a number of random variables defined

More information

Midterm #1. Lecture 10: Joint Distributions and the Law of Large Numbers. Joint Distributions - Example, cont. Joint Distributions - Example

Midterm #1. Lecture 10: Joint Distributions and the Law of Large Numbers. Joint Distributions - Example, cont. Joint Distributions - Example Midterm #1 Midterm 1 Lecture 10: and the Law of Large Numbers Statistics 104 Colin Rundel February 0, 01 Exam will be passed back at the end of class Exam was hard, on the whole the class did well: Mean:

More information

SDS 321: Introduction to Probability and Statistics

SDS 321: Introduction to Probability and Statistics SDS 321: Introduction to Probability and Statistics Lecture 13: Expectation and Variance and joint distributions Purnamrita Sarkar Department of Statistics and Data Science The University of Texas at Austin

More information

Random Variable. Discrete Random Variable. Continuous Random Variable. Discrete Random Variable. Discrete Probability Distribution

Random Variable. Discrete Random Variable. Continuous Random Variable. Discrete Random Variable. Discrete Probability Distribution Random Variable Theoretical Probability Distribution Random Variable Discrete Probability Distributions A variable that assumes a numerical description for the outcome of a random eperiment (by chance).

More information

Probability Models. 4. What is the definition of the expectation of a discrete random variable?

Probability Models. 4. What is the definition of the expectation of a discrete random variable? 1 Probability Models The list of questions below is provided in order to help you to prepare for the test and exam. It reflects only the theoretical part of the course. You should expect the questions

More information

1 Basic continuous random variable problems

1 Basic continuous random variable problems Name M362K Final Here are problems concerning material from Chapters 5 and 6. To review the other chapters, look over previous practice sheets for the two exams, previous quizzes, previous homeworks and

More information

Expectation of Random Variables

Expectation of Random Variables 1 / 19 Expectation of Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay February 13, 2015 2 / 19 Expectation of Discrete

More information

CSCI-6971 Lecture Notes: Probability theory

CSCI-6971 Lecture Notes: Probability theory CSCI-6971 Lecture Notes: Probability theory Kristopher R. Beevers Department of Computer Science Rensselaer Polytechnic Institute beevek@cs.rpi.edu January 31, 2006 1 Properties of probabilities Let, A,

More information