Chapter 2. Discrete Distributions

Objectives

• Basic Concepts & Expectations
• Binomial, Poisson, Geometric, Negative Binomial, and Hypergeometric Distributions
• Introduction to Maximum Likelihood Estimation
• Basic Bivariate Distributions: joint, marginal & conditional pdf

Discrete Distributions

The (theoretical) population mean, population variance, and population sd (standard deviation) are:

µ = (1/N) Σ_{i=1}^{N} x_i = Σ_x x f(x),   σ² = Σ_x (x − µ)² f(x),   and   σ = √(σ²)

The sample mean, sample variance, and sample sd from a dataset x_1, ..., x_n:

x̄ = (1/n) Σ_{i=1}^{n} x_i,   s² = (1/(n − 1)) Σ_{i=1}^{n} (x_i − x̄)²,   s = √(s²)

or equivalently, from a frequency table with k distinct values x_1, ..., x_k and frequencies f_1, ..., f_k:

x̄ = (1/n) Σ_{i=1}^{k} f_i x_i,   s² = (1/(n − 1)) Σ_{i=1}^{k} f_i (x_i − x̄)²,   s = √(s²)

Note also:

σ² = Σ (x − µ)² f(x) = Σ (x² − 2µx + µ²) f(x) = Σ x² f(x) − 2µ Σ x f(x) + µ² Σ f(x) = Σ x² f(x) − µ²   (Why?)

Σ x² f(x) is called the 2nd moment about the origin.

E 1.
1. Find the mean and sd of the following observations: …, 4, 6, 5.
2. Let f(x) = x/6, x = 1, 2, 3, be the pmf of X. Find the mean (µ) and the sd (σ) of X.
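
A quick numerical check of E 1.2 in R (the same style as the "Answer by R" examples later in this chapter); it simply evaluates µ and σ directly from the pmf f(x) = x/6 on x = 1, 2, 3.

x  <- 1:3
fx <- x / 6                       # pmf values; note they sum to 1
mu <- sum(x * fx)                 # population mean: sum of x * f(x)
sigma2 <- sum((x - mu)^2 * fx)    # population variance
c(mean = mu, sd = sqrt(sigma2))   # mu = 7/3, sigma = sqrt(5/9), about 0.745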

E 2. Let the random variable X = the number of rolls of a regular die until the first 5 or 6. The probability of rolling a 5 or 6 is 1/3, thus the pmf of X is written as:

f(x) = P(X = x) = (2/3)^(x−1) (1/3),   x = 1, 2, 3, ...

Find (a) the mean (µ) and (b) the sd (σ) of X.

Answer:

µ = Σ_x x f(x) = 1(1/3) + 2(2/3)(1/3) + 3(2/3)²(1/3) + ⋯

You will see quite a few infinite sums of this kind. Here is how to find the answer easily. First, let's call a = 1/3. Then

µ = a + 2(2/3)a + 3(2/3)²a + ⋯

Write one more line like it, multiplied by 2/3:

(2/3)µ = (2/3)a + 2(2/3)²a + 3(2/3)³a + ⋯

Subtract; we get

(1/3)µ = a{1 + (2/3) + (2/3)² + ⋯} = a · 1/(1 − 2/3) = 3a,   so   µ = 9a = 3.

Remark: The textbook introduces another way of handling this kind of infinite sum. First recall the Taylor series expansion:

f(x) = f(a) + f′(a)(x − a)/1! + f″(a)(x − a)²/2! + ⋯

Next, apply the Taylor series expansion to f(x) = (1 − x)^(−2) around 0; we get

(1 − x)^(−2) = 1 + 2x + 3x² + ⋯

Now, notice that we have µ = (1/3){1 + 2(2/3) + 3(2/3)² + ⋯}. The term in braces is (1 − r)^(−2), where r = 2/3. That is, µ = (1/3)(1 − 2/3)^(−2) = 3.

E(X²) = Σ_x x² f(x) = 1²(1/3) + 2²(2/3)(1/3) + 3²(2/3)²(1/3) + ⋯

Rewrite as

E(X²) = (1/3){1 + 4(2/3) + 9(2/3)² + 16(2/3)³ + ⋯}

Write one more line like before:

(2/3)E(X²) = (1/3){(2/3) + 4(2/3)² + 9(2/3)³ + ⋯}

Subtract the last two equations; we get

(1/3)E(X²) = (1/3){1 + 3(2/3) + 5(2/3)² + 7(2/3)³ + ⋯},   i.e.,   E(X²) = 1 + 3(2/3) + 5(2/3)² + 7(2/3)³ + ⋯

Write one more line again like before:

(2/3)E(X²) = (2/3) + 3(2/3)² + 5(2/3)³ + ⋯

Subtract the last equation from the one above; we get

(1/3)E(X²) = 1 + 2(2/3) + 2(2/3)² + 2(2/3)³ + ⋯ = 1 + 2 · (2/3)/(1 − 2/3) = 5

so E(X²) = 15,   and   σ² = E(X²) − µ² = 15 − 9 = 6.

Basel Problem:   Σ_{n=1}^{∞} 1/n² = 1 + 1/2² + 1/3² + ⋯ = π²/6

We begin with the Taylor series expansion of sin(x):

sin(x) = x − x³/3! + x⁵/5! − x⁷/7! + ⋯

Divide both sides by x:

sin(x)/x = 1 − x²/3! + x⁴/5! − x⁶/7! + ⋯

Notice that the nonzero roots of sin(x) are x = ±nπ (i.e., sin(±nπ) = 0), so the above expression must have factors like

sin(x)/x = (1 − x/π)(1 + x/π)(1 − x/(2π))(1 + x/(2π))(1 − x/(3π))(1 + x/(3π)) ⋯
         = (1 − x²/π²)(1 − x²/(4π²))(1 − x²/(9π²)) ⋯

Comparing the coefficients of the x² term, we get

1/3! = 1/π² + 1/(4π²) + 1/(9π²) + ⋯ = (1/π²) Σ_{n=1}^{∞} 1/n²,   so   Σ_{n=1}^{∞} 1/n² = π²/6

E 3. Find a constant c so that f(x) = c/x², x = ±1, ±2, ..., is a pmf.

Definition 1. (Mathematical Expectation)

Univariate case: X ~ f(x), a pdf for a continuous rv or a pmf for a discrete rv:

E(X) = ∫ x f(x) dx   (continuous)   or   Σ_x x f(x)   (discrete)

Multivariate case: X = (X_1, X_2, ..., X_n) ~ f(x_1, x_2, ..., x_n), a pdf or pmf. For a function u,

E{u(X_1, X_2, ..., X_n)} = ∫⋯∫ u(x_1, ..., x_n) f(x_1, ..., x_n) dx_1 ⋯ dx_n   or   Σ ⋯ Σ u(x_1, ..., x_n) f(x_1, ..., x_n)

Properties 1. If k is a constant, E(k) = k.

Properties 2. If k is a constant and v(x) is a function, then E{k v(X)} = k E{v(X)}. This can be extended to E{Σ_{i=1}^{m} k_i v_i(X)} = Σ_{i=1}^{m} k_i E{v_i(X)}.

Proof. E{k v(X)} = Σ_x k v(x) f(x) = k Σ_x v(x) f(x) = k E{v(X)}, and the extension follows the same way, term by term.
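
A small R check of Properties 1–2, reusing the pmf from E 1.2; the particular functions v(x) = x and w(x) = x², and the constants 2 and 3, are just an illustrative choice.

x  <- 1:3
fx <- x / 6
Ev <- function(g) sum(g(x) * fx)                   # E{g(X)} for a discrete pmf
lhs <- Ev(function(t) 2 * t + 3 * t^2)             # E{2X + 3X^2}
rhs <- 2 * Ev(identity) + 3 * Ev(function(t) t^2)  # 2 E(X) + 3 E(X^2)
c(lhs = lhs, rhs = rhs)                            # both equal 68/3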

E 4. Let the random variable X have the pmf f(x) = (1/2)^x, x = 1, 2, 3, .... Find µ and σ.

Answer: You can use the same method shown before, or use the Taylor series expansions of

(1 − x)^(−2) = 1 + 2x + 3x² + 4x³ + ⋯   and   (1 − x)^(−3) = 1 + 3x + 6x² + 10x³ + ⋯

µ = E(X) = Σ_x x f(x) = (1/2){1 + 2(1/2) + 3(1/2)² + ⋯} = (1/2)(1 − 1/2)^(−2) = 2

E{X(X + 1)} = Σ_x x(x + 1) f(x) = (1/2){2 + 6(1/2) + 12(1/2)² + 20(1/2)³ + ⋯} = (1/2) · 2(1 − 1/2)^(−3) = 8

So, E(X²) = 6   (Why?)

σ² = E(X²) − {E(X)}² = 6 − 4 = 2,   so   σ = √2.

Some special mathematical expectations

• Mean value of X:   E(X) = µ = ∫ x f(x) dx   or   Σ_x x f(x)
• Variance of X:   σ² = E{(X − µ)²} = ∫ (x − µ)² f(x) dx   or   Σ_x (x − µ)² f(x)
• Moment generating function (mgf) of X:   M(t) = E(e^(tX)) = ∫ e^(tx) f(x) dx   or   Σ_x e^(tx) f(x)

Related facts

• σ² = E(X²) − {E(X)}²
• (sd)   σ = √(σ²)
• Not every distribution has an mgf. Suppose M_X(t) = M_Y(t) for |t| < h and some h > 0; then P_X(·) = P_Y(·), i.e., F_X(z) = F_Y(z) for all z. This is called the uniqueness of the mgf, i.e., the mgf uniquely determines the distribution.
• M′(0) = E(X), M″(0) = E(X²), ..., M^(k)(0) = E(X^k). The last part is because

d/dt M(t) = d/dt E(e^(tX)) = E(X e^(tX)),   which equals E(X) at t = 0, and so on for higher derivatives.

E 5. {Cauchy distribution} X has pdf f(x) = 1/(π(1 + x²)), −∞ < x < ∞. Then both E(X) and M_X(t) do NOT exist. (Why?)

E(X) = ∫_{−∞}^{∞} x/(π(1 + x²)) dx   (Does it exist?)

= lim_{b→∞} ∫_0^b x/(π(1 + x²)) dx + lim_{a→−∞} ∫_a^0 x/(π(1 + x²)) dx
= lim_{b→∞} (1/(2π)){log(1 + b²)} − lim_{a→−∞} (1/(2π)){log(1 + a²)} = ∞ − ∞,

so the integral is not defined and E(X) does not exist.

Binomial distribution

Definition 2. X ~ binomial(n, p) ⇔ X has a binomial distribution with parameters n & p (n = 1, 2, ...; 0 ≤ p ≤ 1):

f(x) = C(n, x) p^x (1 − p)^(n−x),   x = 0, 1, ..., n

(here C(n, x) = n!/(x!(n − x)!) denotes the binomial coefficient). X ~ binomial(1, p) is particularly called a Bernoulli random variable, i.e., P(X = 1) = p = 1 − P(X = 0).

Properties 3. X ~ binomial(n, p) ⇒ M_X(t) = (p e^t + q)^n, q = 1 − p.

Proof.

M_X(t) = E(e^(tX)) = Σ_{x=0}^{n} e^(tx) C(n, x) p^x (1 − p)^(n−x) = Σ_{x=0}^{n} C(n, x) (p e^t)^x (1 − p)^(n−x) = (p e^t + q)^n
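
As a quick sanity check of the mgf facts above, the R sketch below differentiates the binomial mgf numerically at t = 0 and compares the result with np; the values n = 8 and p = 0.3 are arbitrary illustrative choices.

n <- 8; p <- 0.3; q <- 1 - p
M   <- function(t) (p * exp(t) + q)^n        # mgf from Properties 3
h   <- 1e-5
dM0 <- (M(h) - M(-h)) / (2 * h)              # central-difference estimate of M'(0)
c(numerical = dM0, np = n * p)               # both approximately 2.4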

Properties 4. Representational definition of binomial(n, p):

X ~ binomial(n, p)   ⇔   X = Σ_{i=1}^{n} Z_i,   Z_1, ..., Z_n iid binomial(1, p).

Proof. Begin with the mgf of Σ Z_i:

M_{ΣZ_i}(t) = E(e^(t Σ Z_i)) = E(e^(tZ_1) e^(tZ_2) ⋯ e^(tZ_n))
            = Π_{i=1}^{n} E(e^(tZ_i))   (Why?)
            = Π_{i=1}^{n} M_{Z_i}(t)   (Why?)
            = (p e^t + q)^n   (Why?)

This is the mgf of binomial(n, p), so X ~ binomial(n, p) ⇔ X = Σ Z_i with Z_1, ..., Z_n iid binomial(1, p). The last part is by the uniqueness of the mgf.

Properties 5. Mean & variance of binomial(n, p):

X ~ binomial(n, p)   ⇒   E(X) = np,   Var(X) = npq.

Proof.
• Easiest way: use the representational definition.
• Proof by mgf: try on your own, using M′(0) = E(X), M″(0) = E(X²).
• Proof by pmf:

E(X) = Σ_{x=0}^{n} x · n!/(x!(n − x)!) p^x (1 − p)^(n−x)
     = Σ_{x=1}^{n} n!/((x − 1)!(n − x)!) p^x (1 − p)^(n−x)   (let k = x − 1)
     = np Σ_{k=0}^{n−1} (n − 1)!/(k!(n − 1 − k)!) p^k (1 − p)^(n−1−k)
     = np   (Why?)

E{X(X − 1)} = Σ_{x=0}^{n} x(x − 1) · n!/(x!(n − x)!) p^x (1 − p)^(n−x)
            = Σ_{x=2}^{n} n!/((x − 2)!(n − x)!) p^x (1 − p)^(n−x)   (let k = x − 2)
            = n(n − 1)p² Σ_{k=0}^{n−2} (n − 2)!/(k!(n − 2 − k)!) p^k (1 − p)^(n−2−k)
            = n(n − 1)p²   (Why?)

σ² = E{X(X − 1)} + E(X) − {E(X)}² = n(n − 1)p² + np − (np)² = np(1 − p)

E 6. {WLLN: Weak Law of Large Numbers} (Binomial case)

Chebyshev's Inequality:   P(|X − µ| ≥ kσ) ≤ 1/k²

Chebyshev's inequality comes from the following:

σ² = E{(X − µ)²} = Σ_{x ∈ S} (x − µ)² f(x) ≥ Σ_{x ∈ A} (x − µ)² f(x),   where A = {x : |x − µ| ≥ kσ} for a positive constant k.

This leads to

σ² ≥ Σ_{x ∈ A} (x − µ)² f(x) ≥ Σ_{x ∈ A} k²σ² f(x) = k²σ² Σ_{x ∈ A} f(x) = k²σ² P(X ∈ A),

so P(|X − µ| ≥ kσ) = P(X ∈ A) ≤ 1/k².

Now, let X_1, X_2, ..., X_n be iid binomial(1, p), and let p̂ = (1/n) Σ X_i = the sample success ratio. Consider P(|p̂ − p| ≥ ε). Note that E(p̂) = p and σ²(p̂) = p(1 − p)/n, so by plugging into Chebyshev's inequality we get

P(|p̂ − p| ≥ ε) ≤ p(1 − p)/(nε²)   and   lim_{n→∞} P(|p̂ − p| ≥ ε) ≤ lim_{n→∞} p(1 − p)/(nε²) = 0

This means that the probability that the sample success ratio p̂ is more than ε away from p goes to zero as n goes to ∞, and we say p̂ converges in probability to p.
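
A small simulation sketch of the WLLN statement above; p = 0.3 and ε = 0.05 are arbitrary illustrative values, and the empirical deviation probability is compared with the Chebyshev bound p(1 − p)/(nε²).

set.seed(1)
p <- 0.3; eps <- 0.05
for (n in c(50, 500, 5000)) {
  phat  <- rbinom(10000, size = n, prob = p) / n     # 10000 replicates of p-hat
  emp   <- mean(abs(phat - p) >= eps)                # empirical P(|p-hat - p| >= eps)
  bound <- p * (1 - p) / (n * eps^2)                 # Chebyshev bound
  cat(sprintf("n = %5d  empirical = %.4f  bound = %.4f\n", n, emp, min(1, bound)))
}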

Definition 3. Cumulative distribution function (cdf), univariate case:

F(x) = P(X ≤ x)

Properties 6. A cdf F(x) satisfies
1. (monotonicity)   a ≤ b ⇒ F(a) ≤ F(b)
2. F(−∞) = lim_{x→−∞} F(x) = 0,   F(+∞) = lim_{x→+∞} F(x) = 1
3. (right continuity)   lim_{h→0+} F(x + h) = F(x)

A random variable X may not have a pdf or mgf, BUT it always has a cdf.

Definition 4. Relationship with the pdf (or pmf), when it exists:

F(x) = ∫_{−∞}^{x} f(t) dt   (continuous case)      F(x) = Σ_{t ≤ x} f(t)   (discrete case)

f(x) = F′(x) for x where f is continuous   (continuous case)      f(x) = F(x) − F(x−)   (discrete case)

E 7.
1. Find the cdf F(x) of
   f(x) = x/6 for x = 1, 2, 3;   0 otherwise
2. Find the cdf F(x) of
   f(x) = … for x > 0;   0 otherwise

Answer:

1.  F(x) = 0 for x < 1;   1/6 for 1 ≤ x < 2;   3/6 for 2 ≤ x < 3;   1 for x ≥ 3

2.  F(x) = ∫_0^x f(t) dt for x > 0;   0 otherwise

E 8. Let X ~ binomial(8, 0.65). Find (a) P(X ≤ 5), (b) P(X = 5), (c) P(X ≤ 5) − P(X ≤ 4).

Answer by R:
> pbinom(5, 8, 0.65)
[1] 0.5721863
> dbinom(5, 8, 0.65)
[1] 0.2785858
> pbinom(5, 8, 0.65) - pbinom(4, 8, 0.65)
[1] 0.2785858

Poisson distribution

Poisson approximation to the binomial distribution:

C(n, x) p^x q^(n−x) → e^(−λ) λ^x / x!,   as n → ∞ with np = λ (fixed)!

There are different ways to show this. Here is an easy way from the textbook. We begin with

P(X = x) = n!/(x!(n − x)!) (λ/n)^x (1 − λ/n)^(n−x)

Next, let n → ∞:

lim_{n→∞} P(X = x) = lim_{n→∞} n!/(x!(n − x)!) (λ/n)^x (1 − λ/n)^n (1 − λ/n)^(−x)
 = lim_{n→∞} [n(n − 1)⋯(n − x + 1)/n^x] (λ^x/x!) (1 − λ/n)^n (1 − λ/n)^(−x)
 = (λ^x/x!) e^(−λ)   (Why?)
 = e^(−λ) λ^x / x!   →   Poisson with parameter λ

Definition 5. X ~ Poisson(λ) ⇔ X has a Poisson distribution with parameter λ (> 0):

f(x) = e^(−λ) λ^x / x!,   x = 0, 1, 2, ...
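
A short R illustration of the approximation just derived; n = 1000 and p = 0.003 are arbitrary illustrative values with λ = np = 3.

n <- 1000; p <- 0.003; lambda <- n * p
x <- 0:10
round(rbind(binomial = dbinom(x, n, p),
            poisson  = dpois(x, lambda)), 5)    # the two rows agree closely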

Properties 7. X ~ Poisson(λ) ⇒ M_X(t) = exp{λ(e^t − 1)}.

Proof.

M_X(t) = E(e^(tX)) = Σ_{x=0}^{∞} e^(tx) e^(−λ) λ^x / x! = e^(−λ) Σ_{x=0}^{∞} (λe^t)^x / x! = e^(−λ) e^(λe^t) = e^{λ(e^t − 1)}

Properties 8. Mean & variance of Poisson(λ):   X ~ Poisson(λ) ⇒ E(X) = λ, Var(X) = λ.

Properties 9. (Reproductive property) If X_1, ..., X_k are independent and X_i ~ Poisson(λ_i), i = 1, ..., k, then Σ X_i ~ Poisson(Σ λ_i).

Proof. Begin with the mgf of Σ X_i:

M_{ΣX_i}(t) = E(e^(t Σ X_i)) = E(e^(tX_1) e^(tX_2) ⋯ e^(tX_k)) = Π_i E(e^(tX_i))   (Why?)
            = Π_i M_{X_i}(t)   (Why?)
            = Π_i e^{λ_i(e^t − 1)} = e^{(Σ λ_i)(e^t − 1)}   (mgf of Poisson(Σ λ_i))

so Σ X_i ~ Poisson(Σ λ_i) by the uniqueness of the mgf.

E 9. Let X ~ Poisson(λ). Find (a) P(X ≤ a), (b) P(X = a), (c) P(X ≤ a) − P(X ≤ a − 1).

Answer by R (as in E 8):
> ppois(a, lambda)
> dpois(a, lambda)
> ppois(a, lambda) - ppois(a - 1, lambda)
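
A simulation sketch of Properties 9; the rates λ_1 = 2 and λ_2 = 5 are arbitrary illustrative values, and the empirical pmf of X_1 + X_2 is compared with the Poisson(7) pmf.

set.seed(2)
x1 <- rpois(100000, 2); x2 <- rpois(100000, 5)
s  <- x1 + x2
c(mean = mean(s), var = var(s))                         # both near lambda1 + lambda2 = 7
emp <- as.numeric(table(factor(s, levels = 0:12))) / length(s)
round(rbind(empirical = emp, poisson = dpois(0:12, 7)), 4)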

Geometric distribution

Definition 6. Y ~ Geometric(p) ⇔ f(y) = p q^y, y = 0, 1, 2, ...; q = 1 − p.

With X_1, X_2, ... iid binomial(1, p), Y = # of failures before the first success.

Properties 10. Y ~ Geometric(p) ⇒

M_Y(t) = p(1 − qe^t)^(−1), 1 − qe^t > 0;   E(Y) = q/p,   Var(Y) = q/p².

Proof.

M_Y(t) = E(e^(tY)) = Σ_{y=0}^{∞} e^(ty) p q^y = p Σ_{y=0}^{∞} (qe^t)^y = p(1 − qe^t)^(−1),   1 − qe^t > 0

One note: The textbook (page 64) uses a slightly different definition. There, X = the trial number on which the 1st success occurs, and it is related by Y = X − 1 (x = 1, 2, 3, ...). According to this definition, we have

f(x) = p q^(x−1)   (x = 1, 2, ...);   E(X) = 1/p,   Var(X) = q/p²;   M_X(t) = p e^t (1 − qe^t)^(−1)

Negative Binomial distribution

Definition 7. Y ~ Negative Binomial(r, p) ⇔

f(y) = C(y + r − 1, r − 1) p^r q^y,   y = 0, 1, 2, ...;   q = 1 − p

With X_1, X_2, ... iid binomial(1, p), Y = # of failures before the rth success (r ≥ 1).

Properties 11. Y ~ NB(r, p) ⇒

M_Y(t) = p^r (1 − qe^t)^(−r), 1 − qe^t > 0;   E(Y) = rq/p,   Var(Y) = rq/p²;

Y = Σ_{i=1}^{r} Z_i,   Z_1, ..., Z_r iid geometric(p).

Proof. Note first that Σ_{y=0}^{∞} C(y + r − 1, r − 1) q^y = (1 − q)^(−r)   (Why?). This is known as the negative binomial expansion, and it is the Taylor series expansion of f(q) = (1 − q)^(−r), as shown below:

f(q) = f(0) + f′(0)q/1! + f″(0)q²/2! + ⋯ = 1 + rq + r(r + 1)q²/2! + ⋯ = (1 − q)^(−r)

Making use of this, we have

M_Y(t) = E(e^(tY)) = Σ_{y=0}^{∞} e^(ty) C(y + r − 1, r − 1) p^r q^y = p^r Σ_{y=0}^{∞} C(y + r − 1, r − 1) (qe^t)^y = p^r (1 − qe^t)^(−r),   1 − qe^t > 0
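
A simulation check of the mean and variance in Properties 11; note that R's rnbinom counts the number of failures before the rth success, matching Definition 7, and r = 4, p = 0.25 are arbitrary illustrative values.

set.seed(3)
r <- 4; p <- 0.25; q <- 1 - p
y <- rnbinom(100000, size = r, prob = p)
c(sim.mean = mean(y), theory.mean = r * q / p,       # 4 * 3 = 12
  sim.var  = var(y),  theory.var  = r * q / p^2)     # 4 * 12 = 48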

Proof. (mgf + representational definition)

M_{ΣZ_i}(t) = E(e^(t Σ Z_i)),   where Z_1, ..., Z_r are iid geometric(p)
            = Π_{i=1}^{r} E(e^(tZ_i))   (Why?)
            = Π_{i=1}^{r} M_{Z_i}(t)   (Why?)
            = {p(1 − qe^t)^(−1)}^r = p^r (1 − qe^t)^(−r) = the mgf of NB(r, p)

so Σ_{i=1}^{r} Z_i ~ NB(r, p) by the uniqueness of the mgf.

Another note: The textbook (page 64) uses a slightly different definition. There, X = the trial number on which the rth success occurs, and it is related by Y = X − r (x = r, r + 1, ...); the textbook calls Y a translated negative binomial random variable. According to this definition, we have

f(x) = C(x − 1, r − 1) p^r q^(x−r)   (x = r, r + 1, ...);   µ = E(X) = r/p,   σ² = rq/p²;   M_X(t) = (p e^t)^r (1 − qe^t)^(−r)

Hypergeometric distribution

Definition 8. X ~ Hypergeometric(N_1, N_2, n) ⇔

f(x) = C(N_1, x) C(N_2, n − x) / C(N, n),   x ≤ n (with x ≤ N_1 and n − x ≤ N_2),   where N = N_1 + N_2.

In a box there are N_1 red balls and N_2 blue balls; n balls are drawn without replacement and X = # of red balls drawn.

Properties 12. X ~ Hypergeom(N_1, N_2, n) ⇒

µ = np,   σ² = np(1 − p)(N − n)/(N − 1),   where p = N_1/N and N = N_1 + N_2.
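
A quick R check of Properties 12 computed directly from the hypergeometric pmf; the values N_1 = 6, N_2 = 14, n = 5 are arbitrary illustrative choices.

N1 <- 6; N2 <- 14; n <- 5
N <- N1 + N2; p <- N1 / N
x  <- 0:n
fx <- dhyper(x, N1, N2, n)                 # hypergeometric pmf
mu <- sum(x * fx)
s2 <- sum((x - mu)^2 * fx)
c(mu = mu, np = n * p, s2 = s2, formula = n * p * (1 - p) * (N - n) / (N - 1))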

MLE: Maximum Likelihood Estimate

Definition 9. Suppose X_1, X_2, ..., X_n are random samples from the same underlying distribution (i.e., iid) with pdf f(x_i; θ). Then Π_{i=1}^{n} f(x_i; θ) is called the jpdf (joint pdf), or the likelihood function. Furthermore, the value of the parameter θ̂ that maximizes the likelihood is called the mle (maximum likelihood estimator) of θ.

E 10. X_1, X_2, ..., X_n: iid from binomial(1, p). Find the mle of p.

Answer:

L(p) = Π_i f(x_i; p) = Π_i p^(x_i) (1 − p)^(1−x_i),   0 < p < 1, x_i = 0 or 1
     = p^(Σ x_i) (1 − p)^(n − Σ x_i)

ln L(p) = (Σ x_i) ln p + (n − Σ x_i) ln(1 − p)   (log-likelihood function)

Now, to find the p that maximizes the log-likelihood, differentiate with respect to p and set the derivative equal to zero:

∂/∂p ln L(p) = (Σ x_i)/p − (n − Σ x_i)/(1 − p) = 0   ⇒   (1 − p) Σ x_i − p(n − Σ x_i) = 0   ⇒   p̂ = Σ x_i / n

To make sure that this indeed makes the log-likelihood a maximum, we can do the second derivative test. Here we have

∂²/∂p² ln L(p) = −(Σ x_i)/p² − (n − Σ x_i)/(1 − p)²

This is always < 0 regardless of p̂, which means that our solution p̂ is indeed the mle.

One note: X̄ is called the maximum likelihood estimator, and x̄ is the maximum likelihood estimate.

E 11. X_1, X_2, ..., X_n: iid from Poisson(λ). Find the mle of λ.

Answer:

L(λ) = Π_i f(x_i; λ) = Π_i λ^(x_i) e^(−λ)/x_i! = λ^(Σ x_i) e^(−nλ)/(x_1! x_2! ⋯ x_n!)

ln L(λ) = −nλ + (Σ x_i) ln λ − ln(x_1! x_2! ⋯ x_n!)

∂/∂λ ln L(λ) = −n + (Σ x_i)/λ = 0   ⇒   λ̂ = Σ x_i / n = x̄

Now, the second derivative is ∂²/∂λ² ln L(λ) = −(Σ x_i)/λ², and this is always < 0 regardless of λ̂, which means λ̂ is the mle.
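
A numerical companion to E 11: maximize the Poisson log-likelihood in R and compare with x̄. The sample and the true rate 3.2 below are simulated purely for illustration.

set.seed(4)
x <- rpois(50, lambda = 3.2)                        # illustrative sample
loglik <- function(lam) sum(dpois(x, lam, log = TRUE))
opt <- optimize(loglik, interval = c(0.01, 20), maximum = TRUE)
c(numerical.mle = opt$maximum, xbar = mean(x))      # essentially identical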

E 12. X_1, X_2, ..., X_n: iid from the discrete uniform distribution on x = 1, 2, ..., θ. Find the mle of θ.

Answer:

L(θ) = Π_i f(x_i; θ) = Π_i (1/θ) = (1/θ)^n,   x_i = 1, 2, ..., θ

ln L(θ) = −n ln θ,   ∂/∂θ ln L(θ) = −n/θ < 0

so the likelihood is decreasing in θ, and θ should be taken as small as possible subject to θ ≥ every observation:

θ̂ = max(x_1, ..., x_n)

This agrees with our intuition, because in n observations of a discrete uniform random variable the largest value observed should be taken as the upper bound.

Expected Values of Linear Functions of Independent Random Variables

E 13. X_1, X_2: independent random samples from Poisson distributions with means λ_1, λ_2, respectively. Find (a) P(X_1 = x_1, X_2 = 4), (b) P(X_1 + X_2 = 2).

Answer: By independence, the joint pmf is the product of the marginal pmfs, so

P(X_1 = x_1, X_2 = 4) = (λ_1^(x_1) e^(−λ_1)/x_1!)(λ_2^4 e^(−λ_2)/4!)

P(X_1 + X_2 = 2) = P(0, 2) + P(1, 1) + P(2, 0)
 = (λ_1^0 e^(−λ_1)/0!)(λ_2² e^(−λ_2)/2!) + (λ_1 e^(−λ_1)/1!)(λ_2 e^(−λ_2)/1!) + (λ_1² e^(−λ_1)/2!)(λ_2^0 e^(−λ_2)/0!)

Let u(X_1, X_2) be a new random variable created as a function of two independent random variables X_1 and X_2, where X_1 and X_2 have pdfs f_1(x_1), f_2(x_2), respectively. Then the expected value of u(X_1, X_2) can be found by

E{u(X_1, X_2)} = Σ_{x_1} Σ_{x_2} u(x_1, x_2) f(x_1, x_2) = Σ_{x_1} Σ_{x_2} u(x_1, x_2) f_1(x_1) f_2(x_2)

The last part, where the joint pdf is written as a product of the marginal pdfs, is due to the independence of X_1 and X_2.

Now, consider Y = a_1 X_1 + a_2 X_2, where a_1, a_2 are constants. We have

E(Y) = µ_Y = E(a_1 X_1 + a_2 X_2) = a_1 E(X_1) + a_2 E(X_2) = a_1 µ_1 + a_2 µ_2

Var(Y) = σ_Y² = E[{a_1 X_1 + a_2 X_2 − a_1 µ_1 − a_2 µ_2}²] = E[{a_1(X_1 − µ_1) + a_2(X_2 − µ_2)}²]
       = a_1² E{(X_1 − µ_1)²} + a_2² E{(X_2 − µ_2)²} + 2a_1 a_2 E{(X_1 − µ_1)(X_2 − µ_2)}
       = a_1² Var(X_1) + a_2² Var(X_2) + 2a_1 a_2 E(X_1 − µ_1) E(X_2 − µ_2)
       = a_1² σ_1² + a_2² σ_2²

(The cross term factors by independence, and E(X_i − µ_i) = 0.)
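
A simulation sketch of the two formulas just derived; the distributions (Poisson with means 2 and 5) and the constants a_1 = 3, a_2 = −2 are arbitrary illustrative choices.

set.seed(5)
a1 <- 3; a2 <- -2
x1 <- rpois(200000, 2); x2 <- rpois(200000, 5)
y  <- a1 * x1 + a2 * x2
c(sim.mean = mean(y), theory.mean = a1 * 2 + a2 * 5,       # 3*2 - 2*5 = -4
  sim.var  = var(y),  theory.var  = a1^2 * 2 + a2^2 * 5)   # 9*2 + 4*5 = 38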

In general, let Y = Σ_{i=1}^{n} a_i X_i, where the a_i are constants and the X_i are independent random samples with mean µ_i and variance σ_i²; then

E(Y) = µ_Y = E(Σ a_i X_i) = Σ a_i E(X_i) = Σ a_i µ_i

Var(Y) = σ_Y² = Var(Σ a_i X_i) = Σ a_i² Var(X_i) = Σ a_i² σ_i²

One note: In case X_1, ..., X_n are not independent, we have

σ_Y² = Σ_i a_i² σ_i² + 2 Σ_{i<j} a_i a_j σ_ij,   where σ_ij = Cov(X_i, X_j)

E 14. Let X_1 ~ binomial(n = 100, p = 1/2) and X_2 ~ binomial(n = 48, p = 1/4), and suppose they are independent. Find the expected value and the variance of Y = X_1 − X_2.

Answer:

E(Y) = E(X_1 − X_2) = E(X_1) − E(X_2) = 50 − 12 = 38

Var(Y) = Var(X_1 − X_2) = (1)² Var(X_1) + (−1)² Var(X_2) = 100(1/2)(1/2) + 48(1/4)(3/4) = 25 + 9 = 34

Definition 10. f(x_1, x_2) = the joint pdf of X_1 and X_2.

f_1(x_1) = the marginal pdf of X_1 = Σ_{x_2} f(x_1, x_2)   or   ∫ f(x_1, x_2) dx_2

f(x_2 | x_1) = f(x_1, x_2)/f_1(x_1) = the conditional pdf of X_2 given X_1 = x_1

E 15. Consider the following (discrete) joint pmf of X_1 and X_2.

                 x_2 = 1    x_2 = 2    marginal f_1(x_1)
  x_1 = 1          4/10       2/10          6/10
  x_1 = 2          3/10       1/10          4/10
  marginal         7/10       3/10            1

Find (a) f(1, 1), (b) f_1(1) f_2(1), (c) the pmf of Y = X_1 + X_2, (d) E(Y), and (e) E(X_1) + E(X_2).

Answer:

(a)  f(1, 1) = 4/10

(b)  f_1(1) f_2(1) = (6/10)(7/10) = 42/100 ≠ f(1, 1), so X_1 and X_2 are not independent

(c)  P(Y = 2) = 4/10,   P(Y = 3) = 5/10,   P(Y = 4) = 1/10

(d)  E(Y) = 2(4/10) + 3(5/10) + 4(1/10) = 27/10

(e)  E(X_1) + E(X_2) = {1(6/10) + 2(4/10)} + {1(7/10) + 2(3/10)} = 14/10 + 13/10 = 27/10 = E(Y)

Definition 11. Cov(X_1, X_2) = σ_12 = E{(X_1 − µ_1)(X_2 − µ_2)} = E(X_1 X_2) − µ_1 µ_2

ρ = Cor(X_1, X_2) = Cov(X_1, X_2)/(σ_1 σ_2) = σ_12/(σ_1 σ_2)

Definition 12. Conditional mean & conditional variance:

E(X_1 | x_2) = µ_{X_1|x_2} = ∫ x_1 f(x_1 | x_2) dx_1   or   Σ_{x_1} x_1 f(x_1 | x_2)

Var(X_1 | x_2) = σ²_{X_1|x_2} = E[{X_1 − E(X_1 | x_2)}² | x_2] = ∫ {x_1 − E(X_1 | x_2)}² f(x_1 | x_2) dx_1   or   Σ_{x_1} {x_1 − E(X_1 | x_2)}² f(x_1 | x_2)

The conditional variance can also be shown as Var(X_1 | x_2) = E(X_1² | x_2) − {E(X_1 | x_2)}².

E 16. Let X_1 and X_2 have the joint pmf

f(x_1, x_2) = (x_1 + x_2)/21,   x_1 = 1, 2;   x_2 = 1, 2, 3

Here is the probability table for your information.

                 x_2 = 1    x_2 = 2    x_2 = 3    marginal f_1(x_1)
  x_1 = 1          2/21       3/21       4/21          9/21
  x_1 = 2          3/21       4/21       5/21         12/21
  marginal         5/21       7/21       9/21            1

Find (a) the marginal pmfs f_1(x_1), f_2(x_2), (b) the conditional pmfs f(x_1 | x_2), f(x_2 | x_1), (c) the conditional expectation E(X_1 | x_2), (d) the conditional variance Var(X_1 | x_2), and (e) P(X_1 = X_2), E(X_2 | x_1), Var(X_2 | x_1).

Answer: Shown below are the conditional probabilities for your information. Check your calculations against these values.

  f(x_2 | x_1):        x_2 = 1    x_2 = 2    x_2 = 3
  x_1 = 1                2/9        3/9        4/9       = f(x_2 | x_1 = 1)
  x_1 = 2                3/12       4/12       5/12      = f(x_2 | x_1 = 2)

  f(x_1 | x_2):        x_2 = 1    x_2 = 2    x_2 = 3
  x_1 = 1                2/5        3/7        4/9
  x_1 = 2                3/5        4/7        5/9

(a)  f_1(x_1) = Σ_{x_2} f(x_1, x_2) = Σ_{x_2=1}^{3} (x_1 + x_2)/21 = (3x_1 + 6)/21,   x_1 = 1, 2

     f_2(x_2) = Σ_{x_1} f(x_1, x_2) = Σ_{x_1=1}^{2} (x_1 + x_2)/21 = (2x_2 + 3)/21,   x_2 = 1, 2, 3

(b)  f(x_1 | x_2) = f(x_1, x_2)/f_2(x_2) = (x_1 + x_2)/(2x_2 + 3),   x_1 = 1, 2, when x_2 = 1, 2, 3

     f(x_2 | x_1) = f(x_1, x_2)/f_1(x_1) = (x_1 + x_2)/(3x_1 + 6),   x_2 = 1, 2, 3, when x_1 = 1, 2

(c)  E(X_1 | x_2) = Σ_{x_1} x_1 f(x_1 | x_2) = {(1 + x_2) + 2(2 + x_2)}/(2x_2 + 3) = (3x_2 + 5)/(2x_2 + 3),   x_2 = 1, 2, 3

(d)  E(X_1² | x_2) = Σ_{x_1} x_1² f(x_1 | x_2) = {(1 + x_2) + 4(2 + x_2)}/(2x_2 + 3) = (5x_2 + 9)/(2x_2 + 3), so

     Var(X_1 | x_2) = E(X_1² | x_2) − {E(X_1 | x_2)}² = (5x_2 + 9)/(2x_2 + 3) − {(3x_2 + 5)/(2x_2 + 3)}² = (x_2 + 1)(x_2 + 2)/(2x_2 + 3)²,   x_2 = 1, 2, 3

(e)  P(X_1 = X_2) = f(1, 1) + f(2, 2) = 2/21 + 4/21 = 6/21

     E(X_2 | x_1) = Σ_{x_2} x_2 f(x_2 | x_1) = {(x_1 + 1) + 2(x_1 + 2) + 3(x_1 + 3)}/(3x_1 + 6) = (6x_1 + 14)/(3x_1 + 6),   x_1 = 1, 2

     E(X_2² | x_1) = {(x_1 + 1) + 4(x_1 + 2) + 9(x_1 + 3)}/(3x_1 + 6) = (14x_1 + 36)/(3x_1 + 6), so

     Var(X_2 | x_1) = (14x_1 + 36)/(3x_1 + 6) − {(6x_1 + 14)/(3x_1 + 6)}²,   x_1 = 1, 2

One note: Check — is E{E(X_1 | X_2)} = E(X_1)?

E{E(X_1 | X_2)} = Σ_{x_2} E(X_1 | x_2) f_2(x_2) = Σ_{x_2=1}^{3} {(3x_2 + 5)/(2x_2 + 3)} · (2x_2 + 3)/21 = Σ_{x_2=1}^{3} (3x_2 + 5)/21 = (8 + 11 + 14)/21 = 33/21

E(X_1) = 1(9/21) + 2(12/21) = 33/21,   so indeed E{E(X_1 | X_2)} = E(X_1).

Definition 13. The conditional expectation of a function of the random variables X_1, X_2:

E{u(X_1, X_2) | x_2} = ∫ u(x_1, x_2) f(x_1 | x_2) dx_1   or   Σ_{x_1} u(x_1, x_2) f(x_1 | x_2)

Properties 13. Conditional expectation:
1. E(aX_1 + b | x_2) = aE(X_1 | x_2) + b
2. E(X_1 + X_2 | x_2) = E(X_1 | x_2) + E(X_2 | x_2)
3. X_1 ≥ 0 ⇒ E(X_1 | x_2) ≥ 0
4. E{E(X_1 | X_2)} = E(X_1)
5. E{g(X_2) X_1 | x_2} = g(x_2) E(X_1 | x_2)

Proof.

1. E(aX_1 + b | x_2) = ∫ (ax_1 + b) f(x_1 | x_2) dx_1 = a ∫ x_1 f(x_1 | x_2) dx_1 + b ∫ f(x_1 | x_2) dx_1 = aE(X_1 | x_2) + b   (property of the integral)

2. E(X_1 + X_2 | x_2) = E(X_1 | x_2) + E(X_2 | x_2)   (same argument as 1)

3. X_1 ≥ 0 ⇒ E(X_1 | x_2) = ∫ x_1 f(x_1 | x_2) dx_1 ≥ 0

4. E{E(X_1 | X_2)} = ∫ E(X_1 | x_2) f_2(x_2) dx_2   (Definition)
   = ∫ {∫ x_1 f(x_1 | x_2) dx_1} f_2(x_2) dx_2 = ∫∫ x_1 f(x_1 | x_2) f_2(x_2) dx_1 dx_2 = ∫∫ x_1 f(x_1, x_2) dx_1 dx_2   (def. of conditional pdf)
   = E(X_1)

5. E{g(X_2) X_1 | x_2} = ∫ g(x_2) x_1 f(x_1 | x_2) dx_1 = g(x_2) ∫ x_1 f(x_1 | x_2) dx_1 = g(x_2) E(X_1 | x_2)
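
A small R check of Property 4 using the joint pmf of E 16, f(x_1, x_2) = (x_1 + x_2)/21, just to confirm the "One note" computation numerically.

x1 <- 1:2; x2 <- 1:3
f  <- outer(x1, x2, "+") / 21                 # joint pmf: rows index x1, columns index x2
f2 <- colSums(f)                              # marginal of X2: 5/21, 7/21, 9/21
cond.mean <- colSums(x1 * f) / f2             # E(X1 | x2) for x2 = 1, 2, 3
c(EE = sum(cond.mean * f2), EX1 = sum(x1 * rowSums(f)))   # both equal 33/21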
