1.6 Families of Distributions
Figure 1.1: Density function and cumulative distribution function of the N(4.5, 2) distribution.

Example (Normal distribution N(\mu, \sigma^2)). The density function is given by

    f_X(x) = \frac{1}{\sigma\sqrt{2\pi}} e^{-\frac{(x-\mu)^2}{2\sigma^2}}.    (1.8)

There are two parameters, which tell us about the position and the shape of the density curve. There is an extensive theory of statistical analysis for data which are realizations of normally distributed random variables. This distribution is the most common in applications, but sometimes it is not feasible to assume that what we observe can indeed be a sample from such a population. In Example 1.6 we observe the time to a specific response to a drug candidate. Such a variable can only take nonnegative values, while the normal r.v.'s domain is \mathbb{R}. A lognormal distribution is often used in such cases: X has a lognormal distribution if \log X is normally distributed.

Exercise 1.6. Recall the following discrete distributions:
1. Uniform U(n);
2. Bernoulli(p);
3. Geom(p);
4. Hypergeometric(n, M, N);
5. Poisson(\lambda).

Exercise 1.7. Recall the following continuous distributions:
1. Uniform U(a, b);
2. Exp(\lambda);
3. Gamma(\alpha, \lambda);
4. Chi-squared with \nu degrees of freedom, \chi^2_\nu.

Note that all these distributions depend on some parameters, like p, \lambda, \mu, \sigma or others. These values are usually unknown, so their estimation is one of the important problems in statistical analyses. The parameters determine some characteristics of the shape of the pdf/pmf of a random variable: location, spread, skewness etc. We denote by P_\theta the distribution function of a r.v. X depending on the parameter \theta (a scalar or an s-dimensional vector).

Definition. A distribution family is a set P = \{P_\theta : \theta \in \Theta \subset \mathbb{R}^s\} defined on a common measurable space (\Omega, \mathcal{A}).

Example. Assume that in the efficacy Example 1.9 we have 3 mice. Then the sample space \Omega consists of triples of efficacious or non-efficacious responses,

    \Omega = \{(\omega_1, \omega_1, \omega_1), (\omega_1, \omega_1, \omega_2), \ldots, (\omega_2, \omega_2, \omega_2)\}.

Take the \sigma-algebra \mathcal{A} as the power set of \Omega. The random variable defined as the number of successes in n = 3 trials,

    X : \Omega \to \{0, 1, 2, 3\},

has a Binomial distribution with probability of success p. Here p is the parameter which, together with the distribution function, defines the family on the common measurable space (\Omega, \mathcal{A}). For p = 1/2 we have the symmetric mass function
    x          0      1      2      3
    P(X = x)   1/8    3/8    3/8    1/8

while for p = 3/4 the asymmetric mass function

    x          0      1      2      3
    P(X = x)   1/64   9/64   27/64  27/64

Both functions belong to the same family of binomial distributions with n = 3.

1.7 Expected Values

Definition. The expected value of a function g(X) is defined by

    E(g(X)) = \int_{-\infty}^{\infty} g(x) f(x) \, dx    for a continuous r.v.,

    E(g(X)) = \sum_{j} g(x_j) p(x_j)    for a discrete r.v.,

where g is any function such that E|g(X)| < \infty. Two important special cases of g are:

Mean: E(X), also denoted by EX, when g(X) = X;
Variance: E(X - EX)^2, when g(X) = (X - EX)^2.

The following relation is very useful when calculating the variance:

    E(X - EX)^2 = EX^2 - (EX)^2.    (1.9)

Example. Let X be a random variable with density

    f(x) = \frac{1}{2} \sin x  for x \in [0, \pi],  and  f(x) = 0  otherwise.

Then the expectation and variance are as follows.
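The two mass functions above can be reproduced with a short computation. This is an illustrative sketch, not part of the original text:

```python
import math

# Sketch: pmf of Bin(3, p) for the two parameter values in the example above.
def bin3_pmf(p):
    return [math.comb(3, k) * p**k * (1 - p)**(3 - k) for k in range(4)]

symmetric = bin3_pmf(1/2)   # p = 1/2: symmetric mass function
asymmetric = bin3_pmf(3/4)  # p = 3/4: asymmetric mass function

assert all(math.isclose(a, b) for a, b in zip(symmetric, [1/8, 3/8, 3/8, 1/8]))
assert all(math.isclose(a, b) for a, b in zip(asymmetric, [1/64, 9/64, 27/64, 27/64]))
# each pmf sums to 1
assert math.isclose(sum(symmetric), 1.0) and math.isclose(sum(asymmetric), 1.0)
```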
Expectation:

    EX = \frac{1}{2} \int_0^\pi x \sin x \, dx = \frac{\pi}{2}.

Variance:

    var(X) = EX^2 - (EX)^2 = \frac{1}{2} \int_0^\pi x^2 \sin x \, dx - \left(\frac{\pi}{2}\right)^2 = \frac{\pi^2}{4} - 2.

Example (Geom(p), the number of independent Bernoulli trials until the first success). The support set and the pmf are, respectively,

    X = \{1, 2, \ldots\}  and  P(X = x) = p(1-p)^{x-1} = p q^{x-1},  x \in X,

where p \in [0, 1] is the probability of success and q = 1 - p. Then

    E(X) = \frac{1}{p},    var(X) = \frac{q}{p^2}.

Here we will prove the formula for the expectation. By the definition of the expectation we have

    E(X) = \sum_{x=1}^{\infty} x p q^{x-1} = p \sum_{x=1}^{\infty} x q^{x-1}.

Note that for x \geq 1 the following equality holds:

    \frac{d}{dq}(q^x) = x q^{x-1}.

Hence,

    E(X) = p \sum_{x=1}^{\infty} x q^{x-1} = p \sum_{x=1}^{\infty} \frac{d}{dq}(q^x) = p \frac{d}{dq}\left(\sum_{x=1}^{\infty} q^x\right).

The latter sum is that of a geometric sequence and is equal to q/(1-q). Therefore,

    E(X) = p \frac{d}{dq}\left(\frac{q}{1-q}\right) = p \frac{1}{(1-q)^2} = \frac{1}{p}.

A similar method can be used to show that var(X) = q/p^2 (the second derivative with respect to q of q^x can be applied for this).
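Both worked examples can be verified numerically. The sketch below (illustrative; the step size and series truncation point are arbitrary choices, not from the text) checks the sine-density moments with a midpoint Riemann sum and the geometric mean and variance by truncating the series:

```python
import math

# Check EX = pi/2 and var(X) = pi^2/4 - 2 for f(x) = (1/2) sin x on [0, pi]
# via a midpoint Riemann sum.
n = 200_000
dx = math.pi / n
xs = [(i + 0.5) * dx for i in range(n)]
ex = sum(x * 0.5 * math.sin(x) * dx for x in xs)
ex2 = sum(x**2 * 0.5 * math.sin(x) * dx for x in xs)
assert abs(ex - math.pi / 2) < 1e-6
assert abs(ex2 - ex**2 - (math.pi**2 / 4 - 2)) < 1e-6

# Check E(X) = 1/p and var(X) = q/p^2 for Geom(p) by truncating the series;
# terms decay geometrically, so 1000 terms are plenty for p = 0.3.
p = 0.3
q = 1 - p
mean = sum(x * p * q**(x - 1) for x in range(1, 1000))
second = sum(x**2 * p * q**(x - 1) for x in range(1, 1000))
assert abs(mean - 1 / p) < 1e-9
assert abs(second - mean**2 - q / p**2) < 1e-9
```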
The following useful properties of the expectation follow from properties of integration (summation).

Theorem 1.5. Let X be a random variable and let a, b and c be constants. Then for any functions g(x) and h(x) whose expectations exist we have:

(a) E[a g(X) + b h(X) + c] = a E[g(X)] + b E[h(X)] + c;
(b) If g(x) \leq h(x) for all x, then E(g(X)) \leq E(h(X));
(c) If g(x) \geq 0 for all x, then E(g(X)) \geq 0;
(d) If a \leq g(x) \leq b for all x, then a \leq E(g(X)) \leq b.

Exercise 1.8. Show that E(X - b)^2 is minimized by b = E(X).

The variance of a random variable, together with the mean, are the most important parameters used in the theory of statistics. The following theorem is a consequence of the properties of the expectation.

Theorem 1.6. If X is a random variable with a finite variance, then for any constants a and b,

    var(aX + b) = a^2 var X.

Exercise 1.9. Prove Theorem 1.6.

1.8 Moments and Moment Generating Functions

Definition. The nth moment (n \in \mathbb{N}) of a random variable X is defined as

    \mu_n' = E X^n.

The nth central moment of X is defined as

    \mu_n = E(X - \mu)^n,

where \mu = \mu_1' = EX.
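As a quick numeric sanity check of Exercise 1.8 and Theorem 1.6, here is a sketch on an arbitrary small sample (the data values are illustrative, not from the text):

```python
import math
import statistics

data = [1.0, 2.0, 4.0, 7.0]  # arbitrary illustrative sample
mean = statistics.fmean(data)

# Exercise 1.8: E(X - b)^2, as a function of b, is smallest at b = E(X).
msq = lambda b: statistics.fmean([(x - b) ** 2 for x in data])
assert all(msq(mean) <= msq(b) for b in [-2.0, 0.0, 1.0, 3.0, 5.0, 10.0])

# Theorem 1.6: var(aX + b) = a^2 var(X); the shift b drops out.
a, b = 3.0, 5.0
shifted = [a * x + b for x in data]
assert math.isclose(statistics.pvariance(shifted), a**2 * statistics.pvariance(data))
```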
Note that the second central moment is the variance of a random variable X, usually denoted by \sigma^2. Moments give an indication of the shape of the distribution of a random variable. Skewness and kurtosis are measured by the following functions of the third and fourth central moments, respectively:

the coefficient of skewness is given by

    \gamma_1 = \frac{E(X - \mu)^3}{\sigma^3} = \frac{\mu_3}{\mu_2^{3/2}};

the coefficient of kurtosis is given by

    \gamma_2 = \frac{E(X - \mu)^4}{\sigma^4} - 3 = \frac{\mu_4}{\mu_2^2} - 3.

Moments can be calculated from the definition or by using the so-called moment generating function.

Definition. The moment generating function (mgf) of a random variable X is a function M_X : \mathbb{R} \to [0, \infty) given by

    M_X(t) = E e^{tX},

provided that the expectation exists for t in some neighborhood of zero. More explicitly, the mgf of X can be written as

    M_X(t) = \int_{-\infty}^{\infty} e^{tx} f_X(x) \, dx,    if X is continuous,

    M_X(t) = \sum_{x \in X} e^{tx} P(X = x),    if X is discrete.

The method to generate moments is given in the following theorem.

Theorem 1.7. If X has mgf M_X(t), then

    E(X^n) = \frac{d^n}{dt^n} M_X(t) \Big|_{t=0}.

That is, the nth moment is equal to the nth derivative of the mgf evaluated at t = 0.
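Theorem 1.7 can be illustrated numerically. The mgf of Bernoulli(p) is M(t) = (1 - p) + p e^t, which follows directly from the definition of the mgf for a discrete r.v., and central finite differences at t = 0 (the step size is an arbitrary choice) recover the first two moments:

```python
import math

p = 0.3                                   # illustrative parameter choice
M = lambda t: (1 - p) + p * math.exp(t)   # mgf of Bernoulli(p), from the definition
h = 1e-5                                  # finite-difference step

EX  = (M(h) - M(-h)) / (2 * h)            # approximates M'(0)  = E(X)
EX2 = (M(h) - 2 * M(0) + M(-h)) / h**2    # approximates M''(0) = E(X^2)

# For Bernoulli, X^2 = X, so both moments equal p.
assert abs(EX - p) < 1e-8
assert abs(EX2 - p) < 1e-4
```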
Proof. Assuming that we can differentiate under the integral sign, we may write

    \frac{d}{dt} M_X(t) = \frac{d}{dt} \int e^{tx} f_X(x) \, dx = \int \left(\frac{d}{dt} e^{tx}\right) f_X(x) \, dx = \int x e^{tx} f_X(x) \, dx = E(X e^{tX}).

Hence, evaluating the last expression at zero, we obtain

    \frac{d}{dt} M_X(t) \Big|_{t=0} = E(X e^{tX}) \Big|_{t=0} = E(X).

For n = 2 we get

    \frac{d^2}{dt^2} M_X(t) \Big|_{t=0} = E(X^2 e^{tX}) \Big|_{t=0} = E(X^2).

Analogously, it can be shown that for any n \in \mathbb{N} we can write

    \frac{d^n}{dt^n} M_X(t) \Big|_{t=0} = E(X^n e^{tX}) \Big|_{t=0} = E(X^n).

Example. Find the mgf of X \sim Exp(\lambda) and use the results of Theorem 1.7 to obtain the mean and variance of X.

By definition the mgf can be written as

    M_X(t) = E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx} f_X(x) \, dx.

For the exponential distribution we have

    f_X(x) = \lambda e^{-\lambda x} I_{(0,\infty)}(x),

where \lambda \in \mathbb{R}_+. Hence,

    M_X(t) = \int_0^{\infty} e^{tx} \lambda e^{-\lambda x} \, dx = \lambda \int_0^{\infty} e^{(t - \lambda)x} \, dx = \frac{\lambda}{\lambda - t},

provided that t < \lambda.
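The closed form \lambda/(\lambda - t) can be cross-checked against the defining integral with a crude midpoint Riemann sum (the truncation point and step size below are arbitrary choices for this sketch):

```python
import math

lam, t = 2.0, 0.5          # illustrative values with t < lam
n, upper = 400_000, 40.0   # integrand decays like e^{-(lam-t)x}, so [0, 40] suffices
dx = upper / n

# midpoint Riemann sum of e^{tx} * lam * e^{-lam x} over [0, upper]
mgf_numeric = sum(lam * math.exp((t - lam) * ((i + 0.5) * dx)) * dx for i in range(n))

# closed form: lam / (lam - t) = 2 / 1.5 = 4/3
assert abs(mgf_numeric - lam / (lam - t)) < 1e-6
```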
Now, using Theorem 1.7, we obtain the first and the second moments, respectively:

    E(X) = M_X'(0) = \frac{\lambda}{(\lambda - t)^2} \Big|_{t=0} = \frac{1}{\lambda},

    E(X^2) = M_X^{(2)}(0) = \frac{2\lambda}{(\lambda - t)^3} \Big|_{t=0} = \frac{2}{\lambda^2}.

Hence, the variance of X is

    var(X) = E(X^2) - [E(X)]^2 = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}.

Exercise 1.10. Calculate the mgf for the Binomial and Poisson distributions.

Moment generating functions provide methods for comparing distributions or finding their limiting forms. The following two theorems give us the tools.

Theorem 1.8. Let F_X(x) and F_Y(y) be two cdfs all of whose moments exist. If the mgfs of X and Y exist and are equal, i.e., M_X(t) = M_Y(t) for all t in some neighborhood of zero, then F_X(u) = F_Y(u) for all u.

Theorem 1.9. Suppose that X_1, X_2, \ldots is a sequence of random variables, each with mgf M_{X_i}(t). Furthermore, suppose that

    \lim_{i \to \infty} M_{X_i}(t) = M_X(t)    for all t in a neighborhood of zero,

and that M_X(t) is an mgf. Then there is a cdf F_X whose moments are determined by M_X(t) and, for all x where F_X(x) is continuous, we have

    \lim_{i \to \infty} F_{X_i}(x) = F_X(x).

That is, convergence of mgfs implies convergence of cdfs.
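A Monte Carlo sketch (the seed and sample size are arbitrary choices) agrees with E(X) = 1/\lambda and var(X) = 1/\lambda^2 for the exponential distribution:

```python
import random

random.seed(1)   # fixed seed so the run is reproducible
lam = 2.0
n = 200_000
samples = [random.expovariate(lam) for _ in range(n)]

m = sum(samples) / n                       # sample mean
v = sum((x - m) ** 2 for x in samples) / n # sample variance

assert abs(m - 1 / lam) < 0.01       # E(X)   = 1/lam   = 0.5
assert abs(v - 1 / lam**2) < 0.01    # var(X) = 1/lam^2 = 0.25
```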
Example. We know that the Binomial distribution can be approximated by a Poisson distribution when p is small and n is large. Using the above theorem we can confirm this fact. The mgfs of X_n \sim Bin(n, p) and of Y \sim Poisson(\lambda) are, respectively,

    M_{X_n}(t) = [p e^t + (1 - p)]^n,    M_Y(t) = e^{\lambda(e^t - 1)}.

We will show that the mgf of X_n tends to the mgf of Y, where \lambda = np. We will need the following useful result.

Lemma 1.1. Let a_1, a_2, \ldots be a sequence of numbers converging to a, that is, \lim_{n \to \infty} a_n = a. Then

    \lim_{n \to \infty} \left(1 + \frac{a_n}{n}\right)^n = e^a.

Now we can write

    M_{X_n}(t) = \left(p e^t + (1 - p)\right)^n = \left(1 + \frac{np(e^t - 1)}{n}\right)^n \xrightarrow{n \to \infty} e^{\lambda(e^t - 1)} = M_Y(t).

Hence, by Theorem 1.9, the Binomial distribution converges to the Poisson distribution.
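The limiting result can also be seen directly on the probability mass functions: for large n with p = \lambda/n, the Bin(n, p) pmf is pointwise close to the Poisson(\lambda) pmf. A small sketch (the parameter values and tolerance are illustrative):

```python
import math

lam = 3.0
n = 100_000
p = lam / n   # so that np = lam

def binom_pmf(k):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k):
    return math.exp(-lam) * lam**k / math.factorial(k)

# pointwise agreement for the first few values of k
for k in range(10):
    assert abs(binom_pmf(k) - poisson_pmf(k)) < 1e-4
```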
More informationChing-Han Hsu, BMES, National Tsing Hua University c 2015 by Ching-Han Hsu, Ph.D., BMIR Lab. = a + b 2. b a. x a b a = 12
Lecture 5 Continuous Random Variables BMIR Lecture Series in Probability and Statistics Ching-Han Hsu, BMES, National Tsing Hua University c 215 by Ching-Han Hsu, Ph.D., BMIR Lab 5.1 1 Uniform Distribution
More informationStatistika pro informatiku
Statistika pro informatiku prof. RNDr. Roman Kotecký DrSc., Dr. Rudolf Blažek, PhD Katedra teoretické informatiky FIT České vysoké učení technické v Praze MI-SPI, ZS 2011/12, Přednáška 5 Evropský sociální
More informationOrder Statistics. The order statistics of a set of random variables X 1, X 2,, X n are the same random variables arranged in increasing order.
Order Statistics The order statistics of a set of random variables 1, 2,, n are the same random variables arranged in increasing order. Denote by (1) = smallest of 1, 2,, n (2) = 2 nd smallest of 1, 2,,
More informationDistributions of Functions of Random Variables. 5.1 Functions of One Random Variable
Distributions of Functions of Random Variables 5.1 Functions of One Random Variable 5.2 Transformations of Two Random Variables 5.3 Several Random Variables 5.4 The Moment-Generating Function Technique
More information6 The normal distribution, the central limit theorem and random samples
6 The normal distribution, the central limit theorem and random samples 6.1 The normal distribution We mentioned the normal (or Gaussian) distribution in Chapter 4. It has density f X (x) = 1 σ 1 2π e
More informationChapter 4: Continuous Random Variables and Probability Distributions
Chapter 4: and Probability Distributions Walid Sharabati Purdue University February 14, 2014 Professor Sharabati (Purdue University) Spring 2014 (Slide 1 of 37) Chapter Overview Continuous random variables
More informationExercises and Answers to Chapter 1
Exercises and Answers to Chapter The continuous type of random variable X has the following density function: a x, if < x < a, f (x), otherwise. Answer the following questions. () Find a. () Obtain mean
More information3 Continuous Random Variables
Jinguo Lian Math437 Notes January 15, 016 3 Continuous Random Variables Remember that discrete random variables can take only a countable number of possible values. On the other hand, a continuous random
More information7 Random samples and sampling distributions
7 Random samples and sampling distributions 7.1 Introduction - random samples We will use the term experiment in a very general way to refer to some process, procedure or natural phenomena that produces
More informationContinuous Distributions
Chapter 3 Continuous Distributions 3.1 Continuous-Type Data In Chapter 2, we discuss random variables whose space S contains a countable number of outcomes (i.e. of discrete type). In Chapter 3, we study
More information