Statistics 3657 : Moment Approximations
- Conrad Mitchell
1 Preliminaries

Suppose that we have a r.v. X and that we wish to calculate the expectation of g(X) for some function g. Of course we could calculate it as E(g(X)) by the appropriate integral (continuous r.v.) or sum (discrete r.v.). This may not always be easy, for example when X itself is a function of several other r.v.'s. An approximation can be constructed using Taylor's series. In practice we are usually only interested in first or second order approximations. Below we assume that g has enough derivatives for the expressions to be valid.

A first order Taylor's series of a function g about x_0 is

    g(x) = g(x_0) + g'(x_0)(x - x_0) + R_1(x)

where R_1(x) is the remainder term.

A second order Taylor's series of a function g about x_0 is

    g(x) = g(x_0) + g'(x_0)(x - x_0) + (1/2) g''(x_0)(x - x_0)^2 + R_2(x)

where R_2(x) is the remainder term.

The remainder terms R_1 and R_2 are of the forms

    R_1(x) = (1/2) g''(x*)(x - x_0)^2
    R_2(x) = (1/3!) g'''(x*)(x - x_0)^3

where x* is some number between x_0 and x. More specifically, there is a number α ∈ [0, 1] such that x* = α x_0 + (1 - α) x. Typically it is not easy to calculate x*, but there are often nice upper bounds on the remainder terms. In this course we will not be concerned with these beyond noting that they exist; they are studied in further probability courses.
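The behaviour of the two remainder terms can be checked numerically. A minimal sketch (in Python rather than R; the choices g(x) = √x, expansion point x_0 = 2 and evaluation point x = 2.5 are purely illustrative) compares g to its first and second order Taylor polynomials:

```python
import math

# First- and second-order Taylor approximations of g(x) = sqrt(x) about x0.
# Here g'(x) = (1/2) x^(-1/2) and g''(x) = -(1/4) x^(-3/2).
def g(x):
    return math.sqrt(x)

def g1(x, x0):
    # polynomial part of the first order series: g(x0) + g'(x0)(x - x0)
    return g(x0) + 0.5 * x0 ** -0.5 * (x - x0)

def g2(x, x0):
    # second order adds the term (1/2) g''(x0)(x - x0)^2
    return g1(x, x0) + 0.5 * (-0.25 * x0 ** -1.5) * (x - x0) ** 2

x0, x = 2.0, 2.5
err1 = abs(g(x) - g1(x, x0))  # magnitude of the remainder R_1(x)
err2 = abs(g(x) - g2(x, x0))  # magnitude of the remainder R_2(x)
print(err1, err2)
```

As expected, the second order error is roughly an order of magnitude smaller than the first order error at this x.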
Now consider the Taylor approximation obtained by taking the polynomial part of the Taylor's series but ignoring the remainder. That is, we consider

    g_1(x) = g(x_0) + g'(x_0)(x - x_0)
    g_2(x) = g(x_0) + g'(x_0)(x - x_0) + (1/2) g''(x_0)(x - x_0)^2

The function g_1 is a polynomial of degree 1, and the function g_2 is a polynomial of degree 2. We also call g_1 a first order Taylor's approximation of the function g, and call g_2 a second order Taylor's approximation of the function g.

2 Moment Approximation

Let X be a random variable with enough finite moments so that the calculations below are well defined; the conditions will be given in more detail below as needed. Suppose that E(X) = μ is finite, and that σ^2 = Var(X) is also finite.

Let g_1 be the first order Taylor's approximation of g about x_0 = μ, that is

    g_1(x) = g(μ) + g'(μ)(x - μ).

Consider the random variable g_1(X). It is an approximation to the random variable g(X), so we can use moments of g_1(X) to approximate moments of g(X). Using this first order Taylor's approximation we can easily calculate

    E(g_1(X)) = E(g(μ) + g'(μ)(X - μ)) = g(μ) + g'(μ) E(X - μ) = g(μ)

and

    Var(g_1(X)) = Var(g(μ) + g'(μ)(X - μ)) = (g'(μ))^2 Var(X - μ) = (g'(μ))^2 σ^2.

The idea of the moment approximation is to approximate E(g(X)) and Var(g(X)) by moments of the Taylor functions used to approximate g.
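A quick Monte Carlo check of the first order formulas E(g(X)) ≈ g(μ) and Var(g(X)) ≈ (g'(μ))^2 σ^2 can be done as follows (a Python sketch rather than R; the choice g(x) = 1/x with X ~ Uniform(1, 3) is just an illustration):

```python
import random

random.seed(1)

# X ~ Uniform(1, 3): mu = 2, sigma^2 = (3 - 1)^2 / 12 = 1/3.
mu, var = 2.0, 1.0 / 3.0

# g(x) = 1/x, so g'(x) = -1/x^2.
# First order approximations: E g(X) ~ g(mu), Var g(X) ~ (g'(mu))^2 sigma^2.
approx_mean = 1.0 / mu                   # 0.5
approx_var = (1.0 / mu ** 2) ** 2 * var  # 1/48, about 0.0208

# Monte Carlo estimates of the exact moments for comparison.
ys = [1.0 / random.uniform(1, 3) for _ in range(200_000)]
mc_mean = sum(ys) / len(ys)
mc_var = sum((y - mc_mean) ** 2 for y in ys) / (len(ys) - 1)

print(approx_mean, mc_mean)  # 0.5 versus about 0.549 (= log(3)/2)
print(approx_var, mc_var)
```

Here the first order mean g(μ) = 0.5 is noticeably below the true value log(3)/2 ≈ 0.549, illustrating why a second order correction to the mean is often needed.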
Example: Suppose X ~ Unif(1, 3) and Y = √X. Find the first order Taylor approximation to the mean and variance of Y.

First,

    μ = E(X) = ∫_1^3 x (1/2) dx = 2

and Var(X) = 1/3.

Consider g(x) = √x as a mapping from [1, 3] to the reals, with g'(x) = (1/2) x^{-1/2}. Thus the first order Taylor's approximation about μ = 2 is

    g_1(x) = g(2) + (1/(2√2))(x - 2).

Then

    E(g_1(X)) = √2
    Var(g_1(X)) = (1/(2√2))^2 (1/3) = 1/24.

Thus we find the first order Taylor's approximation for the mean and variance of Y as

    E(Y) ≈ E(g_1(X)) = √2
    Var(Y) ≈ Var(g_1(X)) = 1/24.

If we approximate E(g(X)) by

    E(g_1(X)) = E(g(μ) + g'(μ)(X - μ)) = g(μ)

we do not have a good approximation. Thus we often use the second order Taylor series to approximate E(g(X)), that is

    E(g_2(X)) = E(g(μ) + g'(μ)(X - μ) + (1/2) g''(μ)(X - μ)^2) = g(μ) + (1/2) g''(μ) σ^2.

To use the second order approximation for the variance we obtain

    Var(g_2(X)) = Var(g(μ) + g'(μ)(X - μ) + (1/2) g''(μ)(X - μ)^2)
                = (g'(μ))^2 Var(X) + g'(μ) g''(μ) Cov(X - μ, (X - μ)^2) + (1/4) (g''(μ))^2 Var((X - μ)^2).
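The example's numbers can be verified directly. A short Python sketch compares the first and second order mean approximations against the exact moments of Y = √X, which follow from E(Y) = (1/2) ∫_1^3 √x dx and E(Y^2) = E(X) = 2:

```python
import math

# Exact moments of Y = sqrt(X) for X ~ Uniform(1, 3):
# E(Y) = (1/2) * integral_1^3 sqrt(x) dx = (3*sqrt(3) - 1)/3,
# Var(Y) = E(X) - E(Y)^2 = 2 - E(Y)^2.
exact_mean = (3 * math.sqrt(3) - 1) / 3
exact_var = 2 - exact_mean ** 2

mu, var = 2.0, 1.0 / 3.0
g1_mean = math.sqrt(mu)                    # first order mean: g(mu) = sqrt(2)
g_dd = -0.25 * mu ** -1.5                  # g''(mu) for g(x) = sqrt(x)
g2_mean = g1_mean + 0.5 * g_dd * var       # second order mean correction
g1_var = (0.5 / math.sqrt(mu)) ** 2 * var  # first order variance = 1/24

print(exact_mean, g1_mean, g2_mean)  # about 1.3987, 1.4142, 1.3995
print(exact_var, g1_var)             # about 0.0436, 0.0417
```

The second order mean 1.3995 is much closer to the exact 1.3987 than the first order value √2 ≈ 1.4142.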
Aside 1: The student should find the mean and variance of Y = √X. These are, exactly and rounded to 4 decimal places,

    E(Y) = (3√3 - 1)/3 ≈ 1.3987
    Var(Y) = (6√3 - 10)/9 ≈ 0.0436.

Aside 2: In general there is no need to carry more digits than required to get a meaningful answer. Here 3 or 4 decimal digits is reasonable, whereas 1 decimal digit is not, as it would give a variance of approximately 0.0, and hence is not reasonable.

Remark: Recall Var(Y) ≥ 0. Also Var(g_1(X)) ≥ 0 and Var(g_2(X)) ≥ 0. However it is not always true that E(g_2(X)^2) - (E(g_1(X)))^2 is greater than or equal to 0. Similarly it is also not guaranteed that E(g_1(X)^2) - (E(g_2(X)))^2 ≥ 0. Thus when approximating variances with this method we should in general not mix the two different degrees of approximation.
Functions of Several Variables and Moment Approximation

This idea of moment approximation also extends to functions of several variables. Here we only consider two variables.

Suppose (X, Y) are bivariate r.v.'s and Z = g(X, Y). We suppose that g has derivatives and hence Taylor approximations. For ease of writing we use the notation

    g_x(x, y) = ∂g(x, y)/∂x
    g_y(x, y) = ∂g(x, y)/∂y
    g_xx(x, y) = ∂^2 g(x, y)/∂x^2
    g_yy(x, y) = ∂^2 g(x, y)/∂y^2
    g_xy(x, y) = ∂^2 g(x, y)/∂x∂y.

First order Taylor's approximation about (x_0, y_0):

    g_1(x, y) = g(x_0, y_0) + g_x(x_0, y_0)(x - x_0) + g_y(x_0, y_0)(y - y_0).

Second order Taylor's approximation about (x_0, y_0):

    g_2(x, y) = g(x_0, y_0) + g_x(x_0, y_0)(x - x_0) + g_y(x_0, y_0)(y - y_0)
              + (1/2) g_xx(x_0, y_0)(x - x_0)^2 + g_xy(x_0, y_0)(x - x_0)(y - y_0)
              + (1/2) g_yy(x_0, y_0)(y - y_0)^2.

Again we approximate

    E(g(X, Y)) ≈ E(g_2(X, Y))
    Var(g(X, Y)) ≈ Var(g_1(X, Y))

where the Taylor's approximations are obtained about (x_0, y_0) = (μ_X, μ_Y).
Example: Suppose that X and Y are independent and we are interested in the ratio g(X, Y) = Y/X. Then

    g_1(X, Y) = μ_Y/μ_X - (μ_Y/μ_X^2)(X - μ_X) + (1/μ_X)(Y - μ_Y)

and

    g_2(X, Y) = μ_Y/μ_X - (μ_Y/μ_X^2)(X - μ_X) + (1/μ_X)(Y - μ_Y)
              + (1/2)(2 μ_Y/μ_X^3)(X - μ_X)^2 - (1/μ_X^2)(X - μ_X)(Y - μ_Y) + 0 · (Y - μ_Y)^2
              = μ_Y/μ_X - (μ_Y/μ_X^2)(X - μ_X) + (1/μ_X)(Y - μ_Y)
              + (μ_Y/μ_X^3)(X - μ_X)^2 - (1/μ_X^2)(X - μ_X)(Y - μ_Y).

Using the second order Taylor approximation, and independence of X and Y (so the cross term has expectation 0), we obtain

    E(g(X, Y)) ≈ E(g_2(X, Y)) = μ_Y/μ_X + (μ_Y/μ_X^3) σ_X^2.

Using the first order Taylor approximation we obtain

    Var(g(X, Y)) ≈ Var(g_1(X, Y))
                = Var(μ_Y/μ_X - (μ_Y/μ_X^2)(X - μ_X) + (1/μ_X)(Y - μ_Y))
                = (μ_Y^2/μ_X^4) σ_X^2 - 2 (μ_Y/μ_X^3) Cov(X, Y) + (1/μ_X^2) σ_Y^2
                = (μ_Y^2/μ_X^4) σ_X^2 + (1/μ_X^2) σ_Y^2.
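A Monte Carlo sanity check of these ratio formulas (a Python sketch; the particular choices X ~ Uniform(4, 6) and Y ~ Uniform(1, 3), independent, are illustrative only):

```python
import random

random.seed(7)

# X ~ Uniform(4, 6) and Y ~ Uniform(1, 3), independent.
mu_x, var_x = 5.0, 1.0 / 3.0
mu_y, var_y = 2.0, 1.0 / 3.0

# Second order mean and first order variance approximations for Y/X;
# independence makes the covariance terms vanish.
approx_mean = mu_y / mu_x + mu_y * var_x / mu_x ** 3
approx_var = mu_y ** 2 * var_x / mu_x ** 4 + var_y / mu_x ** 2

# Monte Carlo estimates of the exact moments of Y/X.
ratios = [random.uniform(1, 3) / random.uniform(4, 6) for _ in range(200_000)]
mc_mean = sum(ratios) / len(ratios)
mc_var = sum((r - mc_mean) ** 2 for r in ratios) / (len(ratios) - 1)

print(approx_mean, mc_mean)  # both about 0.405
print(approx_var, mc_var)    # both about 0.016
```

With μ_X well away from 0 the agreement is close; the approximation degrades badly when the denominator's mean is near 0.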
QQ Plots

Consider iid random variables X_1, X_2, ..., X_n from a continuous cdf F. Next consider the ordered data X_(1) ≤ X_(2) ≤ ... ≤ X_(n), that is the order statistics from these n r.v.'s. Notice that U_i = F(X_i) are iid Uniform(0, 1), and that U_(i) = F(X_(i)). Thus

    X_(i) = F^{-1}(U_(i)).

In Chapter 3.7 we found the distribution of X_(i); specifically it has the pdf, say f_i, given by

    f_i(x) = (n!/((i-1)!(n-i)!)) F(x)^{i-1} f(x) (1 - F(x))^{n-i}

where F and f are the cdf and pdf of the X_i. In the case of Uniform(0, 1) r.v.'s we then obtain the marginal pdf of U_(i), say g_i:

    g_i(x) = (n!/((i-1)!(n-i)!)) x^{i-1} (1 - x)^{n-i}   if 0 < x < 1
           = 0                                            otherwise.

Remark: This is a special case of a Beta distribution.

Remark: Since this is a probability density for any positive integer n and any 1 ≤ i ≤ n, we also have (replacing the integers n, i by m, k)

    ∫_0^1 x^{k-1} (1 - x)^{m-k} dx = (k-1)!(m-k)!/m!

Thus

    E(U_(i)) = ∫_0^1 x g_i(x) dx
             = ∫_0^1 x (n!/((i-1)!(n-i)!)) x^{i-1} (1 - x)^{n-i} dx
             = (n!/((i-1)!(n-i)!)) ∫_0^1 x^{(i+1)-1} (1 - x)^{(n+1)-(i+1)} dx
             = (n!/((i-1)!(n-i)!)) (i!(n-i)!/(n+1)!)
             = i/(n+1).
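The identity E(U_(i)) = i/(n+1) is easy to check by simulation; a minimal Python sketch (n = 9 and i = 3 are arbitrary illustrative choices):

```python
import random

random.seed(3)

# Estimate E(U_(i)) for the i-th order statistic of n Uniform(0,1) draws
# and compare with the exact value i/(n+1).
n, i, reps = 9, 3, 50_000
total = 0.0
for _ in range(reps):
    u = sorted(random.random() for _ in range(n))
    total += u[i - 1]  # i-th smallest, with 1-based i
sim_mean = total / reps

print(sim_mean, i / (n + 1))  # both about 0.3
```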
Thus to a first order Taylor's approximation we have

    E(X_(i)) = E(F^{-1}(U_(i))) ≈ F^{-1}(E(U_(i))) = F^{-1}(i/(n+1)).

Thus when we order the iid data X_1, X_2, ..., X_n and plot the pairs (F^{-1}(i/(n+1)), X_(i)), on average these points are of the form (F^{-1}(i/(n+1)), F^{-1}(i/(n+1))), that is they fall on a straight line of slope 1. This corresponds to plotting the theoretical quantiles on one axis and the empirical quantiles, or order statistics, on the other. This type of plot is called a quantile-quantile or QQ plot.

Normal QQ Plots

In some cases, namely location and scale families of distributions, it is possible to make QQ plots with respect to a particular member of the family. This is what is done for normal QQ plots.

First we find the relation between non-standard normal and standard normal quantiles. Recall that for a given number 0 < q < 1, the q-th quantile of a distribution with cdf F is defined as the solution, say x_q, of the equation F(x) = q. Let x_q be the q-th quantile for a N(μ, σ^2) distribution, and z_q the q-th quantile for a standard normal distribution. Then, letting X ~ N(μ, σ^2) and Z ~ N(0, 1),

    q = P(X ≤ x_q) = P((X - μ)/σ ≤ (x_q - μ)/σ) = P(Z ≤ (x_q - μ)/σ).

Therefore

    (x_q - μ)/σ = z_q

or equivalently

    x_q = F^{-1}(q) = σ z_q + μ

where F is the N(μ, σ^2) cdf. Let Φ be the N(0, 1) cdf. Plotting X_(i) against F^{-1}(i/(n+1)) will give points that follow approximately a line with intercept 0 and slope 1, whereas plotting X_(i) against Φ^{-1}(i/(n+1)) will give points that follow approximately a straight line of slope σ and intercept μ. In fact this is what is done in software implementations of normal QQ plots, such as qqnorm in the statistical programming language R.

Aside: R actually does something slightly different, using

    E(X_(i)) ≈ Φ^{-1}((i - 1/2)/n).
In R the qqnorm function plots the observed data x_(i) against the normal quantiles above. The student might wish to check this by making the corresponding plots in R.

Exponential QQ Plots

One may construct other QQ plots for continuous distributions. Consider the exponential distribution, with cdf F(x) = 1 - e^{-λx} if x > 0 and F(x) = 0 for x ≤ 0. Quantiles are easy to obtain: the q-th quantile x_q is the solution of q = F(x). Therefore

    q = 1 - e^{-λ x_q}

or equivalently

    x_q = -(1/λ) log(1 - q) = (1/λ) log(1/(1 - q)).

For data x_1, ..., x_n the exponential(λ) QQ plot would plot the pairs

    (-(1/λ) log(1 - i/(n+1)), x_(i)) = ((1/λ) log((n+1)/(n+1-i)), x_(i)),   i = 1, ..., n.

This plot of course would require knowing λ. Since the exponential(λ) quantiles are the standard exponential (exponential parameter 1) quantiles divided by λ, one may also plot

    (-log(1 - i/(n+1)), x_(i)) = (log((n+1)/(n+1-i)), x_(i)),   i = 1, ..., n.

Notice that if the X_i are iid exponential, no matter what the value of λ, this plot will be approximately a straight line. The departures from the straight line will depend on the sampling variability of the order statistics.

The student might wish to write an R, Matlab or other program to make these exponential QQ plots and then try it for some simulated exponential data. You will then see the variability is much more than for normal QQ plots, but the plot is useful nonetheless.
More informationDiscrete Mathematics and Probability Theory Fall 2013 Vazirani Note 16. A Brief Introduction to Continuous Probability
CS 7 Discrete Mathematics and Probability Theory Fall 213 Vazirani Note 16 A Brief Introduction to Continuous Probability Up to now we have focused exclusively on discrete probability spaces Ω, where the
More informationSTAT 512 sp 2018 Summary Sheet
STAT 5 sp 08 Summary Sheet Karl B. Gregory Spring 08. Transformations of a random variable Let X be a rv with support X and let g be a function mapping X to Y with inverse mapping g (A = {x X : g(x A}
More information(y 1, y 2 ) = 12 y3 1e y 1 y 2 /2, y 1 > 0, y 2 > 0 0, otherwise.
54 We are given the marginal pdfs of Y and Y You should note that Y gamma(4, Y exponential( E(Y = 4, V (Y = 4, E(Y =, and V (Y = 4 (a With U = Y Y, we have E(U = E(Y Y = E(Y E(Y = 4 = (b Because Y and
More informationLecture 1: August 28
36-705: Intermediate Statistics Fall 2017 Lecturer: Siva Balakrishnan Lecture 1: August 28 Our broad goal for the first few lectures is to try to understand the behaviour of sums of independent random
More informationThis exam contains 6 questions. The questions are of equal weight. Print your name at the top of this page in the upper right hand corner.
GROUND RULES: This exam contains 6 questions. The questions are of equal weight. Print your name at the top of this page in the upper right hand corner. This exam is closed book and closed notes. Show
More informationSection Properties of Rational Expressions
88 Section. - Properties of Rational Expressions Recall that a rational number is any number that can be written as the ratio of two integers where the integer in the denominator cannot be. Rational Numbers:
More informationMath 180A. Lecture 16 Friday May 7 th. Expectation. Recall the three main probability density functions so far (1) Uniform (2) Exponential.
Math 8A Lecture 6 Friday May 7 th Epectation Recall the three main probability density functions so far () Uniform () Eponential (3) Power Law e, ( ), Math 8A Lecture 6 Friday May 7 th Epectation Eample
More informationPreliminary Statistics. Lecture 3: Probability Models and Distributions
Preliminary Statistics Lecture 3: Probability Models and Distributions Rory Macqueen (rm43@soas.ac.uk), September 2015 Outline Revision of Lecture 2 Probability Density Functions Cumulative Distribution
More informationF X (x) = P [X x] = x f X (t)dt. 42 Lebesgue-a.e, to be exact 43 More specifically, if g = f Lebesgue-a.e., then g is also a pdf for X.
10.2 Properties of PDF and CDF for Continuous Random Variables 10.18. The pdf f X is determined only almost everywhere 42. That is, given a pdf f for a random variable X, if we construct a function g by
More informationStatistics 3858 : Maximum Likelihood Estimators
Statistics 3858 : Maximum Likelihood Estimators 1 Method of Maximum Likelihood In this method we construct the so called likelihood function, that is L(θ) = L(θ; X 1, X 2,..., X n ) = f n (X 1, X 2,...,
More informationPart 6: Multivariate Normal and Linear Models
Part 6: Multivariate Normal and Linear Models 1 Multiple measurements Up until now all of our statistical models have been univariate models models for a single measurement on each member of a sample of
More informationlim F n(x) = F(x) will not use either of these. In particular, I m keeping reserved for implies. ) Note:
APPM/MATH 4/5520, Fall 2013 Notes 9: Convergence in Distribution and the Central Limit Theorem Definition: Let {X n } be a sequence of random variables with cdfs F n (x) = P(X n x). Let X be a random variable
More informationECE302 Exam 2 Version A April 21, You must show ALL of your work for full credit. Please leave fractions as fractions, but simplify them, etc.
ECE32 Exam 2 Version A April 21, 214 1 Name: Solution Score: /1 This exam is closed-book. You must show ALL of your work for full credit. Please read the questions carefully. Please check your answers
More informationNorthwestern University Department of Electrical Engineering and Computer Science
Northwestern University Department of Electrical Engineering and Computer Science EECS 454: Modeling and Analysis of Communication Networks Spring 2008 Probability Review As discussed in Lecture 1, probability
More information1 Review of Probability and Distributions
Random variables. A numerically valued function X of an outcome ω from a sample space Ω X : Ω R : ω X(ω) is called a random variable (r.v.), and usually determined by an experiment. We conventionally denote
More information12 The Analysis of Residuals
B.Sc./Cert./M.Sc. Qualif. - Statistics: Theory and Practice 12 The Analysis of Residuals 12.1 Errors and residuals Recall that in the statistical model for the completely randomized one-way design, Y ij
More information