STA510. Solutions to exercises 3, fall 2012
Solutions-3, 3 October 2012; p. 1

STA510. Solutions to exercises 3, fall 2012

See R script file 'problem-set-3-h-12.r'.

Problem 1

[Figure: histograms of the simulated distributions (10000 replications) of resources, non-risk-weighted (y.non.risk) and risk-weighted (y.risk).]

c) [Figure: histogram of the simulated distribution of resources, y.c.] P(Y > 62) is estimated to 0.78 (10000 replications).

Simulation of Poisson processes

In problem 2, for example, we are asked to simulate a (homogeneous) Poisson process N(t) with intensity λ = 2. A recommended way to do this is to start with simulating the inter-arrival times T_1, T_2, T_3, .... The T_j's are independent Exp(λ) (E(T_j) = 1/λ; λ is called the rate of the process):

T <- rexp(n = n.interarr, rate = 2)

Then the time points at which N(t) jumps are found with:

S.t <- cumsum(T)

(S.t is assigned the numbers S_1 = T_1, S_2 = T_1 + T_2, S_3 = T_1 + T_2 + T_3, ..., i.e. the time points at which the process makes a one-unit jump; these are called the arrival times.)

Undervisning\STA510, Statistical modeling and simulation\exercises\1213-solutions-3.tex
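The inter-arrival recipe above (simulate Exp(λ) inter-arrival times, cumulate them into arrival times, count arrivals up to t) can also be sketched outside R. Below is a minimal Python illustration; the function names and parameter values (rate 2, 1000 inter-arrival times) are illustrative choices, not taken from the course script:

```python
import random

def simulate_poisson_arrivals(lam, n_interarr, seed=1):
    """Arrival times S_1 < S_2 < ... of a homogeneous Poisson process
    with rate lam, built from Exp(lam) inter-arrival times."""
    rng = random.Random(seed)
    arrivals, s = [], 0.0
    for _ in range(n_interarr):
        s += rng.expovariate(lam)  # T_j ~ Exp(lam); the cumulative sums are the S_j
        arrivals.append(s)
    return arrivals

def n_at(arrivals, t):
    """Value of N(t): the number of arrival times <= t."""
    return sum(1 for s in arrivals if s <= t)

s_t = simulate_poisson_arrivals(lam=2.0, n_interarr=1000)
print(n_at(s_t, 10.0))  # should be around E{N(10)} = 2 * 10 = 20
```

The cumulative sum plays the role of `cumsum(T)` in the R script, and `n_at` mirrors `min(which(S.t > t)) - 1`.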
The process N(t) is zero for t < t_1, where t_1 is the first time point in S.t, i.e. S.t[1]. Then the process is one for t_1 <= t < t_2, and so on. Therefore we can plot a simulated realization of N(t) by

Nt <- 0:length(S.t)
plot(x = c(0, S.t), y = Nt, type = "s")

(The type = "s" produces a step-function plot.) The value of N(t) (at time t) in the simulation is:

n <- min(which(S.t > t)) - 1

See R script file 'problem-set-3-h-12.r'.

Problem 2

Non-homogeneous Poisson processes

If N(t) is a non-homogeneous Poisson process with intensity λ(t) which we are going to simulate on the interval [0, t*], we have to find an upper limit for the intensity λ(t); find a value λ* such that λ(t) <= λ* for t <= t*. Simulate inter-arrival times T_1, T_2, T_3, ..., where the T_j's are independent Exp(λ*) (these are the inter-arrival times of a homogeneous process with rate λ*). The T_1, T_2, T_3, ... form the arrival times S_1 = T_1, S_2 = T_1 + T_2, S_3 = T_1 + T_2 + T_3, ... of the homogeneous process with rate λ*, and only a subset of the events at the times S_1, S_2, S_3, ... corresponds to the non-homogeneous process. At time S_j = t an event should be regarded as an event of the non-homogeneous process with probability λ(t)/λ*. Thus: collect arrival times S_j with probability λ(S_j)/λ*; disregard the other events. The resulting (sub)set of arrival times constitutes the arrival times of the non-homogeneous process.

The number of inter-arrival times

One question is how large n.interarr, the number of inter-arrival times, has to be. This depends on the time point t* at which we want to study the process, N(t*), and it depends on the rate λ. The expected number of arrivals at time t* is E{N(t*)} = λt*. But there is a considerable probability that the number exceeds λt*, and n.interarr should be larger than this. If λt* is not small (larger than 9 is sufficient), N(t*) is approximately normally distributed with mean λt* and standard deviation √(λt*).
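The thinning algorithm for a non-homogeneous process described above can be sketched as follows. This is a Python illustration, not the course's R code; the intensity λ(t) = 2t + 2 and the bound λ* = λ(2) are made-up example values here:

```python
import random

def thin_poisson(rate_fn, lam_star, t_star, seed=2):
    """Arrival times on [0, t_star] of a non-homogeneous Poisson process
    with intensity rate_fn, by thinning a rate-lam_star homogeneous process.
    Assumes rate_fn(t) <= lam_star for all t in [0, t_star]."""
    rng = random.Random(seed)
    kept, s = [], 0.0
    while True:
        s += rng.expovariate(lam_star)            # next candidate arrival time
        if s > t_star:
            return kept
        if rng.random() < rate_fn(s) / lam_star:  # keep with prob. rate_fn(s)/lam*
            kept.append(s)

# Illustrative: lambda(t) = 2t + 2 is increasing, so lambda* = lambda(2) = 6 on [0, 2].
arrivals = thin_poisson(lambda t: 2.0 * t + 2.0, lam_star=6.0, t_star=2.0)
print(len(arrivals))  # around E{N(2)} = integral of (2t + 2) over [0, 2] = 8
```

Each candidate arrival of the rate-λ* process is kept with probability λ(s)/λ*, exactly as in the recipe above.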
Thus the probability that N(t*) exceeds λt* + 3√(λt*), e.g., can be calculated approximately. This is: P(N(t*) > λt* + 3√(λt*)) ≈ P(Z > 3) = 0.00135,
where Z ~ N(0, 1). Therefore, if we choose

n.interarr <- ceiling(lambda * t.star + 3 * sqrt(lambda * t.star))

only in approximately 14 out of 10000 times would we simulate too few inter-arrival times. (Here t.star is t*.)

This method for computing the number of needed T_j's is not used in problems 2 and 3, but it is used in problem 3.21 from the book. It could of course just as well have been used in 2 and 3.

See R script file 'problem-set-3-h-12.r'.

Problem 3

[Figure: plot of realizations of N1(t) and N2(t).]

Problem 4

For j = 1, ..., 10 let I_j = 1 with probability p_j and I_j = 0 with probability 1 − p_j, where I_j = 1 means that component j works. Then

i) P(I_1 = 1 ∩ I_2 = 1) = P(I_1 = 1)P(I_2 = 1) = p_1 p_2 = 0.7 · 0.95 = 0.665

ii) P(I_1 = 1 ∪ I_2 = 1) = P(I_1 = 1) + P(I_2 = 1) − P(I_1 = 1)P(I_2 = 1) = p_1 + p_2 − p_1 p_2

iii) P{(I_1 = 1 ∩ I_2 = 1) ∩ (I_3 = 1 ∪ I_4 = 1)} = P{(I_1 = 1 ∩ I_2 = 1 ∩ I_3 = 1) ∪ (I_1 = 1 ∩ I_2 = 1 ∩ I_4 = 1)} = p_1 p_2 p_3 + p_1 p_2 p_4 − p_1 p_2 p_3 p_4 = 0.6633

iv) P(I_1 + ... + I_10 ≥ 7) = ... may be found in principle, but it is much easier to find this probability by simulation! Cf. the R script file.
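A minimal sketch of the simulation suggested for iv), in Python rather than the course's R. Only p_1 = 0.7 and p_2 = 0.95 appear in the text above; the remaining component probabilities below are illustrative placeholders, not the values from the exercise:

```python
import random

def estimate_p_at_least(p, k, n_rep=10000, seed=4):
    """Monte Carlo estimate of P(I_1 + ... + I_m >= k),
    with independent I_j ~ Bernoulli(p[j])."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_rep):
        working = sum(1 for pj in p if rng.random() < pj)  # number of working components
        if working >= k:
            hits += 1
    return hits / n_rep

# p_1 and p_2 are from the solution text; the rest are placeholder values.
p = [0.7, 0.95, 0.9, 0.8, 0.85, 0.9, 0.75, 0.95, 0.8, 0.9]
print(estimate_p_at_least(p, k=7))
```

With 10000 replications the Monte Carlo standard error of such an estimate is at most about 0.005.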
Problem 5

X, Y and Z are normal random variables, where

E(X) = µ_X; Var(X) = σ²_X
E(Y) = µ_Y; Var(Y) = σ²_Y
Corr(X, Y) = Cov(X, Y)/(σ_X σ_Y) = ρ,

and if Y = aX + Z (Z independent of X) we get:

Cov(X, Y) = a Var(X)

ρ = Corr(X, Y) = Cov(X, Y)/(σ_X σ_Y) = a σ_X/σ_Y, so a = ρ σ_Y/σ_X

E(Y) = a E(X) + E(Z), so E(Z) = E(Y) − a E(X)

Var(Y) = a² Var(X) + Var(Z), so Var(Z) = Var(Y) − a² Var(X)

Problem 6

The a_1, a_2 and a_3 should be chosen as the expectations of X_1, X_2 and X_3, respectively. The matrix A should be chosen as the Cholesky decomposition of the covariance matrix of [X_1, X_2, X_3]:

Σ = [ Cov(X_1, X_1)  Cov(X_1, X_2)  Cov(X_1, X_3)
      Cov(X_2, X_1)  Cov(X_2, X_2)  Cov(X_2, X_3)
      Cov(X_3, X_1)  Cov(X_3, X_2)  Cov(X_3, X_3) ]

This is how it should be done. But why does this work? (Briefly: if Z has i.i.d. N(0, 1) components, then X = a + AZ has E(X) = a and Cov(X) = A Cov(Z) Aᵀ = AAᵀ = Σ.)

Simulation estimate of P(X_1 + X_2 + X_3 ≥ 3): 0.465 (10000 replications).

Problem 3.11 (Rizzo)

[Figure: histograms for p1 = 0.1, 0.2, ..., 0.9.]
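Returning to the Problem 6 recipe: a Python sketch that builds the Cholesky factor of a 3×3 covariance matrix and simulates X = a + AZ. The mean vector and covariance matrix below are invented example values, not the ones from the exercise:

```python
import math
import random

def cholesky3(sigma):
    """Lower-triangular Cholesky factor A (A A^T = sigma) of a 3x3
    positive definite covariance matrix."""
    a = [[0.0] * 3 for _ in range(3)]
    for i in range(3):
        for j in range(i + 1):
            s = sigma[i][j] - sum(a[i][k] * a[j][k] for k in range(j))
            a[i][j] = math.sqrt(s) if i == j else s / a[j][j]
    return a

def rmvnorm(mean, sigma, rng):
    """One draw of X = a + A Z, with Z i.i.d. N(0, 1) and A = chol(sigma)."""
    a = cholesky3(sigma)
    z = [rng.gauss(0.0, 1.0) for _ in range(3)]
    return [mean[i] + sum(a[i][k] * z[k] for k in range(3)) for i in range(3)]

# Invented example values (the exercise supplies the real ones):
mean = [1.0, 0.0, 2.0]
sigma = [[2.0, 0.5, 0.3],
         [0.5, 1.0, 0.2],
         [0.3, 0.2, 1.5]]
rng = random.Random(9)
est = sum(1 for _ in range(10000) if sum(rmvnorm(mean, sigma, rng)) >= 3.0) / 10000
print(est)  # Monte Carlo estimate of P(X1 + X2 + X3 >= 3)
```

In R the same thing is done with `chol()` (note that `chol()` returns the upper-triangular factor, so the transpose is used).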
Problem 3.14 (Rizzo)

[Figure: pairwise plots of X1, X2 and X3.]

Problem 3.21 (Rizzo)

Inhomogeneous Poisson process

N(t) is an inhomogeneous Poisson process with mean value function m(t) = t² + 2t. That is, the rate (intensity) λ(t) is determined by m(t) = ∫₀ᵗ λ(x) dx. This means that λ(t) = m′(t) = 2t + 2.

To simulate N(t) on the interval [4, 5], we could exploit that N(4) is Poisson with mean m(4) = 16 + 8 = 24. So if Y ~ Poisson(24), N(t) = Y + N*(t − 4), t ≥ 4, where N*(t) is an inhomogeneous Poisson process with mean value function m*(t) = m(t + 4) − m(4) (intensity: λ*(t) = λ(t + 4) = 2(t + 4) + 2 = 2t + 10).

We will simulate N(t) for t < 6, to cover the interval [4, 5]. That is, we have to simulate N*(t) for t < 2. In this period the intensity has the maximal value λ*(t) ≤ λ*(2) = 4 + 10 = 14 (since λ*(t) increases with t).

The number of inter-arrival times: When the intensity is not constant, we can try to use the average intensity in the period considered for the calculation of the number of needed inter-arrival times. Generally, the average
intensity in the interval [a, b] is (1/(b − a)) ∫ₐᵇ λ(t) dt = (m(b) − m(a))/(b − a).

The average intensity of N*(t) in [0, 2] is (m*(2) − m*(0))/2 = 24/2 = 12. This implies that n.interarr = ceiling(12 · 2 + 3 · √(12 · 2)) = 39 T_j's are needed.

If we do not exploit that N(4) is Poisson with mean m(4) = 16 + 8 = 24, we can proceed as follows: We will simulate N(t) for t < 6, to cover the interval [4, 5]. In this period the intensity has the maximal value λ(t) ≤ λ(6) = 12 + 2 = 14 (since λ(t) increases with t). The number of inter-arrival times: The average intensity of N(t) in [0, 6] is (m(6) − m(0))/6 = 48/6 = 8. This implies that n.interarr = ceiling(8 · 6 + 3 · √(8 · 6)) = 69 T_j's are needed.

Problem 6.4 (Rizzo)

X_1, ..., X_n: i.i.d. lognormal(µ, σ²). I.e. X_i = e^{Y_i}, where Y_i ~ N(µ, σ²). E(Y_i) = E{ln(X_i)} = µ. E(X_i) = e^{µ + σ²/2}; Var(X_i) = e^{2µ + σ²}(e^{σ²} − 1).

For n large, we have approximately (Ȳ − µ)/√(S²_Y/n) ~ N(0, 1), where S²_Y = (1/(n − 1)) Σᵢ₌₁ⁿ (Yᵢ − Ȳ)² (the ordinary variance estimator, which estimates σ² = Var(Y_i)). On this background, if n is large:

P(−z_{α/2} < (Ȳ − µ)/√(S²_Y/n) < z_{α/2}) ≈ 1 − α

P(Ȳ − z_{α/2} √(S²_Y/n) < µ < Ȳ + z_{α/2} √(S²_Y/n)) ≈ 1 − α

Therefore, (Ȳ − z_{α/2} √(S²_Y/n), Ȳ + z_{α/2} √(S²_Y/n)) is a confidence interval for µ with approximate confidence level 1 − α.

Actually, since the Y_i's are i.i.d. N(µ, σ²), we have that (Ȳ − µ)/√(S²_Y/n) ~ t(n − 1), so that P(−t_{α/2,n−1} < (Ȳ − µ)/√(S²_Y/n) < t_{α/2,n−1}) = 1 − α and that (Ȳ − t_{α/2,n−1} √(S²_Y/n), Ȳ + t_{α/2,n−1} √(S²_Y/n))
is a confidence interval for µ with (exact) confidence level 1 − α.

We are asked to use a Monte Carlo method to obtain an empirical estimate of the confidence level of the interval. Cf. the book, and check the R script file. For different n (20, 40, 100, 200), this gave estimated confidence levels very close to 95% (10000 replications used).

Problem 6.5 (Rizzo)

X_1, ..., X_20: i.i.d. χ²(2) (chi-square distributed with 2 degrees of freedom). We are supposed to (erroneously) use the t-interval (Ȳ − t_{α/2,n−1} √(S²_Y/n), Ȳ + t_{α/2,n−1} √(S²_Y/n)) as a confidence interval for E(X_i) = µ = 2 (= d.f.) in this situation. (Erroneous, because this t-interval assumes that the X_i's are i.i.d. N(µ, σ²).) This means that the confidence interval does not necessarily have confidence level 1 − α. Cf. the book, and check the R script file. With 10000 replications used, this gave an estimated confidence level of 92.7%, which is clearly below the nominal 95%.

Problem 5.1 (Rizzo)

∫₀^{π/3} sin(t) dt = [−cos(t)]₀^{π/3} = −cos(π/3) + 1 = 0.5

If U ~ U[0, π/3], then E{sin(U)} = ∫₀^{π/3} sin(t) · (1/(π/3)) dt = (3/π) ∫₀^{π/3} sin(t) dt. So, for n U[0, π/3]-distributed U_1, ..., U_n, we have that the average of sin(U_1), ..., sin(U_n) times (3/π)⁻¹ will estimate the quantity ∫₀^{π/3} sin(t) dt. The Monte Carlo estimate is then v̄ · (3/π)⁻¹ (with v_i = sin(u_i)).

Problem 5.2 (Rizzo)

We want to estimate (assume 0 < x)

Φ(x) = ∫_{−∞}^{x} (1/√(2π)) e^{−y²/2} dy = 0.5 + ∫₀^{x} (1/√(2π)) e^{−y²/2} dy.

The term ∫₀^{x} (1/√(2π)) e^{−y²/2} dy can be estimated by Monte Carlo simulation by using U ~ U[0, x]. Then we have:

E{e^{−U²/2}} = ∫₀^{x} e^{−y²/2} · (1/x) dy = (1/x) ∫₀^{x} e^{−y²/2} dy.
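The Problem 5.1 estimator above can be checked numerically; a Python sketch (the sample size n and the seed are arbitrary choices):

```python
import math
import random

def mc_integral_sin(n=100000, seed=7):
    """Monte Carlo estimate of the integral of sin(t) over [0, pi/3]:
    (pi/3) times the average of sin(U_i), U_i ~ U[0, pi/3]."""
    rng = random.Random(seed)
    b = math.pi / 3.0
    return b * sum(math.sin(rng.uniform(0.0, b)) for _ in range(n)) / n

print(mc_integral_sin())  # exact value: 1 - cos(pi/3) = 0.5
```

Note that multiplying the average by (3/π)⁻¹ is the same as multiplying by π/3, the length of the interval.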
Thus if U_1, ..., U_n are i.i.d. U[0, x] and V_i = e^{−U_i²/2}, then (x/√(2π)) V̄ estimates ∫₀^{x} (1/√(2π)) e^{−y²/2} dy. Here V̄ = (1/n) Σᵢ₌₁ⁿ V_i.

Remark: The figure below shows a histogram of n = 10000 simulated values of Φ(x) = 0.5 + xV̄/√(2π) for x = 2. As may be seen, it is possible that a simulated value is larger than 1.0, but we know that Φ(x) ≤ 1 for all x.

[Figure: histogram of fi.x.vec.]

For these simulated values:
Maximum value: ...
Variance: ... e-5
The confidence interval, 1: (0.9631, 0.9929)
The confidence interval, 2: (0.9615, 0.992)
Theoretical value: Φ(2) ≈ 0.9772
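The Problem 5.2 estimator can likewise be sketched in Python (again with arbitrary n and seed, and assuming x > 0 as above):

```python
import math
import random

def mc_phi(x, n=100000, seed=8):
    """Monte Carlo estimate of Phi(x) for x > 0:
    Phi(x) = 0.5 + (x / sqrt(2*pi)) * E{exp(-U^2 / 2)}, with U ~ U[0, x]."""
    rng = random.Random(seed)
    vbar = sum(math.exp(-rng.uniform(0.0, x) ** 2 / 2.0) for _ in range(n)) / n
    return 0.5 + x * vbar / math.sqrt(2.0 * math.pi)

print(round(mc_phi(2.0), 3))  # theoretical value: Phi(2) is about 0.9772
```

As the remark above notes, nothing constrains the simulated value to lie below 1, even though Φ(x) ≤ 1 for all x.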
More informationCourse: ESO-209 Home Work: 1 Instructor: Debasis Kundu
Home Work: 1 1. Describe the sample space when a coin is tossed (a) once, (b) three times, (c) n times, (d) an infinite number of times. 2. A coin is tossed until for the first time the same result appear
More informationSampling Distributions
Sampling Distributions In statistics, a random sample is a collection of independent and identically distributed (iid) random variables, and a sampling distribution is the distribution of a function of
More informationECSE B Solutions to Assignment 8 Fall 2008
ECSE 34-35B Solutions to Assignment 8 Fall 28 Problem 8.1 A manufacturing system is governed by a Poisson counting process {N t ; t < } with rate parameter λ >. If a counting event occurs at an instant
More informationBivariate Paired Numerical Data
Bivariate Paired Numerical Data Pearson s correlation, Spearman s ρ and Kendall s τ, tests of independence University of California, San Diego Instructor: Ery Arias-Castro http://math.ucsd.edu/~eariasca/teaching.html
More informationAlgorithms for Uncertainty Quantification
Algorithms for Uncertainty Quantification Tobias Neckel, Ionuț-Gabriel Farcaș Lehrstuhl Informatik V Summer Semester 2017 Lecture 2: Repetition of probability theory and statistics Example: coin flip Example
More informationContinuous Distributions
Chapter 3 Continuous Distributions 3.1 Continuous-Type Data In Chapter 2, we discuss random variables whose space S contains a countable number of outcomes (i.e. of discrete type). In Chapter 3, we study
More informationLesson 4: Stationary stochastic processes
Dipartimento di Ingegneria e Scienze dell Informazione e Matematica Università dell Aquila, umberto.triacca@univaq.it Stationary stochastic processes Stationarity is a rather intuitive concept, it means
More informationClass 26: review for final exam 18.05, Spring 2014
Probability Class 26: review for final eam 8.05, Spring 204 Counting Sets Inclusion-eclusion principle Rule of product (multiplication rule) Permutation and combinations Basics Outcome, sample space, event
More information1 Review of Probability
1 Review of Probability Random variables are denoted by X, Y, Z, etc. The cumulative distribution function (c.d.f.) of a random variable X is denoted by F (x) = P (X x), < x
More informationSolution: The process is a compound Poisson Process with E[N (t)] = λt/p by Wald's equation.
Solutions Stochastic Processes and Simulation II, May 18, 217 Problem 1: Poisson Processes Let {N(t), t } be a homogeneous Poisson Process on (, ) with rate λ. Let {S i, i = 1, 2, } be the points of the
More informationIntroduction to Computational Finance and Financial Econometrics Probability Review - Part 2
You can t see this text! Introduction to Computational Finance and Financial Econometrics Probability Review - Part 2 Eric Zivot Spring 2015 Eric Zivot (Copyright 2015) Probability Review - Part 2 1 /
More informationPoisson processes and their properties
Poisson processes and their properties Poisson processes. collection {N(t) : t [, )} of rando variable indexed by tie t is called a continuous-tie stochastic process, Furtherore, we call N(t) a Poisson
More informationBivariate distributions
Bivariate distributions 3 th October 017 lecture based on Hogg Tanis Zimmerman: Probability and Statistical Inference (9th ed.) Bivariate Distributions of the Discrete Type The Correlation Coefficient
More informationLecture 1: August 28
36-705: Intermediate Statistics Fall 2017 Lecturer: Siva Balakrishnan Lecture 1: August 28 Our broad goal for the first few lectures is to try to understand the behaviour of sums of independent random
More informationLecture 28: Asymptotic confidence sets
Lecture 28: Asymptotic confidence sets 1 α asymptotic confidence sets Similar to testing hypotheses, in many situations it is difficult to find a confidence set with a given confidence coefficient or level
More informationPCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities
PCMI 207 - Introduction to Random Matrix Theory Handout #2 06.27.207 REVIEW OF PROBABILITY THEORY Chapter - Events and Their Probabilities.. Events as Sets Definition (σ-field). A collection F of subsets
More information