Approximation of the conditional number of exceedances
Approximation of the conditional number of exceedances
Han Liang Gan, University of Melbourne
July 9, 2013
Joint work with Aihua Xia
Conditional exceedances

Given a sequence of identically distributed but not independent random variables X_1, ..., X_n, define

    N_{s,n} = \sum_{i=1}^{n} 1_{\{X_i > s\}}.

The fragility index of order m, denoted FI_n(m), is defined as

    FI_n(m) = \lim_{s \to \infty} E(N_{s,n} \mid N_{s,n} \ge m).

Applications: insurance.
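As a quick illustration (a hypothetical sketch, not code from the talk; the function name and parameter values are ours), the conditional expectation E(N_{s,n} | N_{s,n} >= m) can be estimated by Monte Carlo in the i.i.d. exponential case:

```python
import random

# Hypothetical sketch: Monte Carlo estimate of E(N_{s,n} | N_{s,n} >= m)
# for i.i.d. Exponential(1) variables X_1, ..., X_n.
def conditional_mean_exceedances(n, s, m, trials=100_000, seed=0):
    rng = random.Random(seed)
    total = hits = 0
    for _ in range(trials):
        # N_{s,n} counts how many X_i exceed the threshold s
        count = sum(1 for _ in range(n) if rng.expovariate(1.0) > s)
        if count >= m:  # keep only samples satisfying the conditioning event
            total += count
            hits += 1
    return total / hits if hits else float("nan")

est = conditional_mean_exceedances(n=10, s=2.0, m=1)
print(est)  # close to n*p / (1 - (1-p)^n) with p = e^{-2}, about 1.77
```

In the i.i.d. case this can of course be computed in closed form; the simulation is only meant to make the conditioning concrete.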
Compound Poisson limit

Under certain conditions, Hsing et al. (1988) showed that the exceedance process converges to a compound Poisson limit.
Assumptions

- Existence of the limits.
- Tail behaviour is nice enough.
- Mixing condition.
- Uniform convergence.
- Uniform integrability.
Alternatives

In contrast to limit theorems, we can use approximation theory. Our tool of choice will be Stein's method.

For the next section of the talk, we shall stick to conditioning on just one exceedance, Poisson approximation, and the total variation distance.
A simple example

Let X_1, ..., X_n be a sequence of i.i.d. exponential random variables, p = p_s = P(X_1 > s), and Z_\lambda \sim Po(\lambda). Write N_1 for N = N_{s,n} conditioned on N \ge 1, and Z_{\lambda,1} for Z_\lambda conditioned on Z_\lambda \ge 1; Po_1(\lambda) denotes the law of Z_{\lambda,1}. Then

    d_{TV}(L(N_1), Po_1(\lambda)) = \frac{1}{2} \sum_{j=1}^{\infty} | P(N_1 = j) - P(Z_{\lambda,1} = j) |.
A simple example (2)

Set \lambda such that e^{-\lambda} = (1 - p)^n, so that P(N = 0) = P(Z_\lambda = 0). So now we have

    d_{TV}(L(N_1), Po_1(\lambda)) = \frac{1}{2(1 - P(N = 0))} \sum_{j=1}^{\infty} | P(N = j) - P(Z_\lambda = j) |.
A simple example (3)

Now set \lambda' = np; by the triangle inequality we can reduce the problem to known results:

    d_{TV}(L(N_1), Po_1(\lambda)) \le \frac{1}{2(1 - P(N = 0))} \sum_{j=1}^{\infty} \big[ | P(N = j) - P(Z_{\lambda'} = j) | + | P(Z_{\lambda'} = j) - P(Z_\lambda = j) | \big].

Using results from Barbour, Holst and Janson (1992), it can be shown that

    d_{TV}(L(N_1), Po_1(\lambda)) = p + o(p).
The conditional Poisson Stein identity

Lemma. W \sim Po_1(\lambda) if and only if, for all bounded functions g : Z_+ \to R,

    E[ \lambda g(W + 1) - W g(W) 1_{\{W \ge 2\}} ] = 0.
Stein's method in one slide

For any set A, we construct a function g that satisfies

    1_{\{j \in A\}} - Po_1(\lambda)\{A\} = \lambda g(j + 1) - j g(j) 1_{\{j \ge 2\}}.

It now follows that

    P(W \in A) - Po_1(\lambda)\{A\} = E[ \lambda g(W + 1) - W g(W) 1_{\{W \ge 2\}} ].

We then only need to bound the right-hand side.
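For the conditional Poisson case the Stein equation can be solved explicitly: writing pi for the Po_1(lambda) pmf and f(j) = 1{j in A} - Po_1(lambda){A}, the bounded solution has the tail-sum form g(j+1) = -(lambda*pi_j)^{-1} * sum_{k>j} pi_k f(k), which is numerically stable. The sketch below (an illustrative implementation, not the talk's code) solves the equation this way and verifies the displayed identity for a conditioned binomial W:

```python
import math

# Sketch: solve 1{j in A} - Po_1(lam){A} = lam*g(j+1) - j*g(j)*1{j>=2}
# via tail sums (the bounded solution; g(1) is never used).
def solve_stein(lam, A, jmax=60):
    norm = 1 - math.exp(-lam)
    pmf = [0.0] + [math.exp(-lam) * lam**j / math.factorial(j) / norm
                   for j in range(1, jmax)]
    PA = sum(pmf[j] for j in A)
    g = [0.0] * (jmax + 1)
    tail = 0.0                           # holds sum_{k > j} pmf[k] * f(k)
    for j in range(jmax - 1, 0, -1):
        g[j + 1] = -tail / (lam * pmf[j])
        tail += pmf[j] * ((1.0 if j in A else 0.0) - PA)
    return g, PA

# Verify P(W in A) - Po_1(lam){A} = E[lam*g(W+1) - W*g(W)*1{W>=2}]
# for W = Binomial(n, p) conditioned on W >= 1 (an arbitrary test law).
lam, A = 1.5, {1, 3}
g, PA = solve_stein(lam, A)
n, p = 30, 0.05
b = [math.comb(n, j) * p**j * (1 - p)**(n - j) for j in range(n + 1)]
bnorm = 1 - b[0]
lhs = sum(b[j] for j in A) / bnorm - PA
rhs = sum(b[w] / bnorm * (lam * g[w + 1] - w * g[w] * (w >= 2))
          for w in range(1, n + 1))
print(lhs, rhs)  # the two sides agree
```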
Generator interpretation

If we set g(w) = h(w) - h(w - 1), the equation becomes

    E[ \lambda (h(W + 1) - h(W)) - W (h(W) - h(W - 1)) 1_{\{W \ge 2\}} ].

This looks like the generator of an immigration-death process. Moreover, the stationary distribution for this generator is Po_1(\lambda).
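This interpretation can be illustrated by simulation (a sketch under assumed rates: immigration at rate lambda, unit per-capita death rate, with death suppressed in state 1 so the chain stays at least 1). The long-run occupation measure should match Po_1(lambda):

```python
import math
import random

# Sketch: immigration-death chain with generator
# A h(w) = lam*(h(w+1) - h(w)) - w*(h(w) - h(w-1))*1{w >= 2}.
def occupation(lam, t_end=100_000.0, seed=1):
    rng = random.Random(seed)
    w, t, occ = 1, 0.0, {}
    while t < t_end:
        death = w if w >= 2 else 0.0     # no deaths in state 1
        rate = lam + death
        dt = rng.expovariate(rate)       # exponential holding time in state w
        occ[w] = occ.get(w, 0.0) + dt
        t += dt
        w += 1 if rng.random() < lam / rate else -1
    return {k: v / t for k, v in occ.items()}

lam = 1.5
occ = occupation(lam)
norm = 1 - math.exp(-lam)
po1 = [math.exp(-lam) * lam**j / math.factorial(j) / norm for j in range(6)]
for j in range(1, 5):
    print(j, round(occ.get(j, 0.0), 3), round(po1[j], 3))
```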
Stein factors

We can use Theorem 2.10 from Brown & Xia (2001) to get:

Theorem. The solution g of the Stein equation for the total variation distance satisfies

    \| g \| \le \frac{1 - e^{-\lambda} - \lambda e^{-\lambda}}{\lambda (1 - e^{-\lambda})}.
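As a sanity check, one can compare sup_j |g_A(j)| against this bound over randomly chosen sets A. The sketch below is illustrative: it uses the stable tail-sum form of the bounded Stein-equation solution, and restates the bound as transcribed here (the slide's formula is a reconstruction, so treat the comparison as exploratory rather than as verifying the theorem).

```python
import math
import random

# Sketch: compare sup_j |g_A(j)| with the Stein-factor bound
# (1 - e^{-lam} - lam*e^{-lam}) / (lam*(1 - e^{-lam})) over random sets A.
def g_sup(lam, A, jmax=80):
    norm = 1 - math.exp(-lam)
    pmf = [0.0] + [math.exp(-lam) * lam**j / math.factorial(j) / norm
                   for j in range(1, jmax)]
    PA = sum(pmf[j] for j in A)
    tail, sup = 0.0, 0.0
    for j in range(jmax - 1, 0, -1):
        sup = max(sup, abs(tail / (lam * pmf[j])))   # |g(j+1)|, tail-sum form
        tail += pmf[j] * ((1.0 if j in A else 0.0) - PA)
    return sup

lam = 2.0
bound = (1 - math.exp(-lam) - lam * math.exp(-lam)) / (lam * (1 - math.exp(-lam)))
rng = random.Random(0)
worst = max(g_sup(lam, {j for j in range(1, 15) if rng.random() < 0.5})
            for _ in range(50))
print(worst, bound)
```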
The example revisited

Recall that from before:

    d_{TV}(L(N_1), Po_1(\lambda)) = p + o(p).

Using Stein's method directly for conditional Poisson approximation, it can be shown that

    d_{TV}(L(N_1), Po_1(\lambda')) = \frac{p}{2} + o(p).
Generalisations

The conditional Poisson Stein identity generalises for conditioning on multiple exceedances:

Lemma. W \sim Po_a(\lambda) if and only if, for all bounded functions g : \{a, a + 1, ...\} \to R,

    E[ \lambda g(W + 1) - W g(W) 1_{\{W > a\}} ] = 0.
Negative binomial approximation

We can do exactly the same for negative binomial random variables.

Lemma. W \sim Nb_a(r, p) if and only if, for all bounded functions g : \{a, a + 1, ...\} \to R,

    E[ (1 - p)(r + W) g(W + 1) - W g(W) 1_{\{W > a\}} ] = 0.
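This characterisation can be checked numerically in the same way as the Poisson one. A sketch (illustrative names; the pmf is built by the standard negative binomial recursion and the series truncated at a large cut-off), assuming g(W + 1) in the first term as written in the lemma:

```python
import math

# Sketch: for W ~ Nb_a(r, p), i.e. negative binomial(r, p) conditioned
# on W >= a, E[(1-p)*(r+W)*g(W+1) - W*g(W)*1{W>a}] should be zero.
def nb_stein_expectation(r, p, a, g, jmax=300):
    raw = [p**r]                          # unnormalised Nb(r, p) pmf
    for w in range(jmax - 1):
        # recursion pi_{w+1} = pi_w * (1-p) * (r+w) / (w+1)
        raw.append(raw[-1] * (1 - p) * (r + w) / (w + 1))
    norm = sum(raw[a:])                   # P(W >= a)
    total = 0.0
    for w in range(a, jmax):
        pw = raw[w] / norm
        total += pw * ((1 - p) * (r + w) * g(w + 1) - w * g(w) * (w > a))
    return total

val = nb_stein_expectation(r=2.5, p=0.4, a=2, g=lambda w: math.cos(w))
print(val)  # ~ 0
```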
Different metrics

Have we solved our original problem? Can we calculate errors when estimating fragility indices?

No, we still have the uniform integrability problem. We need to use a stronger metric, such as the Wasserstein distance.
More informationPump failure data. Pump Failures Time
Outline 1. Poisson distribution 2. Tests of hypothesis for a single Poisson mean 3. Comparing multiple Poisson means 4. Likelihood equivalence with exponential model Pump failure data Pump 1 2 3 4 5 Failures
More informationInformation theoretic perspectives on learning algorithms
Information theoretic perspectives on learning algorithms Varun Jog University of Wisconsin - Madison Departments of ECE and Mathematics Shannon Channel Hangout! May 8, 2018 Jointly with Adrian Tovar-Lopez
More informationSub-Gaussian estimators under heavy tails
Sub-Gaussian estimators under heavy tails Roberto Imbuzeiro Oliveira XIX Escola Brasileira de Probabilidade Maresias, August 6th 2015 Joint with Luc Devroye (McGill) Matthieu Lerasle (CNRS/Nice) Gábor
More informationChapter 2. Discrete Distributions
Chapter. Discrete Distributions Objectives ˆ Basic Concepts & Epectations ˆ Binomial, Poisson, Geometric, Negative Binomial, and Hypergeometric Distributions ˆ Introduction to the Maimum Likelihood Estimation
More informationAPM 504: Probability Notes. Jay Taylor Spring Jay Taylor (ASU) APM 504 Fall / 65
APM 504: Probability Notes Jay Taylor Spring 2015 Jay Taylor (ASU) APM 504 Fall 2013 1 / 65 Outline Outline 1 Probability and Uncertainty 2 Random Variables Discrete Distributions Continuous Distributions
More informationTCOM 501: Networking Theory & Fundamentals. Lecture 6 February 19, 2003 Prof. Yannis A. Korilis
TCOM 50: Networking Theory & Fundamentals Lecture 6 February 9, 003 Prof. Yannis A. Korilis 6- Topics Time-Reversal of Markov Chains Reversibility Truncating a Reversible Markov Chain Burke s Theorem Queues
More informationarxiv:math.pr/ v1 24 Aug 2005
The Annals of Applied Probability 2005, Vol. 15, No. 3, 1733 1764 DOI: 10.1214/105051605000000205 c Institute of Mathematical Statistics, 2005 arxiv:math.pr/0508451 v1 24 Aug 2005 ON THE POWER OF TWO CHOICES:
More informationLecture 13 and 14: Bayesian estimation theory
1 Lecture 13 and 14: Bayesian estimation theory Spring 2012 - EE 194 Networked estimation and control (Prof. Khan) March 26 2012 I. BAYESIAN ESTIMATORS Mother Nature conducts a random experiment that generates
More informationFirst order logic on Galton-Watson trees
First order logic on Galton-Watson trees Moumanti Podder Georgia Institute of Technology Joint work with Joel Spencer January 9, 2018 Mathematics Seminar, Indian Institute of Science, Bangalore 1 / 20
More informationJumping Sequences. Steve Butler 1. (joint work with Ron Graham and Nan Zang) University of California Los Angelese
Jumping Sequences Steve Butler 1 (joint work with Ron Graham and Nan Zang) 1 Department of Mathematics University of California Los Angelese www.math.ucla.edu/~butler UCSD Combinatorics Seminar 14 October
More information