Poisson Approximation for Two Scan Statistics with Rates of Convergence
1 Poisson Approximation for Two Scan Statistics with Rates of Convergence. Xiao Fang (joint work with David Siegmund), National University of Singapore. May 28, 2015.
2 Outline: the first scan statistic; the second scan statistic; other scan statistics.
3 A statistical testing problem. Let $\{X_1, \dots, X_n\}$ be an independent sequence of random variables. We want to test the hypothesis
$$H_0: X_1, \dots, X_n \sim F_{\theta_0}(\cdot)$$
against the alternative
$$H_1: \text{for some } i < j,\quad X_{i+1}, \dots, X_j \sim F_{\theta_1}(\cdot), \qquad X_1, \dots, X_i, X_{j+1}, \dots, X_n \sim F_{\theta_0}(\cdot).$$
The indices $i$ and $j$ are called change-points; they are not specified in the alternative hypothesis. $\theta_0$ may be given, or may need to be estimated; $\theta_1$ may be given, or may be a nuisance parameter.
4 The first scan statistic. If $j - i = t$ is given and $F_{\theta_0}(\cdot)$ and $F_{\theta_1}(\cdot)$ have different mean values, a natural statistic is
$$M_{n;t} = \max_{1 \le i \le n-t+1} T_i, \qquad T_i = X_i + \cdots + X_{i+t-1}.$$
We are interested in its p-value: assuming $X_1, \dots, X_n \sim F_{\theta_0}(\cdot)$,
$$P(M_{n;t} \ge b) = P\Big(\max_{1 \le i \le n-t+1} T_i \ge b\Big) = \;?$$
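The moving-sum statistic above can be computed in $O(n)$ time by sliding the length-$t$ window one step at a time. A minimal sketch (the function name is ours, not from the talk):

```python
def scan_statistic(x, t):
    """M_{n;t} = max over i of the moving sum X_i + ... + X_{i+t-1}."""
    n = len(x)
    s = sum(x[:t])  # T_1, the first window sum
    best = s
    for i in range(1, n - t + 1):
        # slide the window: drop x[i-1], add x[i+t-1]
        s += x[i + t - 1] - x[i - 1]
        best = max(best, s)
    return best
```

For example, `scan_statistic([0, 1, 3, 1, 0, 2], 2)` scans the window sums 1, 4, 4, 1, 2 and returns 4.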
5 Known results. Let $Y_i = I(T_i \ge b)$. Then $\{\max_{1 \le i \le n-t+1} T_i \ge b\} = \{\sum_{i=1}^{n-t+1} Y_i \ge 1\}$. Dembo and Karlin (1992) proved that if $t$ is fixed and $b, n \to \infty$, plus mild conditions on $F_{\theta_0}(\cdot)$, then
$$P(M_{n;t} \ge b) = P\Big(\sum_{i=1}^{n-t+1} Y_i \ge 1\Big) \approx 1 - e^{-\lambda}, \qquad \lambda = (n - t + 1)E(Y_1).$$
The mild conditions on $F_{\theta_0}(\cdot)$ ensure that $P(Y_{i+1} = 1 \mid Y_i = 1) \to 0$.
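For Bernoulli($p$) increments the ingredients of the Dembo–Karlin approximation are explicit: $T_1 \sim \mathrm{Binomial}(t, p)$, so $E(Y_1)$ is a binomial tail. A small sketch under that assumption (function names ours):

```python
from math import comb, exp

def binom_tail(t, p, b):
    """P(T_1 >= b) for T_1 ~ Binomial(t, p)."""
    return sum(comb(t, k) * p**k * (1 - p)**(t - k) for k in range(b, t + 1))

def dembo_karlin_pvalue(n, t, p, b):
    """Poisson approximation 1 - exp(-lambda) with lambda = (n-t+1) P(T_1 >= b)."""
    lam = (n - t + 1) * binom_tail(t, p, b)
    return 1 - exp(-lam)
```

With $n = 10$, $t = 3$, $p = 1/2$, $b = 3$ we get $\lambda = 8 \cdot (1/8) = 1$, so the approximate p-value is $1 - e^{-1}$.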
6 $t \to \infty$: If $X_i \sim \text{Bernoulli}(p)$ and $b$ is an integer, Arratia, Gordon and Waterman (1990) prove that
$$\big|P(M_{n;t} \ge b) - (1 - e^{-\lambda})\big| \le C\Big(e^{-ct} + \frac{t}{n}\Big)(\lambda \wedge 1) \tag{1}$$
where $\lambda = (n - t + 1)P(T_1 = b)\big(\tfrac{b}{t} - p\big)$. Haiman (2007) derived more accurate approximations using the distribution function of $Z_k := \max\{T_1, \dots, T_{kt+1}\}$ for $k = 1$ and $2$. The distribution functions of $Z_k$ for $k = 1$ and $2$ are only known for Bernoulli and Poisson random variables. Our objective is to extend (1) to other random variables.
7 Preparation for the main result. Let $\mu_0 = E(X_1)$. We assume $b = at$ where $a > \mu_0$, so
$$P\Big(\max_{1 \le i \le n-t+1} T_i \ge b\Big) = P\Big(\max_{1 \le i \le n-t+1} \frac{X_i + \cdots + X_{i+t-1}}{t} \ge a\Big).$$
We assume the distribution of $X_1$ can be imbedded in an exponential family of distributions
$$dF_\theta(x) = e^{\theta x - \Psi(\theta)}\, dF(x), \qquad \theta \in \Theta. \tag{2}$$
It is known that $F_\theta$ has mean $\Psi'(\theta)$ and variance $\Psi''(\theta)$. Assume $\theta_0 = 0$, i.e., $X_1 \sim F$, and that there exists $\theta_a \in \Theta^o$ such that $\Psi'(\theta_a) = a$. Example: $X_1 \sim N(0, 1)$, $\Psi(\theta) = \theta^2/2$, $\theta_a = a$, $F_{\theta_a} = N(a, 1)$.
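As a second concrete instance of the embedding (2), the Bernoulli($p$) family tilts in closed form: $\Psi(\theta) = \log(1 - p + pe^\theta)$, and solving $\Psi'(\theta_a) = a$ gives $e^{\theta_a} = a(1-p)/(p(1-a))$. A small sketch of this computation (a standard exponential-tilting fact; the helper name is ours):

```python
from math import log, exp

def bernoulli_tilt(p, a):
    """Closed-form solution of Psi'(theta_a) = a for the Bernoulli(p)
    exponential family, Psi(theta) = log(1 - p + p e^theta)."""
    theta_a = log(a * (1 - p) / (p * (1 - a)))
    # mean of the tilted distribution F_{theta_a}; should equal a
    tilted_mean = p * exp(theta_a) / (1 - p + p * exp(theta_a))
    return theta_a, tilted_mean
```

Note that $\theta_a > 0$ exactly when $a > p = \mu_0$, matching the assumption $a > \mu_0$ above.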
8 Assumption (2) is used in two places:
1. To obtain an accurate approximation to the marginal probability $P(T_1 \ge at)$ by change of measure.
2. Local limit theorem, Diaconis and Freedman (1988):
$$d_{TV}\big(\mathcal{L}(X_1, \dots, X_m \mid T_1 = at),\ \mathcal{L}(X^a_1, \dots, X^a_m)\big) \le \frac{Cm}{t},$$
where $X^a_1, \dots, X^a_m$ are i.i.d. and $X^a_1 \sim F_{\theta_a}$. Let $D_k = \sum_{i=1}^k (X^a_i - X_i)$ and $\sigma^2_a = \Psi''(\theta_a)$.
9 Theorem. Under assumption (2), for some constant $C$ depending only on the exponential family (2), $\mu_0$, and $a$, we have
$$\big|P(M_{n;t} \ge at) - (1 - e^{-\lambda})\big| \le C\Big(\frac{(\log t)^2}{t} + \frac{\log t\, \log(n-t)}{n-t}\Big)(\lambda \wedge 1),$$
where if $X_1$ is nonlattice, plus mild conditions,
$$\lambda = (n - t + 1)\, \frac{e^{-[a\theta_a - \Psi(\theta_a)]t}}{\theta_a \sigma_a (2\pi t)^{1/2}}\, \exp\Big[-\sum_{k=1}^\infty \frac{1}{k} E\big(e^{-\theta_a D_k^+}\big)\Big],$$
and if $X_1$ is integer-valued with span 1,
$$\lambda = (n - t + 1)\, \frac{e^{-(a\theta_a - \Psi(\theta_a))t}\, e^{-\theta_a(\lceil at \rceil - at)}}{(1 - e^{-\theta_a})\, \sigma_a (2\pi t)^{1/2}}\, \exp\Big[-\sum_{k=1}^\infty \frac{1}{k} E\big(e^{-\theta_a D_k^+}\big)\Big].$$
10 Remarks: We don't have an explicit expression for the constant $C$. The relative error $\to 0$ if $t \to \infty$ and $n - t \to \infty$. Let $g(x) = E e^{ixD_1}$ and $\xi(x) = \log\{1/[1 - g(x)]\}$. Woodroofe (1979) proved that for the nonlattice case,
$$\sum_{k=1}^\infty \frac{1}{k} E\big(e^{-\theta_a D_k^+}\big) = -\log[(a - \mu_0)\theta_a] - \frac{1}{\pi}\int_0^\infty \frac{\theta_a^2\,[\mathfrak{I}\xi(x) - \pi/2]}{x(\theta_a^2 + x^2)}\, dx + \frac{1}{\pi}\int_0^\infty \frac{\theta_a\,\{\mathfrak{R}\xi(x) + \log[(a - \mu_0)x]\}}{\theta_a^2 + x^2}\, dx,$$
where $\mathfrak{R}$ and $\mathfrak{I}$ denote real and imaginary parts. Tu and Siegmund (1999) proved that for the arithmetic case,
$$\sum_{k=1}^\infty \frac{1}{k} E\big(e^{-\theta_a D_k^+}\big) = -\log(a - \mu_0) - \frac{1}{2\pi}\int_0^{2\pi} \Big\{\frac{\xi(x)\, e^{-\theta_a - ix}}{1 - e^{-\theta_a - ix}} + \frac{\xi(x) + \log[(a - \mu_0)(1 - e^{ix})]}{1 - e^{ix}}\Big\}\, dx.$$
11 Example 1: Normal distribution. [Numerical table comparing the Poisson approximation with simulated p-values for various $(n, t, a)$; the entries were lost in transcription.]
12 Example 2: Bernoulli distribution. [Numerical table comparing the Poisson approximation with simulated p-values for various $(n, t, \mu_0, a)$; the entries were lost in transcription.]
13 Sketch of proof. Let $m = C(\log t \wedge \log(n - t))$. Let
$$Y_i = I\big(T_i \ge at;\ T_{i+1} < T_i, \dots, T_{i+m} < T_i;\ T_{i-1} < T_i, \dots, T_{i-m} < T_i\big)$$
and $W = \sum_{i=1}^{n-t+1} Y_i$, so that
$$P(M_{n;t} \ge at) \approx P(W \ge 1), \qquad \lambda_1 = EW = (n - t + 1)EY_1.$$
From the Poisson approximation theorem of Arratia, Goldstein and Gordon (1990), we have
$$\big|P(W \ge 1) - (1 - e^{-\lambda_1})\big| \le C\Big(\frac{1}{t} + \frac{1}{n-t}\Big)(\lambda_1 \wedge 1).$$
14 Approximating $\lambda_1$ by $\lambda$:
$$EY_1 = P(T_1 \ge at;\ T_2 < T_1, \dots, T_{1+m} < T_1;\ T_0 < T_1, \dots, T_{1-m} < T_1) \approx P(T_1 \ge at)\, P^2(T_1 - T_2 > 0, \dots, T_1 - T_{1+m} > 0 \mid T_1 \ge at).$$
Note that $T_1 - T_2 = X_1 - X_{t+1}$ and that, given $T_1 \ge at$, $X_1 \sim F_{\theta_a}$ approximately and $X_{t+1} \sim F$. Thus $\{T_1 - T_2 > 0\} \approx \{D_1 > 0\}$ where $D_1 = X^a_1 - X_1$. Similarly, $\{T_1 - T_{k+1} > 0\} \approx \{D_k > 0\}$ with $D_k = \sum_{i=1}^k (X^a_i - X_i)$. Therefore
$$EY_1 \approx P(T_1 \ge at)\, P^2(D_k > 0,\ k = 1, 2, \dots).$$
Recall
$$\lambda = (n - t + 1)\, \frac{e^{-[a\theta_a - \Psi(\theta_a)]t}}{\theta_a \sigma_a (2\pi t)^{1/2}}\, \exp\Big[-\sum_{k=1}^\infty \frac{1}{k} E\big(e^{-\theta_a D_k^+}\big)\Big].$$
15 Corollary. Let $\{X_1, \dots, X_n\}$ be i.i.d. random variables with distribution function $F$ that can be imbedded in an exponential family, as in (2). Let $EX_1 = \mu_0$. Assume $X_1$ is integer-valued with span 1, and suppose $a = \sup\{x : p_x := P(X_1 = x) > 0\}$ is finite. Let $b = at$. Then we have, with constants $C$ and $c$ depending only on $p_a$,
$$\big|P(M_{n;t} \ge b) - (1 - e^{-\lambda})\big| \le C(\lambda \wedge 1)e^{-ct}, \qquad \lambda = (n - t)p_a^t(1 - p_a) + p_a^t.$$
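The $\lambda$ of the Corollary is elementary to evaluate: $(n-t)p_a^t(1-p_a)$ counts window positions that start a fresh run of $t$ maximal values preceded by a non-maximal one, and $p_a^t$ accounts for the boundary. A minimal sketch (function names ours):

```python
from math import exp

def corollary_lambda(n, t, p_a):
    """lambda = (n - t) p_a^t (1 - p_a) + p_a^t, for b = a t with a the
    largest support point and p_a = P(X_1 = a)."""
    return (n - t) * p_a**t * (1 - p_a) + p_a**t

def corollary_pvalue(n, t, p_a):
    """Poisson approximation 1 - exp(-lambda) to P(M_{n;t} >= at)."""
    return 1 - exp(-corollary_lambda(n, t, p_a))
```

For $n = 10$, $t = 3$, $p_a = 1/2$: $\lambda = 7 \cdot \tfrac18 \cdot \tfrac12 + \tfrac18 = 0.5625$.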
16 The second scan statistic. Recall that we want to test $H_0: X_1, \dots, X_n \sim F_{\theta_0}(\cdot)$ against the alternative $H_1$: for some $i < j$, $X_{i+1}, \dots, X_j \sim F_{\theta_1}(\cdot)$ and $X_1, \dots, X_i, X_{j+1}, \dots, X_n \sim F_{\theta_0}(\cdot)$. Now assume $j - i$ is not given, and $F_{\theta_0}$ and $F_{\theta_1}$ are from the same exponential family of distributions $dF_\theta(x) = e^{\theta x - \Psi(\theta)}\, dF(x)$, $\theta \in \Theta$. Then the log likelihood ratio statistic is
$$\max_{0 \le i < j \le n}\ (\theta_1 - \theta_0) \sum_{k=i+1}^{j} \Big(X_k - \frac{\Psi(\theta_1) - \Psi(\theta_0)}{\theta_1 - \theta_0}\Big).$$
17 It reduces to the following problem. Let $\{X_1, \dots, X_n\}$ be independent, identically distributed random variables with $EX_1 = \mu_0 < 0$. Let $S_0 = 0$ and $S_i = \sum_{j=1}^i X_j$ for $1 \le i \le n$. We are interested in the distribution of
$$M_n := \max_{0 \le i < j \le n} (S_j - S_i).$$
Iglehart (1972) observed that it can be interpreted as the maximum waiting time of the first $n$ customers in a single server queue. Karlin, Dembo and Kawabata (1990) discussed genomic applications.
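Although $M_n$ is a maximum over $O(n^2)$ pairs, it can be computed in one pass with a running prefix minimum, since $\max_{i<j}(S_j - S_i) = \max_j (S_j - \min_{i<j} S_i)$. A minimal sketch (function name ours):

```python
def max_segment_increase(x):
    """M_n = max over 0 <= i < j <= n of (S_j - S_i), via a running prefix minimum."""
    s = 0.0      # current partial sum S_j
    min_s = 0.0  # min of S_i over 0 <= i < j
    best = None
    for xi in x:
        s += xi
        gain = s - min_s           # best increase ending at this j
        best = gain if best is None else max(best, gain)
        min_s = min(min_s, s)      # update the prefix minimum for the next j
    return best
```

For example, `max_segment_increase([1, -2, 3])` returns 3 (take $i = 2$, $j = 3$, i.e., $S_3 - S_2 = 2 - (-1)$); with all-negative increments the result is the largest single increment.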
18 The limiting distribution was derived by Iglehart (1972). Assume the distribution of $X_1$ can be imbedded in an exponential family of distributions $dF_\theta(x) = e^{\theta x - \Psi(\theta)}\, dF(x)$, $\theta \in \Theta$. Assume $EX_1 = \Psi'(0) = \mu_0 < 0$ and that there exists a positive $\theta_1 \in \Theta$ such that $\Psi(\theta_1) = 0$; write $\Psi'(\theta_1) = \mu_1$. When $X_1$ is nonlattice, we have
$$\lim_{n \to \infty} P\Big(M_n \ge \frac{\log n}{\theta_1} + x\Big) = 1 - \exp\big(-K e^{-\theta_1 x}\big).$$
19 Theorem. Let $h(b) > 0$ be any function such that $h(b) \to \infty$ and $h(b) = O(b^{1/2})$ as $b \to \infty$. Suppose $n - b/\mu_1 > b^{1/2} h(b)$. We have
$$\big|P(M_n \ge b) - (1 - e^{-\lambda})\big| \le C\lambda\Big\{\big(1 + b/h^2(b)\big)e^{-ch^2(b)} + \frac{b^{1/2} h(b)}{n - b/\mu_1}\Big\},$$
where if $X_1$ is nonlattice, plus mild conditions,
$$\lambda = \Big(n - \frac{b}{\mu_1}\Big)\, \theta_1 \mu_1\, e^{-\theta_1 b}\, \exp\Big(-2\sum_{k=1}^\infty \frac{1}{k} E_{\theta_1} e^{-\theta_1 S_k^+}\Big),$$
and if $X_1$ is integer-valued with span 1 and $b$ is an integer,
$$\lambda = \Big(n - \frac{b}{\mu_1}\Big)\, (1 - e^{-\theta_1})\mu_1\, e^{-\theta_1 b}\, \exp\Big(-2\sum_{k=1}^\infty \frac{1}{k} E_{\theta_1} e^{-\theta_1 S_k^+}\Big).$$
20 Remarks: By choosing $h(b) = b^{1/2}$, we get
$$\big|P(M_n \ge b) - (1 - e^{-\lambda})\big| \le C\lambda\Big\{e^{-cb} + \frac{b}{n}\Big\}.$$
By choosing $h(b) = C(\log b)^{1/2}$ with large enough $C$, we can see that the relative error in the Poisson approximation goes to zero under the conditions $b \to \infty$, $(b \log b)^{1/2} \ll n - b/\mu_1 = O(e^{\theta_1 b})$, where $n - b/\mu_1 = O(e^{\theta_1 b})$ ensures that $\lambda$ is bounded. For the smaller range (in which case $\lambda \to 0$)
$$b \to \infty, \qquad \delta b \le n - b/\mu_1 = o\big(e^{\theta_1 b/2}\big) \quad \text{for some } \delta > 0,$$
Siegmund (1988) obtained more accurate estimates by a technique different from ours.
21 Let $G(z) = \sum_{k=0}^\infty p_k z^k + \sum_{k=1}^\infty q_k z^{-k}$, where $p_k = P(X_1 = k)$ and $q_k = P(X_1 = -k)$, and let $z_0$ denote the unique root $> 1$ of $G(z) = 1$. For the case $p_k = 0$ for $k > 1$, using the notation $Q(z) = \sum_k q_k z^k$, one can show for large values of $n$ and $b$ that
$$\lambda \approx n z_0^{-b}\,\big\{[Q(1) - Q(z_0^{-1})] - (1 - z_0^{-1})\, z_0^{-1}\, Q'(z_0^{-1})\big\}.$$
For the case $q_k = 0$ for $k > 1$,
$$\lambda \approx n z_0^{-b}\, (1 - z_0^{-1})\, z_0^{-1}\, G'(1)^2 / G'(z_0).$$
In particular, if $q_1 = q$ and $p_1 = p$, where $p + q = 1$, both these results specialize to $\lambda \approx n (p/q)^b (q - p)^2 / q$.
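The simple random walk specialization is easy to check numerically: with $G(z) = pz + q/z$, the root $> 1$ of $G(z) = 1$ is $z_0 = q/p$, and the general formula collapses to $n(p/q)^b(q-p)^2/q$. A sketch of that check (function names ours):

```python
def lambda_simple_walk(n, b, p):
    """lambda ~ n (p/q)^b (q - p)^2 / q for the +/-1 walk, p + q = 1, p < q."""
    q = 1 - p
    return n * (p / q) ** b * (q - p) ** 2 / q

def lambda_general(n, b, p):
    """Same quantity from n z0^{-b} (1 - z0^{-1}) z0^{-1} G'(1)^2 / G'(z0),
    with G(z) = p z + q / z and z0 = q/p the root > 1 of G(z) = 1."""
    q = 1 - p
    z0 = q / p
    g1 = p - q               # G'(1) = p - q/1^2
    gz0 = p - q / z0**2      # G'(z0)
    return n * z0 ** (-b) * (1 - 1 / z0) * (1 / z0) * g1**2 / gz0
```

Both routines agree to rounding error for any $p < 1/2$, which confirms the algebraic specialization stated on the slide.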
22 Sketch of proof (for the case $h(b) = b^{1/2}$). Recall $S_i = \sum_{k=1}^i X_k$. Define $T_b := \inf\{n \ge 1 : S_n \notin [0, b)\}$. For a positive integer $m$, let $\omega^{+m}$ be the $m$-shifted sample path of $\omega := \{X_1, \dots, X_n\}$. Let $t = b/\mu_1 + b$ and $m = cb$ such that $m < t$. For $1 \le i \le n - t$, let
$$Y_i = I\big(S_i < S_{i-j},\ 1 \le j \le m;\ T_b(\omega^{+i}) \le t,\ S_{T_b(\omega^{+i})} \ge b\big).$$
That is, $Y_i$ is the indicator of the event that the sequence $\{S_1, \dots, S_n\}$ reaches a local minimum at $i$ and the $i$-shifted sequence exits the interval $[0, b)$ within time $t$, with first exit position $\ge b$. Let $W = \sum_{i=1}^{n-t} Y_i$.
23 Sketch of proof (cont.)
$$P(M_n \ge b) \approx P(W \ge 1), \qquad \big|P(W \ge 1) - (1 - e^{-\lambda_1})\big| \le C\lambda e^{-cb},$$
$$\lambda_1 = (n - t)EY_1 \approx (n - t)\, P(\tau_0 = \infty)\, P(S_{T_b} \ge b), \qquad \text{where } \tau_0 := \inf\{n \ge 1 : S_n \ge 0\},$$
and $\lambda_1 \approx \lambda$.
24 Other statistics. Recall again that we want to test $H_0: X_1, \dots, X_n \sim F_{\theta_0}(\cdot)$ against the alternative $H_1$: for some $i < j$, $X_{i+1}, \dots, X_j \sim F_{\theta_1}(\cdot)$ and $X_1, \dots, X_i, X_{j+1}, \dots, X_n \sim F_{\theta_0}(\cdot)$.
1. If $\theta_0$ is not given, we need to consider $P(M_{n;t} \ge b \mid S_n)$ and $P(M_n \ge b \mid S_n)$.
25 2. If $\theta_0$ is given but $\theta_1$ is a nuisance parameter, then the log likelihood ratio statistic is
$$\max_{0 \le i < j \le n}\ \max_\theta\ \big[\theta(S_j - S_i) - (j - i)\Psi(\theta)\big].$$
For the normal distribution, it reduces to
$$\max_{0 \le i < j \le n} \frac{(S_j - S_i)^2}{2(j - i)}.$$
The limiting distribution is only known for the normal distribution, and only in an appropriate range of $n$ relative to $b$ [Siegmund and Venkatraman (1995)].
26 3. Frick, Munk and Sieling (2014) proposed the following multiscale statistic:
$$\max_{0 \le i < j \le n} \Big\{\frac{S_j - S_i}{\sqrt{j - i}} - \sqrt{2\log\frac{n}{j - i}}\Big\}.$$
The penalty term $\sqrt{2\log(n/(j-i))}$ was first studied in Dümbgen and Spokoiny (2001) and is motivated by Lévy's modulus of continuity theorem.
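The multiscale statistic can be evaluated directly by scanning all pairs $(i, j)$; a naive $O(n^2)$ sketch is enough to make the penalization concrete (function name ours; faster multiscale implementations exist but are not shown):

```python
from math import sqrt, log

def multiscale_statistic(x):
    """max over 0 <= i < j <= n of (S_j - S_i)/sqrt(j-i) - sqrt(2 log(n/(j-i)))."""
    n = len(x)
    s = [0.0]                       # prefix sums S_0, ..., S_n
    for xi in x:
        s.append(s[-1] + xi)
    best = float("-inf")
    for i in range(n):
        for j in range(i + 1, n + 1):
            w = j - i               # window length; penalty shrinks as w grows
            val = (s[j] - s[i]) / sqrt(w) - sqrt(2 * log(n / w))
            best = max(best, val)
    return best
```

Note how the penalty vanishes at the largest scale $j - i = n$, so wide windows are penalized least.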
27 4. Let $X_1, \dots, X_m$ be an independent sequence of Gaussian random variables with mean $EX_i = \mu_i$ and variance 1. We are interested in testing the null hypothesis $H_0: \mu_1 = \cdots = \mu_m$ against the alternative hypothesis that there exist $1 \le \tau_1 < \cdots < \tau_K \le m - 1$ such that
$$H_1: \mu_1 = \cdots = \mu_{\tau_1} \ne \mu_{\tau_1 + 1} = \cdots = \mu_{\tau_2} \ne \cdots \ne \mu_{\tau_K + 1} = \cdots = \mu_m.$$
28 4. (cont.) If $K = 1$, the log likelihood ratio statistic is
$$\max_{1 \le t \le m - 1} \frac{\dfrac{S_m - S_t}{m - t} - \dfrac{S_t}{t}}{\sqrt{\dfrac{1}{t} + \dfrac{1}{m - t}}}.$$
If $K > 1$, an appropriate statistic is
$$\max_{0 \le i < j < k \le m} \Bigg\{\frac{\dfrac{S_j - S_i}{j - i} - \dfrac{S_k - S_j}{k - j}}{\sqrt{\dfrac{1}{j - i} + \dfrac{1}{k - j}}}\Bigg\}.$$
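The $K = 1$ statistic compares the means of the two segments split at $t$, normalized by the standard error of their difference. A minimal sketch (function name ours; we take an absolute value so the scan covers two-sided alternatives, a variant of the displayed form):

```python
from math import sqrt

def changepoint_statistic(x):
    """max over 1 <= t <= m-1 of |S_t/t - (S_m - S_t)/(m-t)| / sqrt(1/t + 1/(m-t))."""
    m = len(x)
    s_m = sum(x)
    s_t = 0.0
    best = float("-inf")
    for t in range(1, m):
        s_t += x[t - 1]             # running prefix sum S_t
        diff = abs(s_t / t - (s_m - s_t) / (m - t))
        best = max(best, diff / sqrt(1 / t + 1 / (m - t)))
    return best
```

For `x = [0, 0, 1, 1]` the maximum is attained at the true split $t = 2$, where the segment means differ by 1 and the standard error is 1.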
29 Thank you!
More information8 Laws of large numbers
8 Laws of large numbers 8.1 Introduction We first start with the idea of standardizing a random variable. Let X be a random variable with mean µ and variance σ 2. Then Z = (X µ)/σ will be a random variable
More informationA Few Notes on Fisher Information (WIP)
A Few Notes on Fisher Information (WIP) David Meyer dmm@{-4-5.net,uoregon.edu} Last update: April 30, 208 Definitions There are so many interesting things about Fisher Information and its theoretical properties
More informationMath 576: Quantitative Risk Management
Math 576: Quantitative Risk Management Haijun Li lih@math.wsu.edu Department of Mathematics Washington State University Week 11 Haijun Li Math 576: Quantitative Risk Management Week 11 1 / 21 Outline 1
More informationStat 100a, Introduction to Probability.
Stat 100a, Introduction to Probability. Outline for the day: 1. Geometric random variables. 2. Negative binomial random variables. 3. Moment generating functions. 4. Poisson random variables. 5. Continuous
More informationMATH c UNIVERSITY OF LEEDS Examination for the Module MATH2715 (January 2015) STATISTICAL METHODS. Time allowed: 2 hours
MATH2750 This question paper consists of 8 printed pages, each of which is identified by the reference MATH275. All calculators must carry an approval sticker issued by the School of Mathematics. c UNIVERSITY
More information,..., θ(2),..., θ(n)
Likelihoods for Multivariate Binary Data Log-Linear Model We have 2 n 1 distinct probabilities, but we wish to consider formulations that allow more parsimonious descriptions as a function of covariates.
More informationPart IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015
Part IA Probability Definitions Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.
More informationThe Central Limit Theorem: More of the Story
The Central Limit Theorem: More of the Story Steven Janke November 2015 Steven Janke (Seminar) The Central Limit Theorem:More of the Story November 2015 1 / 33 Central Limit Theorem Theorem (Central Limit
More informationE cient Monte Carlo for Gaussian Fields and Processes
E cient Monte Carlo for Gaussian Fields and Processes Jose Blanchet (with R. Adler, J. C. Liu, and C. Li) Columbia University Nov, 2010 Jose Blanchet (Columbia) Monte Carlo for Gaussian Fields Nov, 2010
More informationACCOUNTING FOR INPUT-MODEL AND INPUT-PARAMETER UNCERTAINTIES IN SIMULATION. <www.ie.ncsu.edu/jwilson> May 22, 2006
ACCOUNTING FOR INPUT-MODEL AND INPUT-PARAMETER UNCERTAINTIES IN SIMULATION Slide 1 Faker Zouaoui Sabre Holdings James R. Wilson NC State University May, 006 Slide From American
More information6.041/6.431 Fall 2010 Quiz 2 Solutions
6.04/6.43: Probabilistic Systems Analysis (Fall 200) 6.04/6.43 Fall 200 Quiz 2 Solutions Problem. (80 points) In this problem: (i) X is a (continuous) uniform random variable on [0, 4]. (ii) Y is an exponential
More information