On Reaching Nirvana (a.k.a. on Reaching a Steady State) Moshe Pollak Joint work (in progress) with Tom Hope
2 What is Nirvana?
"Our father Jacob sought to dwell in tranquility" (Rashi on Genesis 37; Hebrew original: "ביקש יעקב אבינו לישב בשלוה")
...the Pursuit of Happiness (US Declaration of Independence)
The Big Mathematical Question:
3 "...and the crooked shall become level" (Isaiah). WHEN?????
4 [Plot of a simulated output series over time] Warm-up period? Where does it end?
5 Terms: warm-up, start-up, burn-in, transient state
6 Hoad (2008): 46 methods (ID, method, type):
1. Simple Time Series Inspection (Graphical)
2. Ensemble (Batch) Average Plots (Graphical)
3. Cumulative-Mean Rule (Graphical)
4. Deleting-The-Cumulative-Mean Rule (Graphical)
5. CUSUM Plots (Graphical)
6. Welch's Method (Graphical)
7. Variance Plots (or Gordon Rule) (Graphical)
8. Statistical Process Control Method (SPC) (Graphical)
9. Ensemble (Batch) Average Plots with Schribner's Rule (Heuristic)
10. Conway Rule or Forward Data-Interval Rule (Heuristic)
11. Modified Conway Rule or Backward Data-Interval Rule (Heuristic)
12. Crossing-Of-The-Mean Rule (Heuristic)
13. Autocorrelation Estimator Rule (Heuristic)
14. Marginal Confidence Rule or Marginal Standard Error Rule (MSER) (Heuristic)
15. Marginal Standard Error Rule m (e.g. m=5, MSER-5) (Heuristic)
16. Goodness-Of-Fit Test (Statistical)
17. Relaxation Heuristics (Heuristic)
18. Kelton and Law Regression Method (Statistical)
19. Randomisation Tests For Initialisation Bias (Statistical)
20. Schruben's Maximum Test (STS) (Initialisation Bias Test)
21. Schruben's Modified Test (Initialisation Bias Test)
22. Optimal Test (Brownian Bridge Process) (Initialisation Bias Test)
23. Rank Test (Initialisation Bias Test)
24. Batch Means Based Tests: Max Test (Initialisation Bias Test)
25. Batch Means Based Tests: Batch Means Test (Initialisation Bias Test)
26. Batch Means Based Tests: Area Test (Initialisation Bias Test)
27. Pawlikowski's Sequential Method (Hybrid)
28. Scale Invariant Truncation Point Method (SIT) (Hybrid)
29. Exponentially Weighted Moving Average Control Charts (Graphical)
30. Algorithm for a Static Dataset (ASD) (Statistical)
31. Algorithm for a Dynamic Dataset (ADD) (Statistical)
34. Telephone Network Rule (Heuristic; Ockerman & Goldsman)
Student's t-tests Method (Initialisation Bias Test; Ockerman & Goldsman t-test)
Compound Tests (Initialisation Bias Test)
Bias Deletion Rule (Statistical; Glynn & Iglehart)
Wavelet-based spectral method (WASSP) (Statistical)
Queueing approximations method (MSEASVT) (Statistical)
Chaos Theory Methods (methods M1 and M2) (Statistical)
Beck's Approach for Cyclic output (Heuristic)
Tocher's Cycle Rule (Heuristic)
43. Kimbler's Double exponential smoothing method (Heuristic)
Kalman Filter method (Statistical)
Euclidean Distance (ED) Method (Heuristic)
46. Neural Networks (NN) Method (Heuristic)
7 State of the Art: "Analytically tractable models that adequately capture steady-state response also may be unavailable, or at least difficult to determine with any degree of confidence or fidelity" (White and Robinson, 2010).
Special case: Glynn and Zhang (2010), renewal queues.
8 A Simple Case
X_1, X_2, ... independent, X_i ~ N(δ_i, 1), with δ_i increasing to δ.
Mathematically, Nirvana is never reached. We have to make our compromise with life.
9 The Compromise: there exists ε > 0 such that if we are within ε of δ, we pretend to be in steady-state.
10 A Model
In warm-up: X_1, X_2, ... ~ N(δ_i, 1), δ_i increasing, δ_i ≤ δ − ε.
In steady-state: X_1, X_2, ... ~ N(δ, 1).
X_1, X_2, ..., X_{ν−1} are in the warm-up period; X_ν, X_{ν+1}, ... are in steady-state.
11 Approach: a sequential test of hypotheses.
T = stopping time, at which we hope that we have already reached steady-state.
H_0: at time T we are not in steady-state. H_1: we are.
12 If the δ_i and δ were known
Λ_n^k = likelihood ratio of ν=k vs. ν=∞
= [ Π_{i=1..k−1} exp(−½(x_i − δ_i)²) · Π_{i=k..n} exp(−½(x_i − δ)²) ] / Π_{i=1..n} exp(−½(x_i − δ_i)²)
= exp( Σ_{i=k..n} (δ − δ_i) x_i − ½ Σ_{i=k..n} (δ² − δ_i²) )
= exp( Σ_{i=k..n} (δ − δ_i)(x_i − ½(δ + δ_i)) )
13 A comment: in some applications the δ_i are assumed to have a parametric form, e.g. δ_i = δ / (1 + e^{η − γi}). In this case, estimating η and γ yields estimates of the δ_i, to be substituted for the δ_i.
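This model is easy to simulate. A minimal sketch (function name hypothetical), using the logistic warm-up means δ_i = δ/(1 + e^{η − γi}) with the parameter values that appear in the later DELTA KNOWN example:

```python
import math
import random

def simulate_path(delta=50.0, eta=1.0, gamma=0.1, eps=0.1, n=400, seed=0):
    """Simulate X_i ~ N(delta_i, 1) with logistic warm-up means
    delta_i = delta / (1 + exp(eta - gamma*i)), i = 1..n."""
    rng = random.Random(seed)
    means = [delta / (1.0 + math.exp(eta - gamma * i)) for i in range(1, n + 1)]
    xs = [m + rng.gauss(0.0, 1.0) for m in means]
    # nu = first index i with delta - delta_i <= eps ("within eps of delta")
    nu = next(i + 1 for i, m in enumerate(means) if delta - m <= eps)
    return xs, nu

xs, nu = simulate_path()
print(nu)  # → 73, matching the nu reported in the DELTA KNOWN example
```

With these parameter values the first index within ε of δ is i = 73, which is the ν quoted later in the talk.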
14 Alternatively, represent the warm-up parameters δ_i by δ − ε. The resulting likelihood ratios are
Λ_n^k = exp( ε Σ_{i=k..n} (X_i − ½(2δ − ε)) )
15 Estimation: MLE ν̂_n = argmax_{1≤k≤n} Λ_n^k
Stopping time (CUSUM):
T_A = min{n : max_{1≤k≤n} Λ_n^k ≥ A}
= min{n : max_{1≤k≤n} ε Σ_{i=k..n} (X_i − ½(2δ − ε)) ≥ log A}
= min{n : max_{1≤k≤n} Σ_{i=k..n} (X_i − ½(2δ − ε)) ≥ (log A)/ε}
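Because max_{1≤k≤n} Σ_{i=k..n} c_i obeys the recursion S_n = max(S_{n−1}, 0) + c_n, the stopping time can be computed in one pass. A minimal sketch (names hypothetical), which also tracks the argmax ν̂ as the start of the current CUSUM cycle:

```python
import math

def cusum_stopping_time(xs, delta, eps, A):
    """T_A = min{n : max_{1<=k<=n} sum_{i=k..n} (X_i - (delta - eps/2)) >= log(A)/eps}.
    Also returns nu_hat, the maximizing k (start of the current CUSUM cycle)."""
    threshold = math.log(A) / eps
    s, nu_hat = 0.0, 1
    for n, x in enumerate(xs, start=1):
        if s < 0.0:              # cycle restarts: the best k moves up to n
            s, nu_hat = 0.0, n
        s += x - (delta - eps / 2.0)
        if s >= threshold:
            return n, nu_hat
    return None, nu_hat          # never crossed the boundary within the sample
```

With A = 1/β this computes the T_β used later in the talk, e.g. `cusum_stopping_time(xs, delta=50, eps=0.1, A=100)` on a simulated path.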
16 Confidence. Proposal: measure confidence of being in steady-state at time T by
P_steady-state( ν̂_T will forever remain the MLE of ν | F_T )
(if we were to continue taking observations).
17 Recall: T_A = min{n : max_{1≤k≤n} Σ_{i=k..n} (X_i − ½(2δ − ε)) ≥ (log A)/ε}
[Plot: the path Σ_{i=1..n} (X_i − ½(2δ − ε)) against n, with threshold (log A)/ε]
A CUSUM goes by cycles.
ν̂_T will not remain the MLE of ν forever ⟺ log Λ_{T+m}^{ν̂_T} goes below 0 for some m ⟺ Λ_T^{ν̂_T} / Λ_{T+m}^{ν̂_T} ≥ Λ_T^{ν̂_T} ≥ A for some m.
18 Meaning of Λ_T^{ν̂_T} / Λ_{T+m}^{ν̂_T} ≥ A
Λ_T^{ν̂_T} / Λ_{T+m}^{ν̂_T} is the likelihood ratio of X_{T+1}, ..., X_{T+m} for ν = ∞ vs. ν = ν̂_T.
When ν ≤ ν̂_T, the probability that this likelihood ratio will ever exceed A is less than 1/A.
Therefore, if we were to continue taking observations,
P_steady-state( ν̂_T will forever remain the MLE of ν | F_T ) ≥ 1 − 1/A.
Hence, if we set A = 1/β, we can guarantee with confidence 1 − β that we have reached steady-state.
Recall: H_0: at time T we are not in steady-state; H_1: we are.
Note: the power of the test is
P_steady-state( ν̂_T will forever remain the MLE of ν | F_T ) = P_{ν̂_T}( the likelihood ratio sequence of future observations will never exceed A | F_T ) ≥ 1 − 1/A.
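The 1/A bound is the supremum inequality for a nonnegative, mean-one likelihood-ratio martingale. A quick Monte Carlo sketch (illustrative parameter values, not from the talk): generate steady-state data X_i ~ N(δ, 1) and track the likelihood ratio of "mean δ − ε" vs. "mean δ" on future observations:

```python
import math
import random

def exceed_prob(A=10.0, delta=0.0, eps=0.5, horizon=1000, reps=2000, seed=1):
    """Estimate P(sup_m LR_m >= A), where LR_m is the likelihood ratio of
    X_1..X_m for N(delta - eps, 1) vs N(delta, 1), with X_i ~ N(delta, 1).
    Doob's inequality for this mean-one martingale bounds this by 1/A."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(reps):
        log_lr = 0.0
        for _ in range(horizon):
            x = rng.gauss(delta, 1.0)
            log_lr += -eps * (x - delta + eps / 2.0)  # log LR increment
            if log_lr >= math.log(A):
                hits += 1
                break
    return hits / reps

print(exceed_prob())  # empirically below the bound 1/A = 0.1
```

The estimate stays below 1/A; the truncation at a finite horizon only understates the supremum, so the bound is respected.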
19 Type I error
Definition: let T be a stopping time and
G(t) = sup{ P(T ≤ t) : P is a pre-steady-state probability on {X_1, X_2, ...} }.
Set t so that G(t) = α.
The p-value associated with stopping before having reached steady-state is G(T).
20 Recall: if δ_i ≡ δ − ε, then
Λ_n^k = exp( Σ_{i=k..n} (δ − δ_i) X_i − ½ Σ_{i=k..n} (δ² − δ_i²) )
= exp( ε Σ_{i=k..n} X_i − ½ (δ² − (δ − ε)²)(n − k + 1) )
= exp( ε [ Σ_{i=k..n} X_i − (δ − ½ε)(n − k + 1) ] )
T_β = min{n : max_{1≤k≤n} Λ_n^k ≥ 1/β}
= min{n : max_{1≤k≤n} [ Σ_{i=k..n} (X_i − δ_i) − Σ_{i=k..n} (δ − δ_i) + ½ε(n − k + 1) ] ≥ log(1/β)/ε}
≥ min{n : max_{1≤k≤n} [ Σ_{i=k..n} Z_i − Σ_{i=k..n} ε + ½ε(n − k + 1) ] ≥ log(1/β)/ε}   (Z_i = X_i − δ_i ~ N(0,1); δ − δ_i ≥ ε)
T_β is stochastically smallest when δ_i ≡ δ − ε.
p-value = G_β(T_β), where G_β is the cdf of T_β under δ_i ≡ δ − ε.
G_β(T_β) ≈ 1 − exp( −T_β / ARL2FA ), with ARL2FA = (2/ε²)( 1/β + log β − 1 ) + o(1).
21 DELTA KNOWN
* γ = 0.1; η = 1; δ = 50
* ε = 0.1; β = 1/100
* ν = 73; ν̂ = 81; stopping time T = 207
* Distance δ − δ_i at the stopping time = 1.4×10⁻⁷
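Plugging this run's values into the approximate p-value formula from slide 20 (a sketch; the ARL2FA expression is taken as stated there):

```python
import math

def approx_p_value(T, eps, beta):
    """p-value ~ 1 - exp(-T / ARL2FA), ARL2FA ~ (2/eps**2)*(1/beta + log(beta) - 1)."""
    arl2fa = (2.0 / eps ** 2) * (1.0 / beta + math.log(beta) - 1.0)
    return 1.0 - math.exp(-T / arl2fa)

print(round(approx_p_value(T=207, eps=0.1, beta=0.01), 4))  # → 0.0109
```

Here ARL2FA ≈ 18879, so stopping as early as T = 207 gives an approximate p-value of about 0.011.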
22 If δ is unknown
Define Y_i = X_{2i} − X_{2i−1} and Y_n^t = (Y_1, ..., Y_n). Here
Λ_n^k = exp{ −½ (Y_n − µ_n^{(k)})^t Σ⁻¹ (Y_n − µ_n^{(k)}) } / exp{ −½ (Y_n − µ_n^{(∞)})^t Σ⁻¹ (Y_n − µ_n^{(∞)}) }
where (µ_n^{(k)})^t = (µ_1, ..., µ_{k−1}, 0, ..., 0) and (µ_n^{(∞)})^t = (µ_1, ..., µ_n) are the expectations of the Y_i's.
Assumption: µ_i → 0 as i → ∞.
Representing the µ_i by ε (or by anything non-anticipating), here, too,
P_steady-state( Λ_n^{ν̂_{T_A}} ≤ 1 for some n > T_A | F_{T_A} ) ≤ 1/A.
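A sketch of how the pairing device turns the δ-unknown problem back into a CUSUM (my assumptions: independent unit-variance normal X's, so Var(Y_i) = 2; function names hypothetical). Representing the warm-up means µ_i by ε, the log-likelihood-ratio increment of "E Y = 0" vs. "E Y = ε" at Y_i is (ε/2)(ε/2 − Y_i):

```python
import math

def pair_differences(xs):
    """Y_i = X_{2i} - X_{2i-1}: expectation 0 in steady state regardless of
    the unknown delta; expectation mu_i >= 0 during a rising warm-up."""
    return [xs[2 * i + 1] - xs[2 * i] for i in range(len(xs) // 2)]

def stopping_time_delta_unknown(xs, eps, A):
    """CUSUM on the paired differences with warm-up means represented by eps.
    Stops when max_k (eps/2) * sum_{i=k..n} (eps/2 - Y_i) >= log(A)."""
    threshold = 2.0 * math.log(A) / eps
    s = 0.0
    for n, y in enumerate(pair_differences(xs), start=1):
        s = max(s, 0.0) + (eps / 2.0 - y)
        if s >= threshold:
            return 2 * n     # report on the original X time scale
    return None
```

The differencing removes the unknown δ entirely; only the decay of the pair means toward 0 is monitored.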
23 Type I error
Clearly
Λ_n^k = Π_{k≤m≤n} f_{ν=k}(Y_m | Y_{m−1}, ..., Y_1) / Π_{k≤m≤n} f_{ν=∞}(Y_m | Y_{m−1}, ..., Y_1).
One can show, for m ≥ k:
log( f_{ν=k}(Y_m | Y_{m−1}, ..., Y_1) / f_{ν=∞}(Y_m | Y_{m−1}, ..., Y_1) )
= ε[k(k−1) − m(m+1)] { Z_m + ((m−1)/m, ..., 1/m) Z − ε[k(k−1) + m(m+1)]/(4m) } / (2(m+1))
+ ε[k(k−1) − m(m+1)] ( µ_m + m⁻¹ Σ_{1≤i≤m−1} i µ_i ) / (2(m+1))
This is decreasing (separately) in each µ_i, so T_β is stochastically smallest when µ_i ≡ ε.
p-value = G_β(T_β), where G_β is the cdf of T_β.
24 DELTA UNKNOWN
* Same sequence as in the previous slide
* ε = 0.01; β = 1/100
* ν = 73; ν̂ = 358; stopping time T = 422
* Distance δ − δ_i at the stopping time
25 Non-normal data
Suppose X_i ~ f_{θ_i} in the warm-up period and X_i ~ f_{θ_0} in steady-state, where the X_i are stochastically increasing.
If θ_0 is known, a procedure analogous to the case of normally distributed data can be constructed.
If θ_0 is not known, such a procedure cannot easily be constructed.
A sometimes-solution: define Y_i = h(X_{2i}) − h(X_{2i−1}) for a suitably chosen h such that the Y_i are stochastically decreasing. In steady-state, E Y_i = 0. Proceed as in the case of known steady-state (based on {−Y_i}).
An alternative is to go nonparametric.
26 Nonparametrics
Y_i ~ f_i in warm-up and Y_i ~ f_0 in steady-state, such that in warm-up the Y_i are stochastically decreasing. f_i and f_0 are unknown, though f_0 is symmetric about 0.
Define: σ_i = 1(Y_i > 0); r_{i,n} = Σ_{m=1..n} 1(Y_m ≤ Y_i); Z_n = ((r_{1,1}, σ_1), (r_{2,2}, σ_2), ..., (r_{n,n}, σ_n)).
We need: a likelihood ratio for Z_n.
27 {Z_n} is invariant with respect to transformations of the Y_i that leave the σ_i and r_{i,n} intact.
Let F_DE be the cdf of the double exponential distribution and let F_0 be the cdf of the steady-state distribution. F_DE⁻¹(F_0(Y_i)) is such a transformation of Y_i, so w.l.o.g. assume that in steady-state Y_i ~ double exponential.
Represent the (transformed) warm-up distributions by
f(x) = p a e^{−ax} 1(x>0) + (1−p) b e^{bx} 1(x<0), where p > ½, a < 1 < b.
28 Lemma (Savage, 1956)
Let X_1, X_2, ..., X_n be iid Exponential(1) and let u_1, u_2, ..., u_n be positive constants. Then
P( X_1/u_1 < X_2/u_2 < ... < X_n/u_n ) = Π_{i=1..n} ( u_i / Σ_{m=i..n} u_m ).
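The lemma is easy to check by simulation. A small sketch (names hypothetical) comparing the closed form with a Monte Carlo estimate:

```python
import random

def savage_prob(u):
    """Closed form from the lemma: prod_i u_i / (u_i + ... + u_n)."""
    p = 1.0
    for i in range(len(u)):
        p *= u[i] / sum(u[i:])
    return p

def monte_carlo_prob(u, reps=200_000, seed=7):
    """Estimate P(X_1/u_1 < ... < X_n/u_n) for iid Exponential(1) X_i."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(reps):
        v = [rng.expovariate(1.0) / ui for ui in u]
        if all(v[i] < v[i + 1] for i in range(len(v) - 1)):
            hits += 1
    return hits / reps

u = [1.0, 2.0, 3.0]
print(savage_prob(u))        # 1/6 * 2/5 * 3/3 = 1/15
print(monte_carlo_prob(u))   # close to 1/15
```

For u = (1, 2, 3) the closed form gives (1/6)(2/5)(3/3) = 1/15, and the simulation agrees.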
29 Example
Consider Z_5 = ( (1,1), (1,0), (2,0), (4,1), (4,1) ). This is equivalent to Y_2 < Y_3 < 0 < Y_1 < Y_5 < Y_4.
If ν = 3: Y_1 ~ Exp(a), −Y_2 ~ Exp(b), and −Y_3, Y_4, Y_5 ~ Exp(1). Then
P_{ν=3}(Y_2 < Y_3 < 0 < Y_1 < Y_5 < Y_4)
= [ P(−Y_2 > −Y_3 > 0 | Y_2, Y_3 < 0 < Y_1, Y_4, Y_5) · P(0 < Y_1 < Y_5 < Y_4 | Y_2, Y_3 < 0 < Y_1, Y_4, Y_5) ] · P(Y_2, Y_3 < 0 < Y_1, Y_4, Y_5)
= [ P(Exp(1)/1 < Exp(1)/b) · P(Exp(1)/a < Exp(1)/1 < Exp(1)/1) ] · (1−p)·½·p·½·½
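The statistics Z_n of slide 26, and this example's Z_5, can be reproduced in a few lines. A sketch with hypothetical numbers chosen to satisfy Y_2 < Y_3 < 0 < Y_1 < Y_5 < Y_4:

```python
def sequential_stats(ys):
    """Z_n: pairs (r_{i,i}, sigma_i), with sigma_i = 1(Y_i > 0) and
    r_{i,i} = #{m <= i : Y_m <= Y_i}, the sequential rank of Y_i."""
    z = []
    for i, y in enumerate(ys):
        r = sum(1 for m in range(i + 1) if ys[m] <= y)
        z.append((r, 1 if y > 0 else 0))
    return z

ys = [1.0, -3.0, -2.0, 5.0, 2.0]   # satisfies Y_2 < Y_3 < 0 < Y_1 < Y_5 < Y_4
print(sequential_stats(ys))        # → [(1, 1), (1, 0), (2, 0), (4, 1), (4, 1)]
```

The output matches the Z_5 on this slide: sequential ranks and signs determine the ordering of the Y's relative to 0 and to each other.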
30 Type I error
One can show that if in warm-up the Y_i are stochastically decreasing, then T_β is stochastically smallest when
f_i(x) ≡ p a e^{−ax} 1(x>0) + (1−p) b e^{bx} 1(x<0).
p-value = G_β(T_β), where G_β is the cdf of T_β.
31 General stochastically decreasing processes
Data: a sequence X_1, X_2, ... that is stochastically decreasing towards steady-state. Regard these as fixed; do not observe them.
After the n-th data point, get someone to sample a point Y_n uniformly from X_1, X_2, ..., X_n, independently of past Y_i's.
Apply a nonparametric procedure to {Y_n}.
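A minimal sketch of this sampling scheme (function name hypothetical): the underlying X's are never observed directly; only the uniformly resampled Y's are.

```python
import random

def observe_stream(xs, seed=3):
    """After the n-th (unobserved) data point, draw Y_n uniformly from
    X_1..X_n, independently of the earlier Y's."""
    rng = random.Random(seed)
    return [xs[rng.randrange(n)] for n in range(1, len(xs) + 1)]

xs = [10.0, 8.0, 6.0, 5.0, 5.0, 5.0]   # stochastically decreasing toward 5
ys = observe_stream(xs)                # the sequence the nonparametric procedure sees
print(ys)
```

Note that Y_1 is necessarily X_1, and as n grows the uniform draw puts more and more weight on near-steady-state values, so the Y's inherit the stochastic decrease.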
More informationMAS223 Statistical Inference and Modelling Exercises
MAS223 Statistical Inference and Modelling Exercises The exercises are grouped into sections, corresponding to chapters of the lecture notes Within each section exercises are divided into warm-up questions,
More informationLecture 21: Convergence of transformations and generating a random variable
Lecture 21: Convergence of transformations and generating a random variable If Z n converges to Z in some sense, we often need to check whether h(z n ) converges to h(z ) in the same sense. Continuous
More informationEEC 686/785 Modeling & Performance Evaluation of Computer Systems. Lecture 19
EEC 686/785 Modeling & Performance Evaluation of Computer Systems Lecture 19 Department of Electrical and Computer Engineering Cleveland State University wenbing@ieee.org (based on Dr. Raj Jain s lecture
More informationRedacted for Privacy
AN ABSTRACT OF THE THESIS OF Lori K. Baxter for the degree of Master of Science in Industrial and Manufacturing Engineering presented on June 4, 1990. Title: Truncation Rules in Simulation Analysis: Effect
More informationStatistical Data Mining and Machine Learning Hilary Term 2016
Statistical Data Mining and Machine Learning Hilary Term 2016 Dino Sejdinovic Department of Statistics Oxford Slides and other materials available at: http://www.stats.ox.ac.uk/~sejdinov/sdmml Naïve Bayes
More informationPROBABILITY AND STOCHASTIC PROCESSES A Friendly Introduction for Electrical and Computer Engineers
PROBABILITY AND STOCHASTIC PROCESSES A Friendly Introduction for Electrical and Computer Engineers Roy D. Yates Rutgers, The State University ofnew Jersey David J. Goodman Rutgers, The State University
More informationSampling Distributions
Sampling Distributions In statistics, a random sample is a collection of independent and identically distributed (iid) random variables, and a sampling distribution is the distribution of a function of
More informationDesign of the Fuzzy Rank Tests Package
Design of the Fuzzy Rank Tests Package Charles J. Geyer July 15, 2013 1 Introduction We do fuzzy P -values and confidence intervals following Geyer and Meeden (2005) and Thompson and Geyer (2007) for three
More informationQuantifying Stochastic Model Errors via Robust Optimization
Quantifying Stochastic Model Errors via Robust Optimization IPAM Workshop on Uncertainty Quantification for Multiscale Stochastic Systems and Applications Jan 19, 2016 Henry Lam Industrial & Operations
More informationCPSC 540: Machine Learning
CPSC 540: Machine Learning MCMC and Non-Parametric Bayes Mark Schmidt University of British Columbia Winter 2016 Admin I went through project proposals: Some of you got a message on Piazza. No news is
More informationMotivation Scale Mixutres of Normals Finite Gaussian Mixtures Skew-Normal Models. Mixture Models. Econ 690. Purdue University
Econ 690 Purdue University In virtually all of the previous lectures, our models have made use of normality assumptions. From a computational point of view, the reason for this assumption is clear: combined
More informationOutline. Simulation of a Single-Server Queueing System. EEC 686/785 Modeling & Performance Evaluation of Computer Systems.
EEC 686/785 Modeling & Performance Evaluation of Computer Systems Lecture 19 Outline Simulation of a Single-Server Queueing System Review of midterm # Department of Electrical and Computer Engineering
More informationReview. December 4 th, Review
December 4 th, 2017 Att. Final exam: Course evaluation Friday, 12/14/2018, 10:30am 12:30pm Gore Hall 115 Overview Week 2 Week 4 Week 7 Week 10 Week 12 Chapter 6: Statistics and Sampling Distributions Chapter
More informationWill Murray s Probability, XXXII. Moment-Generating Functions 1. We want to study functions of them:
Will Murray s Probability, XXXII. Moment-Generating Functions XXXII. Moment-Generating Functions Premise We have several random variables, Y, Y, etc. We want to study functions of them: U (Y,..., Y n ).
More informationSTA414/2104 Statistical Methods for Machine Learning II
STA414/2104 Statistical Methods for Machine Learning II Murat A. Erdogdu & David Duvenaud Department of Computer Science Department of Statistical Sciences Lecture 3 Slide credits: Russ Salakhutdinov Announcements
More informationLecture Characterization of Infinitely Divisible Distributions
Lecture 10 1 Characterization of Infinitely Divisible Distributions We have shown that a distribution µ is infinitely divisible if and only if it is the weak limit of S n := X n,1 + + X n,n for a uniformly
More information