ACCOUNTING FOR INPUT-MODEL AND INPUT-PARAMETER UNCERTAINTIES IN SIMULATION <www.ie.ncsu.edu/jwilson> May 22, 2006


Slide 1

ACCOUNTING FOR INPUT-MODEL AND INPUT-PARAMETER UNCERTAINTIES IN SIMULATION

Faker Zouaoui, Sabre Holdings
James R. Wilson, NC State University
<www.ie.ncsu.edu/jwilson>
May 22, 2006

Slide 2

[Figure] From American Scientist, September-October 2004.

Slide 3

OVERVIEW

I. Introduction
   A. Objectives
   B. Structure of the Simulation Experiment
   C. Bayesian Model Averaging (BMA)
II. BMA-Based Simulation Replication Algorithm
   A. Variance Decomposition
   B. Replication-Allocation Procedures
   C. Output Analysis
III. Application to M/G/1 Queueing Simulation
IV. Conclusions and Recommendations

Slide 4

I. Introduction

A. Objectives

Formulation of a Bayesian approach to the selection of input models in stochastic simulation that accounts for 3 main sources of uncertainty:

1. Stochastic uncertainty arises from the dependence of the simulation output on the random numbers generated and used during each run.
2. Model uncertainty arises when choosing between different types of input models with different functional forms that adequately fit the available sample data or subjective information.
3. Parameter uncertainty arises when the parameters of the selected input model(s) are unknown and must be estimated from sample data or subjective information.

Evaluation of the performance of the Bayesian approach versus conventional frequentist techniques.

Slide 5

B. Structure of the Simulation Experiment

The output of interest y is an unknown function of the random-number input u, the input model M, and its parameter vector θ_M:

    y = y(u, M, θ_M).

Let

    η(M, θ_M) ≡ E_u[ y(u, M, θ_M) | M, θ_M ] = ∫ y(u, M, θ_M) du

denote the conditional expected value of y given M and θ_M.

Let M = {M_k : k = 1, ..., K} denote the set of adequate input models that fit the data X. The kth model M_k has prior probability p(M_k) and parameter vector θ_k with prior distribution p(θ_k | M_k).

Slide 6

We seek point and confidence-interval estimators of the posterior mean

    E(y | X) = Σ_{k=1}^K p(M_k | X) ∫ η(M_k, θ_k) p(θ_k | X, M_k) dθ_k,

where p(θ_k | X, M_k) is the posterior distribution of θ_k under model M_k and p(M_k | X) is the posterior probability of model M_k.

Slide 7

C. Bayesian Model Averaging (BMA)

The posterior model probabilities p(M_k | X) are computed as

    p(M_k | X) = p(M_k) p(X | M_k) / Σ_{j=1}^K p(M_j) p(X | M_j)   for k = 1, ..., K,   (1)

where the marginal data density given input model M_k is

    p(X | M_k) = ∫ p(X | M_k, θ_k) p(θ_k | M_k) dθ_k   for k = 1, ..., K.   (2)

The posterior distributions p(θ_k | X, M_k) are computed using Bayes' rule as

    p(θ_k | X, M_k) = p(θ_k | M_k) p(X | M_k, θ_k) / p(X | M_k)   for k = 1, ..., K.   (3)

Computational methods: numerical integration, asymptotic approximations, Markov chain Monte Carlo (MCMC) methods.

Slide 8

II. BMA-Based Simulation Replication Algorithm

for k = 1, ..., K
    set the input model M ← M_k
    for r = 1, ..., R_k
        generate the rth sample θ^r independently from p(θ | X, M)
        set the input-parameter vector θ ← θ^r
        for j = 1, ..., m
            generate the jth sample u_j of i.i.d. random numbers
            set the random-number input u ← u_j
            perform the jth simulation run using u, M, and θ
            calculate the simulation output response y_{krj} = y(u, M, θ)
        end loop
        compute ȳ_{kr} = Σ_{j=1}^m y_{krj} / m
    end loop
    compute the grand mean for the kth input model, ȳ_k = Σ_{r=1}^{R_k} ȳ_{kr} / R_k
end loop
compute the weighted mean β̂ = Σ_{k=1}^K p(M_k | X) ȳ_k to estimate E(y | X)
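The loop structure above translates directly into code. Below is a minimal Python sketch of the replication algorithm; the helper functions sample_posterior_params(k, rng) and run_simulation(k, theta, rng) are hypothetical stand-ins (not part of the original slides) for drawing θ from p(θ | X, M_k) and for executing one simulation run with fresh random numbers.

```python
# Minimal sketch of the BMA-Based Simulation Replication Algorithm (Slide 8).
import numpy as np

def bma_replication(post_model_probs, R, m, sample_posterior_params, run_simulation, seed=0):
    """Return per-model output arrays y[k][r, j], per-model grand means, and the BMA estimate of E(y | X)."""
    rng = np.random.default_rng(seed)
    K = len(post_model_probs)
    y = [np.empty((R[k], m)) for k in range(K)]          # y[k][r, j] = y_{krj}
    for k in range(K):                                   # loop over candidate input models
        for r in range(R[k]):                            # R_k posterior parameter samples
            theta = sample_posterior_params(k, rng)      # theta^r ~ p(theta | X, M_k)
            for j in range(m):                           # m runs per parameter sample
                y[k][r, j] = run_simulation(k, theta, rng)
    y_bar_k = np.array([y[k].mean() for k in range(K)])  # grand mean per input model
    beta_hat = float(np.dot(post_model_probs, y_bar_k))  # weighted mean, estimates E(y | X)
    return y, y_bar_k, beta_hat
```

The returned arrays y[k] are exactly what the variance-component estimators on Slide 12 and the confidence interval on Slide 18 operate on.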

Slide 9

A. Variance Decomposition

Basic Assumptions

The response from the jth run using the random-number input u_j, the kth input model M_k, and the rth sample of the associated input-model parameters θ_k^r is

    y_{krj} = y(u_j, M_k, θ_k^r) = η(M_k, θ_k^r) + e_j(u_j, M_k, θ_k^r)   (4)

for k = 1, ..., K; r = 1, ..., R_k; and j = 1, ..., m, where

    E_{u_j}[ e_j(u_j, M_k, θ_k^r) | X, M_k, θ_k^r ] = 0,   (5)
    Var_{u_j}[ e_j(u_j, M_k, θ_k^r) | X, M_k, θ_k^r ] = τ_k²,   (6)

so that

    E_{u_j}( y_{krj} | X, M_k, θ_k^r ) = η(M_k, θ_k^r),
    Var_{u_j}( y_{krj} | X, M_k, θ_k^r ) = τ_k²

for k = 1, ..., K; r = 1, ..., R_k; and j = 1, ..., m.

Slide 10

We take

    η(M_k, θ_k^r) = β_k + δ_{kr}(M_k, θ_k^r)   (7)

for k = 1, ..., K and r = 1, ..., R_k, where

    β_k ≡ E_{θ_k^r}[ η(M_k, θ_k^r) | X, M_k ] = ∫ η(M_k, θ_k) p(θ_k | X, M_k) dθ_k = E(y | X, M_k)

is the posterior mean response given the kth input model for k = 1, ..., K, so that we have

    E_{θ_k^r}[ δ_{kr}(M_k, θ_k^r) | X, M_k ] = 0,   (8)
    Var_{θ_k^r}[ δ_{kr}(M_k, θ_k^r) | X, M_k ] = σ_k²   (9)

for k = 1, ..., K.

Slide 11

Main Variance-Decomposition Result

For simplicity, let p_k ≡ p(M_k | X) denote the kth posterior model probability for k = 1, ..., K. Based on assumptions (4)-(9), the posterior variance of y has the decomposition

    Var(y | X) = Σ_{k=1}^K p_k (β_k - β)²   [from model uncertainty]
               + Σ_{k=1}^K p_k σ_k²         [from parameter uncertainty]
               + Σ_{k=1}^K p_k τ_k²         [from stochastic uncertainty],

where β ≡ E(y | X) = Σ_{k=1}^K p_k β_k is the posterior mean response.

Slide 12

Estimating the Variance Components

To estimate the {σ_k²}, we also assume

    Cov_{u_j, θ_k^r}[ η(M_k, θ_k^r), e_j(u_j, M_k, θ_k^r) | X, M_k ] = 0   (10)

for k = 1, ..., K; r = 1, ..., R_k; and j = 1, ..., m.

Using (10), we estimate {β_k}, {τ_k²}, and {σ_k²} from the output of the BMA-Based Simulation Replication Algorithm as follows:

    β̂_k = ȳ_k,

    τ̂_k² = [ 1 / (R_k (m - 1)) ] Σ_{r=1}^{R_k} Σ_{j=1}^m (y_{krj} - ȳ_{kr})²,

    σ̂_k² = [ 1 / (R_k - 1) ] Σ_{r=1}^{R_k} (ȳ_{kr} - ȳ_k)² - τ̂_k² / m

for k = 1, ..., K.
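As a rough illustration, the estimators β̂_k, τ̂_k², and σ̂_k² above can be computed from the (R_k × m) output array of one input model as follows; this is a sketch under assumptions (4)-(10), not code from the original work.

```python
# Sketch of the variance-component estimators on Slide 12 for one input model.
import numpy as np

def variance_components(y_k):
    """y_k has shape (R_k, m); returns (beta_hat_k, tau2_hat_k, sigma2_hat_k)."""
    R_k, m = y_k.shape
    y_bar_kr = y_k.mean(axis=1)                                    # within-replication means
    beta_hat_k = y_bar_kr.mean()                                   # grand mean, estimates beta_k
    tau2_hat_k = ((y_k - y_bar_kr[:, None]) ** 2).sum() / (R_k * (m - 1))   # stochastic variance
    sigma2_hat_k = ((y_bar_kr - beta_hat_k) ** 2).sum() / (R_k - 1) - tau2_hat_k / m
    # Note: sigma2_hat_k is a moment-type estimator and can be slightly negative
    # in small samples; implementations often truncate it at zero.
    return beta_hat_k, tau2_hat_k, sigma2_hat_k
```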

Slide 13

Estimating the Posterior Mean

Based on the {β̂_k : k = 1, ..., K}, we estimate the posterior mean response E(y | X) as

    β̂ = Σ_{k=1}^K p_k β̂_k.

Estimating the Posterior Variance

Based on the {τ̂_k², σ̂_k² : k = 1, ..., K}, we estimate the posterior response variance Var(y | X) as

    V̂ar(y | X) = Σ_{k=1}^K p_k (β̂_k - β̂)² + Σ_{k=1}^K p_k σ̂_k² + Σ_{k=1}^K p_k τ̂_k² = V̂_mod + V̂_par + V̂_sto.

Slide 14

B. Replication-Allocation Procedures

Optimal Allocation Procedure (OAP)

Decision Variables: the sample sizes {R_k : k = 1, ..., K} that are allocated to the input models {M_k : k = 1, ..., K}.

Objective Function: the variance of the posterior mean estimator is

    Var(β̂) = Σ_{k=1}^K p_k² [ σ_k² / R_k + τ_k² / (m R_k) ].

Optimization Problem:

    min_{R_k : 1 ≤ k ≤ K}  Σ_{k=1}^K p_k² [ σ_k² + τ_k² / m ] / R_k
    subject to:  Σ_{k=1}^K R_k = N/m ≡ N̄.   (11)
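A small sketch of how the per-model estimates combine into the BMA point estimate and the three-part variance decomposition described above (the array names are illustrative, not from the slides):

```python
# Sketch of the BMA point estimate and posterior-variance decomposition (Slide 13).
import numpy as np

def bma_summary(p, beta_hat, sigma2_hat, tau2_hat):
    """p, beta_hat, sigma2_hat, tau2_hat are length-K arrays; returns beta_hat_bma and (V_mod, V_par, V_sto)."""
    p = np.asarray(p, dtype=float)
    beta_hat = np.asarray(beta_hat, dtype=float)
    beta_bma = float(np.dot(p, beta_hat))                       # estimate of E(y | X)
    V_mod = float(np.dot(p, (beta_hat - beta_bma) ** 2))        # model uncertainty
    V_par = float(np.dot(p, sigma2_hat))                        # parameter uncertainty
    V_sto = float(np.dot(p, tau2_hat))                          # stochastic uncertainty
    return beta_bma, (V_mod, V_par, V_sto)
```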

Slide 15

Optimal Allocation Scheme

If ψ_k ≡ σ_k² + τ_k² / m, then solving (11) yields the optimal replication counts

    R_k* = N̄ p_k √ψ_k / Σ_{i=1}^K p_i √ψ_i   for k = 1, ..., K.   (12)

Approximately Optimal Allocation Scheme

Make a small, equal number of pilot runs using each input model M_k for k = 1, ..., K.
Estimate ψ_k by ψ̂_k = σ̂_k² + τ̂_k² / m for k = 1, ..., K.
Allocate the rest of the runs according to (12).

Proportional Allocation Procedure (PAP)

One easily implemented solution to (11) is the proportional allocation scheme

    R_k = p_k N̄   for k = 1, ..., K,   (13)

yielding the weighted mean response β̂_pa computed over N runs.

Slide 16

Simple Random Sampling (SRS) Procedure

At the start of each run, randomly sample a new input model and its input-parameter vector from their respective posterior distributions; and compute the (unweighted) average response β̂_srs over N independent runs.

Variance-Reduction Result

    Var(β̂_srs | X) = Var(β̂_pa | X) + (1/N) Σ_{k=1}^K p_k [ (β_k - β)² + (m - 1) σ_k² ].
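The two allocation rules can be sketched as follows, with N_bar = N/m denoting the total number of posterior parameter samples to be allocated; the integer rounding is an implementation choice, not part of Eqs. (12)-(13).

```python
# Sketch of the optimal (Eq. 12) and proportional (Eq. 13) allocation rules.
import numpy as np

def optimal_allocation(p, psi_hat, N_bar):
    """R_k* proportional to p_k * sqrt(psi_k), rounded to integers that sum to about N_bar."""
    w = np.asarray(p) * np.sqrt(psi_hat)
    return np.maximum(1, np.rint(N_bar * w / w.sum())).astype(int)

def proportional_allocation(p, N_bar):
    """R_k = p_k * N_bar, with at least one replication per candidate model."""
    return np.maximum(1, np.rint(N_bar * np.asarray(p))).astype(int)
```

In practice the approximately optimal scheme would call optimal_allocation with ψ̂_k estimated from a small number of pilot runs per model, as described above.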

Slide 17

C. Output Analysis

Variance Estimator

    V̂ar(β̂) = Σ_{k=1}^K p_k² ( σ̂_k² / R_k + τ̂_k² / (m R_k) ) = Σ_{k=1}^K p_k² V̂_k,   (14)

where

    V̂_k = Σ_{r=1}^{R_k} (ȳ_{kr} - ȳ_k)² / [ R_k (R_k - 1) ]   for k = 1, ..., K.

Satterthwaite Approximation: (14) is approximately chi-squared with effective degrees of freedom

    f_eff = ( Σ_{k=1}^K p_k² V̂_k )² / Σ_{k=1}^K [ (p_k²)² V̂_k² / (R_k - 1) ].   (15)

Slide 18

Approximate Confidence Interval

An approximate 100(1 - α)% confidence interval for β is

    Σ_{k=1}^K p_k ȳ_k ± t_{1-α/2, f_eff} ( Σ_{k=1}^K p_k² V̂_k )^{1/2}.   (16)

Whether or not assumption (10) is satisfied, (16) is still an approximately valid 100(1 - α)% confidence interval for β; and we have

    E[ R_k V̂_k ] = ψ_k   for k = 1, ..., K,

as required in the optimal allocation scheme (12).
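A sketch of the interval (16) with the Satterthwaite degrees of freedom (15), assuming SciPy is available for the Student-t quantile:

```python
# Sketch of the approximate 100(1 - alpha)% confidence interval of Slides 17-18.
import numpy as np
from scipy import stats

def bma_confidence_interval(p, y_bar_k, V_hat, R, alpha=0.10):
    """p, y_bar_k, V_hat, R are length-K arrays; returns (lower, upper) for beta."""
    p, V_hat, R = (np.asarray(a, dtype=float) for a in (p, V_hat, R))
    point = float(np.dot(p, y_bar_k))                               # weighted mean of model means
    var_hat = float(np.dot(p ** 2, V_hat))                          # Eq. (14)
    f_eff = var_hat ** 2 / np.sum(p ** 4 * V_hat ** 2 / (R - 1))    # Eq. (15)
    half_width = stats.t.ppf(1 - alpha / 2, f_eff) * np.sqrt(var_hat)
    return point - half_width, point + half_width                   # Eq. (16)
```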

Slide 19

III. Application to M/G/1 Queueing Simulation

Available system information:

Poisson arrivals with unknown arrival rate λ; and service times randomly sampled from an unknown distribution, with gamma and lognormal input models equally plausible.

True system configuration:

Arrival rate λ_0 = 5.7 customers per time unit.

Service times randomly sampled from the Pareto c.d.f.

    G_X(x) = 0 for x < ξ,   and   G_X(x) = 1 - (ξ/x)^ω for x ≥ ξ,

where ξ = 0.1 and ω = 3.5, so that the server utilization is 80%.

The Pollaczek-Khintchine formula implies that the mean queue waiting time is

    β_0 = λ_0 ξ² ω (ω - 1) / [ 2 (ω - 2)(ω - 1 - λ_0 ξ ω) ] ≈ 0.33 time units.

Input model for interarrival times:

    M_{1,1}: p(x_{1,i} | λ_{1,1}) = λ_{1,1} exp(-λ_{1,1} x_{1,i})   for x_{1,i} ≥ 0,

where λ_{1,1} was unknown; and we observed a sample data set of size n = 1,000 from the true interarrival-time distribution.

Slide 20

Input models for service times:

Gamma distribution:

    M_{2,1}: p(x_{2,i} | M_{2,1}, α_{2,1}, μ_{2,1}) = [ μ_{2,1}^{α_{2,1}} / Γ(α_{2,1}) ] x_{2,i}^{α_{2,1} - 1} exp(-μ_{2,1} x_{2,i})   for x_{2,i} > 0.

Lognormal distribution:

    M_{2,2}: p(x_{2,i} | M_{2,2}, μ_{2,2}, σ_{2,2}²) = [ 1 / (x_{2,i} √(2π) σ_{2,2}) ] exp{ -[ln(x_{2,i}) - μ_{2,2}]² / (2 σ_{2,2}²) }   for x_{2,i} > 0.

We take prior model probabilities p(M_{1,1}) = 1 and p(M_{2,k}) = 1/2 for k = 1, 2.
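As a quick numerical check of the true-system quantities above (assuming the stated Pareto service-time distribution and Poisson arrivals), the utilization and the Pollaczek-Khintchine mean wait can be computed directly:

```python
# Check of the true-system quantities on Slide 19.
lam0, xi, omega = 5.7, 0.1, 3.5

mean_service = omega * xi / (omega - 1)            # E[S] for Pareto(xi, omega)
second_moment = omega * xi ** 2 / (omega - 2)      # E[S^2]
rho = lam0 * mean_service                          # server utilization, about 0.80
beta0 = lam0 * second_moment / (2 * (1 - rho))     # Pollaczek-Khintchine mean wait
print(rho, beta0)                                  # roughly 0.798 and 0.33 time units
```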

Slide 21

Setup for Using the BMA-Based Simulation Replication Algorithm

For the qth input process (where q = 1, 2), we used a training sample z_q = {z_{q,1}, ..., z_{q,T}} of size T = 100 to obtain proper prior distributions for the corresponding parameter vectors; these priors are used to calculate the marginal data densities (2).

For the qth input process (where q = 1, 2), we used a regular sample x_q = {x_{q,1}, ..., x_{q,n}} of size n = 1,000 to obtain the posterior distributions (3) for the corresponding parameter vectors.

Slide 22

Complete input models for the M/G/1 queueing simulation

Input model M_1 = (M_{1,1}, M_{2,1}) has prior probability

    p(M_1) = p(M_{1,1}) p(M_{2,1}) = 1/2

and input-parameter vector θ_1 = (λ_{1,1}, α_{2,1}, μ_{2,1}); and given M_1 and θ_1, the data X = (x_1, x_2) have the joint conditional p.d.f.

    p(X | M_1, θ_1) = Π_{i=1}^n λ_{1,1} exp(-λ_{1,1} x_{1,i}) · [ μ_{2,1}^{α_{2,1}} / Γ(α_{2,1}) ] x_{2,i}^{α_{2,1} - 1} exp(-μ_{2,1} x_{2,i}).

Slide 23

Input model M_2 = (M_{1,1}, M_{2,2}) has prior probability

    p(M_2) = p(M_{1,1}) p(M_{2,2}) = 1/2

and input-parameter vector θ_2 = (λ_{1,1}, μ_{2,2}, σ_{2,2}²); and given M_2 and θ_2, the data X = (x_1, x_2) have the joint conditional p.d.f.

    p(X | M_2, θ_2) = Π_{i=1}^n λ_{1,1} exp(-λ_{1,1} x_{1,i}) · [ 1 / (x_{2,i} √(2π) σ_{2,2}) ] exp{ -[ln(x_{2,i}) - μ_{2,2}]² / (2 σ_{2,2}²) }.

Slide 24

Calculation of Data Densities

Marginal Density of Interarrival-Time Data

    p(x_1 | M_{1,1}) = Γ(n + T) ( Σ_{t=1}^T z_{1,t} )^T / { Γ(T) ( Σ_{t=1}^T z_{1,t} + Σ_{i=1}^n x_{1,i} )^{n+T} }.

Marginal Density of Service-Time Data Given the Gamma Input Model

    p(x_2 | M_{2,1}) = Γ(n ν_{2,1} + T ν_{2,1}) ( Π_{i=1}^n x_{2,i} )^{ν_{2,1} - 1} ( Σ_{t=1}^T z_{2,t} )^{T ν_{2,1}} / { Γ(T ν_{2,1}) [ Γ(ν_{2,1}) ]^n ( Σ_{t=1}^T z_{2,t} + Σ_{i=1}^n x_{2,i} )^{n ν_{2,1} + T ν_{2,1}} },

where ν_{2,1} = z̄_2² / S_{z_2}², and z̄_2 and S_{z_2}² denote, respectively, the mean and variance of the service-time training sample z_2.
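The gamma-model marginal density above is convenient to evaluate on the log scale. The sketch below follows the reconstructed formula for p(x_2 | M_{2,1}) and then converts log marginals into posterior model probabilities via (1); it is an illustrative implementation, not code from the original study, and the lognormal-model marginal would be handled analogously.

```python
# Sketch of the marginal-density and posterior-model-probability calculation (Slides 24 and 26).
import numpy as np
from scipy.special import gammaln

def log_marginal_gamma_service(x2, z2):
    """log p(x_2 | M_{2,1}) with shape nu fixed at zbar^2 / S_z^2 (method of moments)."""
    n, T = len(x2), len(z2)
    nu = np.mean(z2) ** 2 / np.var(z2, ddof=1)
    return (gammaln(n * nu + T * nu) - gammaln(T * nu) - n * gammaln(nu)
            + (nu - 1) * np.sum(np.log(x2)) + T * nu * np.log(np.sum(z2))
            - (n * nu + T * nu) * np.log(np.sum(z2) + np.sum(x2)))

def posterior_model_probs(log_marginals, priors):
    """Eq. (1): prior-weighted marginal densities, normalized; computed stably on the log scale."""
    log_marginals = np.asarray(log_marginals, dtype=float)
    w = np.asarray(priors, dtype=float) * np.exp(log_marginals - log_marginals.max())
    return w / w.sum()
```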

Slide 25

Marginal Density of Service-Time Data Given the Lognormal Input Model

    p(x_2 | M_{2,2}) = π^{-n/2} [ T / (T + n) ]^{1/2} { Γ[(n + T - 1)/2] / Γ[(T - 1)/2] } [ (T - 1) S_{v_2}² ]^{(T-1)/2}
                       / { [ (n + T - 1) S² ]^{(n+T-1)/2} Π_{i=1}^n x_{2,i} },

where

    S² = [ (T - 1) S_{v_2}² + (n - 1) S_{w_2}² + n T (w̄_2 - v̄_2)² / (n + T) ] / (n + T - 1),

w̄_2 and S_{w_2}² denote, respectively, the sample mean and variance of the logged regular sample {w_{2,i} ≡ ln(x_{2,i}) : i = 1, ..., n}, and v̄_2 and S_{v_2}² denote, respectively, the sample mean and variance of the logged training sample {v_{2,t} ≡ ln(z_{2,t}) : t = 1, ..., T}.

Slide 26

Calculation of Posterior Densities

Posterior model probabilities are calculated from the marginal data densities via (1).

Input model M_{1,1} for interarrival times: p(λ_{1,1} | x_1, M_{1,1}) is a gamma p.d.f. with shape parameter n and scale parameter 1 / ( Σ_{i=1}^n x_{1,i} ).

Input model M_{2,1} for service times: We took α̂_{2,1} = x̄_2² / S_{x_2}² as if it were the true value of α_{2,1}, where x̄_2 and S_{x_2}² denote, respectively, the mean and variance of x_2; then p(μ_{2,1} | x_2, M_{2,1}) is a gamma p.d.f. with shape parameter n α̂_{2,1} and scale parameter 1 / ( Σ_{i=1}^n x_{2,i} ).
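For the gamma-type posteriors on Slide 26, drawing posterior parameter samples reduces to standard gamma sampling. The sketch below is illustrative and assumes those posterior forms; the lognormal-model posteriors on Slide 27 (inverse-gamma for σ_{2,2}² and generalized Student's t for μ_{2,2}) would be sampled analogously.

```python
# Sketch of posterior parameter draws for the exponential interarrival model and
# the gamma service-time model (Slide 26).
import numpy as np

def sample_gamma_model_params(x1, x2, rng):
    """Return one posterior draw (lambda_{1,1}, alpha_hat_{2,1}, mu_{2,1})."""
    lam = rng.gamma(shape=len(x1), scale=1.0 / np.sum(x1))             # lambda | x_1 ~ Gamma(n, 1/sum x_1)
    alpha_hat = np.mean(x2) ** 2 / np.var(x2, ddof=1)                  # shape treated as known
    mu = rng.gamma(shape=len(x2) * alpha_hat, scale=1.0 / np.sum(x2))  # mu | x_2 ~ Gamma(n*alpha_hat, 1/sum x_2)
    return lam, alpha_hat, mu
```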

Slide 27

Input model M_{2,2} for service times:

p(σ_{2,2}² | x_2, M_{2,2}) is an inverse-gamma p.d.f. with shape parameter (n - 1)/2 and scale parameter Σ_{i=1}^n (w_{2,i} - w̄_2)² / 2.

The posterior p.d.f. of μ_{2,2} is a generalized Student's t-distribution,

    p(μ_{2,2} | x_2, M_{2,2}) = { Γ(n/2) / ( Γ[(n - 1)/2] S_{w_2} ) } √{ n / [(n - 1) π] } [ 1 + n (μ_{2,2} - w̄_2)² / ( (n - 1) S_{w_2}² ) ]^{-n/2}.

Slide 28

Layout of the Simulation Experiments

Each run generated a sequence of C customer waiting times, with the first C_0 = 10,000 waiting times deleted to eliminate start-up effects.

The interarrival times {A_i} and service times {X_i} are sampled by inversion; and the associated waiting times {W_i} are computed by taking W_1 = 0 and

    W_i = max{ W_{i-1} + X_{i-1} - A_i, 0 }   for i = 2, 3, ....

On the jth simulation run using the kth input model M_k and the rth random sample θ_k^r from p(θ_k | X, M_k), the simulation response was

    y_{krj} = y(u_j, M_k, θ_k^r) = [ 1 / (C - C_0) ] Σ_{i=C_0+1}^C W_i   (17)

for k = 1, ..., K; r = 1, ..., R_k; and j = 1, ..., m, where we took K = 2, R_1 = R_2 = 100, and m = 10.
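A single simulation run as laid out above can be sketched with the Lindley recursion; service_inv_cdf is a user-supplied (hypothetical) inverse c.d.f. for the fitted service-time model, and C, C0 follow the slide's notation.

```python
# Sketch of one M/G/1 run per Slide 28: inversion sampling, Lindley recursion,
# and the truncated mean waiting time of Eq. (17).
import numpy as np

def mg1_run(lam, service_inv_cdf, C, C0, rng):
    A = -np.log(rng.random(C)) / lam            # exponential interarrival times by inversion
    X = service_inv_cdf(rng.random(C))          # service times by inversion
    W = np.zeros(C)                             # W[0] corresponds to W_1 = 0
    for i in range(1, C):                       # Lindley recursion W_i = max(W_{i-1} + X_{i-1} - A_i, 0)
        W[i] = max(W[i - 1] + X[i - 1] - A[i], 0.0)
    return W[C0:].mean()                        # average of the last C - C0 waiting times
```

For example, with the fitted gamma service-time model one might pass lambda u: stats.gamma.ppf(u, a=alpha, scale=1.0/mu) from scipy.stats as service_inv_cdf.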

Slide 29

We performed a metaexperiment consisting of 200 independent replications of a basic simulation experiment with the following steps:

a) We generated the training samples z_1, z_2 and data samples x_1, x_2 independently to obtain the required posterior input-model and input-parameter distributions;

b) Using the BMA-Based Simulation Replication Algorithm, we executed K R_1 m = 2,000 runs of the simulation, with the truncated mean waiting time (17) computed on each run; and

c) From the results of step b), we constructed a nominal 90% confidence interval for the steady-state mean waiting time β_0 based on (16).

Slide 30

Posterior-probability, mean, stochastic-variance, and parameter-variance estimates [ p(M_{2,k} | x_2), β̂_k, τ̂_k², σ̂_k² ] were tabulated for each candidate service-time input model: k = 1 (gamma) and k = 2 (lognormal).

Slide 31

Mean absolute percentage error (MAPE), mean squared error (MSE), and standard error of the MSE, SE(MSE), were tabulated for each approach's estimator of the average waiting time in the queue, where MAPE = 100 E[ |β̂ - β_0| / β_0 ]% and MSE = E[ (β̂ - β_0)² ]. The approaches compared were: classical frequentist (gamma and lognormal service-time models), partial Bayes (gamma and lognormal service-time models), and the BMA, BMA + PAP, and BMA + OAP procedures (mixture of service-time models).

Slide 32

Performance of the nominal 90% confidence intervals for the average waiting time was tabulated in terms of CIL, the sample average confidence-interval length; CV(CIL), the sample coefficient of variation of the CI length; and coverage, for the same seven approaches: classical frequentist (gamma, lognormal), partial Bayes (gamma, lognormal), BMA, BMA + PAP, and BMA + OAP.

Slide 33

Summary of Key Results for the M/G/1 Queueing Simulation

In terms of posterior probabilities, the lognormal service-time model M_{2,2} was slightly better than the gamma model M_{2,1}.

The point-estimator accuracy for the BMA approach was much better than for the classical frequentist and partial Bayes approaches.

From the variance estimates V̂_mod, V̂_par, and V̂_sto, we obtained the following decomposition of the posterior response variance: 2% is due to stochastic uncertainty, 80% is due to parameter uncertainty, and 18% is due to model uncertainty.

In comparison with the classical frequentist approach, the BMA approach delivered confidence intervals with much higher coverage probabilities.

Slide 34

The BMA-based proportional-allocation (PAP) and optimal-allocation (OAP) procedures delivered confidence intervals with slightly better coverage probabilities than those resulting from the equal-allocation BMA approach.

In other applications with highly disparate posterior model probabilities, we observed large improvements in the accuracy and reliability of confidence intervals based on BMA + PAP or BMA + OAP.

Slide 35

IV. Conclusions and Recommendations

Main Results

We developed a BMA-based framework for simulation input modeling that is designed to handle input-model and input-parameter uncertainties as well as the conventional stochastic uncertainty.

We developed a BMA-Based Simulation Replication Algorithm for estimating the posterior mean response and for assessing the sources of variability in the simulation output.

We formulated an approximate confidence interval for the posterior mean response.

Failure to account for input-model and input-parameter uncertainties can result in misleading output performance measures.

Slide 36

Recommendations for Future Research

Extension of the response-surface model (4)-(9) to eliminate the restrictive assumption (6), since in general the residual variance about the response surface, τ_k², does depend on the parameter vector θ_k.

Development of correlated sampling schemes for improving the efficiency of the BMA-Based Simulation Replication Algorithm.

Implementation of a comprehensive experimental performance evaluation of Bayesian techniques for simulation input modeling.

Implementation of a user-friendly software tool that enables (nearly) routine use of Bayesian techniques in simulation input modeling and output analysis.
