Session 2B: Some basic simulation methods

1 Session 2B: Some basic simulation methods
John Geweke
Bayesian Econometrics and its Applications
August 14, 2012

2-3 Motivation
We cannot access $p(\omega \mid y^{o}, A)$ analytically. Instead, simulate $\omega^{(m)} \sim p(\omega \mid y^{o}, A)$, $m = 1, \ldots, M$, and use the draws to approximate
posterior moments $E\left[h(\omega) \mid y^{o}, A\right]$ and
Bayes actions $\widehat{a} = \arg\min_{a \in A} E\left[L(a, \omega) \mid y^{o}, A\right]$.

4-5 Notation
The methods we discuss today can be used for a probability distribution of interest, $I$. To this end:
$\theta^{(m)} \sim p(\theta \mid I)$, $\omega^{(m)} \sim p\left(\omega \mid \theta^{(m)}, I\right)$.

6-7 The setup
Suppose the following is possible:
$\theta^{(m)} \stackrel{iid}{\sim} p(\theta \mid I)$, $\omega^{(m)} \stackrel{iid}{\sim} p\left(\omega \mid \theta^{(m)}, I\right)$.
Our setup (Theorem 4.1.1): $\left\{\theta^{(m)}, \omega^{(m)}\right\}$ is i.i.d.; $\theta^{(m)} \in \Theta$, $\omega^{(m)} \in \Omega$, $\theta^{(m)} \sim p(\theta \mid I)$; $h : \Omega \rightarrow \mathbb{R}^{1}$.
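The slides contain no code, but the two-stage direct-sampling setup above is easy to sketch. The Python fragment below uses a hypothetical pair of distributions (a normal distribution of interest for $\theta$ and a conditional normal for $\omega$ given $\theta$) and a hypothetical function $h$; these are illustrative assumptions, not part of the slides.

    import numpy as np

    rng = np.random.default_rng(0)
    M = 100_000

    # Hypothetical distribution of interest I: theta ~ N(1, 0.5^2),
    # omega | theta, I ~ N(theta, 1); h(omega) = omega^2.
    theta = rng.normal(1.0, 0.5, size=M)     # theta^(m) ~ p(theta | I)
    omega = rng.normal(theta, 1.0)           # omega^(m) ~ p(omega | theta^(m), I)
    h_draws = omega**2                       # h(omega^(m))

    # The pairs {theta^(m), omega^(m)} are i.i.d., and marginally
    # omega^(m) ~ p(omega | I), as in result (a) below.

The same hypothetical setup is reused in the later sketches.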

8-9 Some conditions
First moment condition: $E[h(\omega) \mid I] = \overline{h}$;
Second moment condition: $\mathrm{var}[h(\omega) \mid I] = \sigma^{2}$;
Probability mass condition: for given $p \in (0, 1)$, there is a unique $h_{p}$ such that the statements $P[h(\omega) \leq h_{p} \mid I] \geq p$ and $P[h(\omega) \geq h_{p} \mid I] \geq 1 - p$ are both true;
Positive p.d.f. condition: for the unique $h_{p}$ corresponding to $p$, $p[h(\omega) = h_{p} \mid I] > 0$.

10-11 Given the setup
First result (a): $\omega^{(m)} \stackrel{iid}{\sim} p(\omega \mid I)$ whether or not any of conditions 1-4 are true.
Second result (b): Given the first moment condition,
$\overline{h}^{(M)} = M^{-1} \sum_{m=1}^{M} h\left(\omega^{(m)}\right) \stackrel{a.s.}{\rightarrow} \overline{h}$.

12-13 Third result (c)
Given the first and second moment conditions,
$M^{1/2}\left(\overline{h}^{(M)} - \overline{h}\right) \stackrel{d}{\rightarrow} N\left(0, \sigma^{2}\right)$
and
$\widehat{\sigma}^{2(M)} = M^{-1} \sum_{m=1}^{M} \left[h\left(\omega^{(m)}\right) - \overline{h}^{(M)}\right]^{2} \stackrel{a.s.}{\rightarrow} \sigma^{2}$.

14-15 Notation for quantiles
Let $\widehat{h}_{p}^{(M)}$ be any real number such that
$M^{-1} \sum_{m=1}^{M} I_{\left(-\infty,\, \widehat{h}_{p}^{(M)}\right]}\left[h\left(\omega^{(m)}\right)\right] \geq p$
and
$M^{-1} \sum_{m=1}^{M} I_{\left[\widehat{h}_{p}^{(M)},\, \infty\right)}\left[h\left(\omega^{(m)}\right)\right] \geq 1 - p$.

16-17 Fourth and fifth results
Fourth result (d): Given the probability mass condition, $\widehat{h}_{p}^{(M)} \stackrel{a.s.}{\rightarrow} h_{p}$.
Fifth result (e): Given the probability mass and positive p.d.f. conditions,
$M^{1/2}\left(\widehat{h}_{p}^{(M)} - h_{p}\right) \stackrel{d}{\rightarrow} N\left(0,\; p(1-p) \big/ p\left[h(\omega) = h_{p} \mid I\right]^{2}\right)$.
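A minimal sketch of results (d) and (e) under the same hypothetical setup as the first fragment. The asymptotic variance in (e) involves the density of $h(\omega)$ at $h_{p}$, which is unknown in practice; the sketch plugs in a simple Gaussian kernel density estimate, an expedient chosen only for this illustration and not taken from the slides.

    import numpy as np

    rng = np.random.default_rng(0)
    M = 100_000

    # Same hypothetical setup as in the earlier fragment.
    theta = rng.normal(1.0, 0.5, size=M)
    omega = rng.normal(theta, 1.0)
    h_draws = omega**2

    p_level = 0.9
    h_hat_p = np.quantile(h_draws, p_level)            # quantile approximation

    # Plug-in Gaussian kernel estimate of the density of h(omega) at h_hat_p,
    # using Silverman's rule-of-thumb bandwidth (an illustrative choice).
    bw = 1.06 * h_draws.std() * M ** (-0.2)
    dens = np.mean(np.exp(-0.5 * ((h_draws - h_hat_p) / bw) ** 2)) / (bw * np.sqrt(2 * np.pi))

    # Result (e): asymptotic variance p(1 - p) / density^2, so the numerical
    # standard error of the quantile approximation is its square root over M.
    nse_q = np.sqrt(p_level * (1 - p_level) / dens**2 / M)
    print(f"quantile approximation {h_hat_p:.4f}, numerical s.e. {nse_q:.4f}")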

18-19 Assessment of approximation error
Recall the third result (c):
$M^{1/2}\left(\overline{h}^{(M)} - \overline{h}\right) \stackrel{d}{\rightarrow} N\left(0, \sigma^{2}\right)$ and $\widehat{\sigma}^{2(M)} = M^{-1} \sum_{m=1}^{M} \left[h\left(\omega^{(m)}\right) - \overline{h}^{(M)}\right]^{2} \stackrel{a.s.}{\rightarrow} \sigma^{2}$.
The numerical standard error of $\overline{h}^{(M)}$ is $\left(\widehat{\sigma}^{2(M)} / M\right)^{1/2}$.
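A sketch of results (b) and (c) and the numerical standard error, again under the same hypothetical distributions and $h$ as the earlier fragments.

    import numpy as np

    rng = np.random.default_rng(0)
    M = 100_000

    # Same hypothetical setup as in the earlier fragments.
    theta = rng.normal(1.0, 0.5, size=M)
    omega = rng.normal(theta, 1.0)
    h_draws = omega**2

    h_bar = h_draws.mean()                        # \bar{h}^{(M)}
    sigma2_hat = np.mean((h_draws - h_bar) ** 2)  # \hat{sigma}^{2(M)}
    nse = np.sqrt(sigma2_hat / M)                 # numerical standard error

    # In this setup omega ~ N(1, 1.25), so E[h(omega) | I] = 1.25 + 1 = 2.25;
    # h_bar should be within a few numerical standard errors of that value.
    print(f"h_bar = {h_bar:.4f}, numerical standard error = {nse:.4f}")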

20-21 Approximation of Bayes actions by direct sampling: the setup
$\theta^{(m)} \sim p(\theta \mid I)$, $\omega^{(m)} \mid \theta^{(m)}, I \sim p\left(\omega \mid \theta^{(m)}, I\right)$.
$L(a, \omega) \geq 0$ is a loss function defined on $\Omega \times A$, where $A$ is an open subset of $\mathbb{R}^{q}$.
The risk function
$R(a) = \int_{\Omega} \int_{\Theta} L(a, \omega)\, p(\theta \mid I)\, p(\omega \mid \theta, I)\, d\theta\, d\omega$
has a strict global minimum at $\widehat{a} \in A$.

22-23 First result
Under weak regularity conditions (Theorem 4.1.2, conditions 1-2),
$\lim_{M \rightarrow \infty} P\left[\inf_{a \in \widehat{A}_{M}} (a - \widehat{a})'(a - \widehat{a}) > \varepsilon \mid I\right] = 0$,
where $\widehat{A}_{M}$ denotes the set of minimizers of the simulation approximation $M^{-1} \sum_{m=1}^{M} L\left(a, \omega^{(m)}\right)$.

24-25 Second result
This requires Conditions 1-6, which include:
$B = \mathrm{var}\left[\partial L(a, \omega) / \partial a \big|_{a=\widehat{a}} \mid I\right]$ exists and is finite;
$H = E\left[\partial^{2} L(a, \omega) / \partial a\, \partial a' \big|_{a=\widehat{a}} \mid I\right]$ exists and is finite and nonsingular.

26-27 Second result (continued)
Then
$M^{1/2}\left(\widehat{a}_{M} - \widehat{a}\right) \stackrel{d}{\rightarrow} N\left(0, H^{-1} B H^{-1}\right)$,
$M^{-1} \sum_{m=1}^{M} \left[\partial L\left(a, \omega^{(m)}\right) / \partial a\right] \left[\partial L\left(a, \omega^{(m)}\right) / \partial a'\right] \Big|_{a=\widehat{a}_{M}} \stackrel{p}{\rightarrow} B$,
$M^{-1} \sum_{m=1}^{M} \partial^{2} L\left(a, \omega^{(m)}\right) / \partial a\, \partial a' \Big|_{a=\widehat{a}_{M}} \stackrel{p}{\rightarrow} H$.
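A sketch of the Bayes-action approximation and the sandwich variance above, for a scalar action. The distributions, the linex loss $L(a, \omega) = \exp\{c(a - \omega)\} - c(a - \omega) - 1$, and the use of Newton's method are hypothetical choices made for this illustration; the derivative formulas in the code are for that assumed loss, not from the slides.

    import numpy as np

    rng = np.random.default_rng(0)
    M = 100_000

    # Hypothetical setup: theta ~ N(1, 0.5^2), omega | theta ~ N(theta, 1),
    # scalar linex loss L(a, omega) = exp(c(a - omega)) - c(a - omega) - 1 >= 0.
    c = 0.5
    theta = rng.normal(1.0, 0.5, size=M)
    omega = rng.normal(theta, 1.0)

    def dL(a):   # dL/da for each draw, for the assumed linex loss
        return c * (np.exp(c * (a - omega)) - 1.0)

    def d2L(a):  # d^2 L / da^2 for each draw
        return c**2 * np.exp(c * (a - omega))

    # Minimize the sample-average risk M^{-1} sum_m L(a, omega^(m))
    # by Newton's method on its first-order condition.
    a_hat = omega.mean()
    for _ in range(50):
        a_hat -= dL(a_hat).mean() / d2L(a_hat).mean()

    # Sandwich approximation of the variance of a_hat_M (second result); the
    # uncentered second moment is used because the mean score is ~0 at a_hat.
    B_hat = np.mean(dL(a_hat) ** 2)
    H_hat = np.mean(d2L(a_hat))
    nse_a = np.sqrt(B_hat / H_hat**2 / M)
    print(f"approximate Bayes action {a_hat:.4f}, numerical s.e. {nse_a:.4f}")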

28-29 Importance sampling: motivation
Suppose that we cannot derive a method for drawing $\theta^{(m)} \stackrel{iid}{\sim} p(\theta \mid I)$, but we can simulate $\theta^{(m)} \stackrel{iid}{\sim} p(\theta \mid S)$, where $p(\theta \mid S)$ is similar to $p(\theta \mid I)$.

32-33 Main results: the setup
$\left\{\theta^{(m)}, \omega^{(m)}\right\}$ is independent and identically distributed, with $\theta^{(m)} \sim p(\theta \mid S)$ and $\omega^{(m)} \sim p\left(\omega \mid \theta^{(m)}, I\right)$.
Define the weighting function
$w(\theta) = p(\theta \mid I) / p(\theta \mid S)$.
As before, $h : \Omega \rightarrow \mathbb{R}^{1}$.

34-35 Some conditions
First moment condition: $E[h(\omega) \mid I] = \overline{h}$ exists;
Second moment condition: $\mathrm{var}[h(\omega) \mid I] = \sigma^{2}$ exists;
Essential condition: the support of $p(\theta \mid S)$ includes $\Theta$;
Bounded weight condition: $w(\theta) = p(\theta \mid I) / p(\theta \mid S)$ is bounded above on $\Theta$.

36-37 More notation
Kernels: $k(\theta \mid I) = c_{I}\, p(\theta \mid I)$, $k(\theta \mid S) = c_{S}\, p(\theta \mid S)$.

38-39 First result
Given the first moment condition and the essential condition,
$\overline{h}^{(M)} = \dfrac{\sum_{m=1}^{M} w\left(\theta^{(m)}\right) h\left(\omega^{(m)}\right)}{\sum_{m=1}^{M} w\left(\theta^{(m)}\right)} \stackrel{a.s.}{\rightarrow} \overline{h}$.
The proof is straightforward.

40-41 Second result
Given all four conditions,
$M^{1/2}\left(\overline{h}^{(M)} - \overline{h}\right) \stackrel{d}{\rightarrow} N\left(0, \tau^{2}\right)$
and
$\widehat{\tau}^{2(M)} = \dfrac{M \sum_{m=1}^{M} \left[h\left(\omega^{(m)}\right) - \overline{h}^{(M)}\right]^{2} w\left(\theta^{(m)}\right)^{2}}{\left[\sum_{m=1}^{M} w\left(\theta^{(m)}\right)\right]^{2}} \stackrel{a.s.}{\rightarrow} \tau^{2}$.
The essential part of the proof is the delta method.
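A sketch of the self-normalized importance-sampling approximation (first result) and its numerical standard error via $\widehat{\tau}^{2(M)}$ (second result). The target $p(\theta \mid I)$, source $p(\theta \mid S)$, and $h$ are hypothetical; the weights are computed from unnormalized kernels, which is all the ratio form requires, consistent with the kernel notation above.

    import numpy as np

    rng = np.random.default_rng(0)
    M = 100_000

    # Hypothetical target p(theta | I) = N(1, 0.5^2), known only up to a kernel;
    # hypothetical source p(theta | S) = N(0, 2^2); omega | theta, I ~ N(theta, 1).
    theta = rng.normal(0.0, 2.0, size=M)                 # theta^(m) ~ p(theta | S)
    omega = rng.normal(theta, 1.0)                       # omega^(m) ~ p(omega | theta^(m), I)
    h_draws = omega**2

    log_k_I = -0.5 * ((theta - 1.0) / 0.5) ** 2          # kernel k(theta | I), constants dropped
    log_k_S = -0.5 * (theta / 2.0) ** 2                  # kernel k(theta | S), constants dropped
    w = np.exp(log_k_I - log_k_S)                        # w(theta) up to the factor c_S / c_I

    # First result: self-normalized approximation (the unknown constants cancel).
    h_bar = np.sum(w * h_draws) / np.sum(w)

    # Second result: \hat{tau}^{2(M)} and the numerical standard error.
    tau2_hat = M * np.sum(((h_draws - h_bar) ** 2) * w**2) / np.sum(w) ** 2
    nse = np.sqrt(tau2_hat / M)
    print(f"h_bar = {h_bar:.4f} (target value 2.25), numerical s.e. = {nse:.4f}")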

42-43 Bayes actions
These results can be extended to provide simulation approximations of the Bayes action
$\widehat{a} = \arg\min_{a \in A} E\left[L(a, \omega) \mid y^{o}, A\right]$
and the Bayes risk
$R(a) = \int_{\Omega} \int_{\Theta} L(a, \omega)\, p(\theta \mid I)\, p(\omega \mid \theta, I)\, d\theta\, d\omega$
(Theorem 4.2.3).

44-45 A particularly useful property of importance sampling
Recall the marginal likelihood:
$p\left(y^{o} \mid A\right) = \int_{\Theta_{A}} p\left(\theta_{A} \mid A\right) p\left(y^{o} \mid \theta_{A}, A\right) d\theta_{A}$
$= \int_{\Theta_{A}} \dfrac{p\left(\theta_{A} \mid A\right) p\left(y^{o} \mid \theta_{A}, A\right)}{p\left(\theta_{A} \mid S\right)}\, p\left(\theta_{A} \mid S\right)\, d\theta_{A}$
$= E\left[\dfrac{p\left(\theta_{A} \mid A\right) p\left(y^{o} \mid \theta_{A}, A\right)}{p\left(\theta_{A} \mid S\right)}\right]$
if $\theta_{A} \sim p\left(\theta_{A} \mid S\right)$.

46-47 Useful property (continued)
If $\theta_{A}^{(m)} \stackrel{iid}{\sim} p\left(\theta_{A} \mid S\right)$, then
$p\left(y^{o} \mid A\right) = E\left[\dfrac{p\left(\theta_{A} \mid A\right) p\left(y^{o} \mid \theta_{A}, A\right)}{p\left(\theta_{A} \mid S\right)}\right]$
$= E\left[\dfrac{1}{M} \sum_{m=1}^{M} \dfrac{p\left(\theta_{A}^{(m)} \mid A\right) p\left(y^{o} \mid \theta_{A}^{(m)}, A\right)}{p\left(\theta_{A}^{(m)} \mid S\right)}\right]$
$= E\left[\dfrac{1}{M} \sum_{m=1}^{M} w\left(\theta_{A}^{(m)}\right)\right]$.
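A sketch of this marginal-likelihood approximation for a hypothetical conjugate model where the answer is available analytically, so the two can be compared. The model, the data, and the importance source (a normal centered at the posterior mean with an inflated standard deviation) are all assumptions of the illustration; averaging the weights on the log scale is a standard numerical-stability device, not something stated on the slides.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical model A: y_t | theta_A ~ N(theta_A, 1), t = 1,...,n,
    # prior theta_A ~ N(0, 1).  Simulated data:
    n = 50
    y = rng.normal(0.8, 1.0, size=n)

    def log_prior(th):                       # log p(theta_A | A)
        return -0.5 * th**2 - 0.5 * np.log(2 * np.pi)

    def log_lik(th):                         # log p(y^o | theta_A, A), vectorized in th
        return -0.5 * np.sum((y[None, :] - th[:, None]) ** 2, axis=1) - 0.5 * n * np.log(2 * np.pi)

    # Importance source S: normal at the posterior mean, 1.5 times the posterior s.d.
    post_var = 1.0 / (1.0 + n)
    post_mean = post_var * y.sum()
    s_mean, s_sd = post_mean, 1.5 * np.sqrt(post_var)

    M = 50_000
    th = rng.normal(s_mean, s_sd, size=M)    # theta_A^(m) ~ p(theta_A | S)
    log_src = -0.5 * ((th - s_mean) / s_sd) ** 2 - np.log(s_sd) - 0.5 * np.log(2 * np.pi)

    log_w = log_prior(th) + log_lik(th) - log_src
    c = log_w.max()                          # average the weights on the log scale
    log_ml_hat = c + np.log(np.mean(np.exp(log_w - c)))

    # Analytic log marginal likelihood for this conjugate example, for comparison:
    log_ml_exact = (-0.5 * n * np.log(2 * np.pi) - 0.5 * np.log(1 + n)
                    - 0.5 * (y @ y - y.sum() ** 2 / (1 + n)))
    print(f"IS estimate {log_ml_hat:.4f}, exact {log_ml_exact:.4f}")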
