Optimal Uncertainty Quantification of Quantiles


Optimal Uncertainty Quantification of Quantiles. Application to Industrial Risk Management.
Merlin Keller (1), Jérôme Stenger (1, 2)
(1) EDF R&D, France; (2) Institut Mathématique de Toulouse, France
ETICS 2018, Roscoff. 1 / 12

Outline
1. Introduction
2. Robust Bounds on Quantiles
2 / 12

Optimal Uncertainty Quantification (OUQ) [Owhadi et al., 2013]

Principle: find optimal bounds for a quantity of interest $Q(\mu^\ast)$, a functional of an uncertain probability measure $\mu^\ast$, known only to lie in some subset $\mathcal{A}$ of $\mathcal{M}_1(\mathcal{X})$:
$$\underline{Q}(\mathcal{A}) \leq Q(\mu^\ast) \leq \overline{Q}(\mathcal{A}), \quad \text{with} \quad \underline{Q}(\mathcal{A}) = \inf_{\mu \in \mathcal{A}} Q(\mu), \quad \overline{Q}(\mathcal{A}) = \sup_{\mu \in \mathcal{A}} Q(\mu),$$
where $\mathcal{A} = \{\mu \in \mathcal{M}_1(\mathcal{X}) : \Phi_j(\mu) \leq c_j,\ j = 1, \ldots, N\}$ is the admissible subset.

Distributionally robust: useful in risk-averse situations where parametric assumptions are hard to justify. Many applications in Industrial Risk Management (find two: exercise 1):
- Dyke design / reliability assessment: $Q(\mu) := P_\mu[Z - h > 0]$, with $Z \sim \mu$ the water level, $h$ the dyke height, and $Z - h$ the overflow.
- Nuclear accident risk assessment: $Q(\mu) := \sup\{t > 0 : P_\mu[T \geq t] \geq p\}$, with $T \sim \mu$ the maximal temperature inside the nuclear reactor and $p$ a safety requirement.
- And also: Probabilistic Seismic Hazard Assessment (PSHA), extreme weather forecasting, environmental risk assessment, ecotoxicology, ...
3 / 12
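The simplest instance of these bounds, not on the original slide but a standard sanity check, is Markov's inequality: with a single mean constraint on a nonnegative variable, the optimal exceedance bound is attained by a two-point measure.
$$\mathcal{A} = \{\mu \in \mathcal{M}_1(\mathbb{R}_+) : \mathbb{E}_\mu[X] \leq m\}, \qquad \overline{Q}(\mathcal{A}) = \sup_{\mu \in \mathcal{A}} P_\mu[X \geq t] = \min\!\left(1, \frac{m}{t}\right),$$
the supremum being attained for $t \geq m$ by $\mu = \left(1 - \tfrac{m}{t}\right)\delta_0 + \tfrac{m}{t}\,\delta_t$.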

Robust Bayesian Inference (RBI) [Rios Insua and Ruggeri, 2000]

RBI as a special case of OUQ:
- $\mu$ prior distribution on the parameter $\theta$ of a statistical model $Y \sim f_\theta \, d\lambda$, belonging to a class $\mathcal{A}$ of admissible priors,
- $P_Y(\mu)$ posterior distribution of $\theta$ given $Y$, according to Bayes' theorem:
$$dP_Y(\mu)(\theta) = \frac{f_\theta(Y)\, d\mu(\theta)}{\int_\nu f_\nu(Y)\, d\mu(\nu)} \qquad (1)$$
- Derive optimal bounds on a quantity of interest of the posterior distribution: replace $Q(\mu)$ by $Q(P_Y(\mu))$ (i.e. $Q$ by $Q \circ P_Y$).

OUQ as a special case of RBI:
- OUQ corresponds to the no-data case ($f_\theta := f$ does not depend on $\theta$),
- Proof: (1) then reduces to $P_Y(\mu) = \mu$, whence $Q(P_Y(\mu))$ reduces to $Q(\mu)$,
- Both formulations are therefore equivalent, and are special cases of Dempster-Shafer Theory (DST) (M. Couplet, private conversation).
4 / 12
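To make formula (1) concrete, here is a minimal discretized sketch, not from the presentation: a binomial likelihood on a parameter grid, with a small finite set of candidate priors standing in for the class A. The model, grid, and priors are all illustrative assumptions.

```python
import numpy as np
from scipy.stats import binom

theta = np.linspace(0.01, 0.99, 99)            # parameter grid for theta
likelihood = binom.pmf(7, 10, theta)           # f_theta(Y) for observed Y = 7 successes out of 10

def posterior_mean(prior):
    """Q(P_Y(mu)) with Q = posterior mean of theta, formula (1) discretized on the grid."""
    post = likelihood * prior
    post = post / post.sum()                   # normalization: denominator of (1)
    return float(np.dot(theta, post))

# A toy class of admissible priors (each normalized to sum to one on the grid).
priors = [np.ones_like(theta), 1.0 - theta, theta]
values = [posterior_mean(p / p.sum()) for p in priors]
print(min(values), max(values))                # robust lower / upper bounds on the posterior mean
```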

Main result

Theorem (measure affine functionals over generalized moment classes). If:
- $Q(\mu)$ is measure affine (e.g. $Q(\mu) := \mathbb{E}_\mu[q]$, with $q$ bounded above or below),
- $\mathcal{A} = \{\mu \in \mathcal{M}_1(\mathcal{X}) : \mathbb{E}_\mu[\varphi_j] \leq c_j,\ j = 1, \ldots, N\}$ for measurable functions $\varphi_j$,
- $\mathcal{A}_\Delta = \{\mu \in \mathcal{A} : \mu = \sum_{i=0}^{N} w_i \delta_{x_i}\}$ the extremal admissible probability measures,
then $\underline{Q}(\mathcal{A}) = \underline{Q}(\mathcal{A}_\Delta)$ and $\overline{Q}(\mathcal{A}) = \overline{Q}(\mathcal{A}_\Delta)$.

Implementation. To find $\underline{Q}(\mathcal{A})$ (resp. $\overline{Q}(\mathcal{A})$):
minimize (resp. maximize) $Q(\mu) = \sum_{i=0}^{N} w_i\, q(x_i)$ with respect to $(w_i, x_i)_{0 \leq i \leq N}$,
subject to $\sum_{i=0}^{N} w_i\, \varphi_j(x_i) \leq c_j$ for $j = 1, \ldots, N$.
Constrained optimization problem, solvable by (almost) off-the-shelf methods if $q$ and the $\varphi_j$ are analytical functions: Mystic framework [McKerns et al., 2012].
5 / 12
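As an illustration of this implementation step, here is a minimal sketch assuming $\mathcal{X} = [0, 1]$, a single mean constraint $\mathbb{E}_\mu[X] \leq 0.5$ and $Q(\mu) = P_\mu[X \geq 0.8]$. It uses scipy's differential evolution rather than the Mystic framework cited on the slide, and all numerical values are assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution, NonlinearConstraint

N = 1                                    # one moment constraint, hence at most N + 1 = 2 atoms
q = lambda x: (x >= 0.8).astype(float)   # Q(mu) = E_mu[q] = P_mu[X >= 0.8]  (assumption)
phi = lambda x: x                        # phi_1(x) = x, i.e. the mean constraint E_mu[X] <= c
c = 0.5

def unpack(z):
    w, x = z[:N + 1], z[N + 1:]
    return w / w.sum(), x                # normalized weights: mu = sum_i w_i delta_{x_i}

def neg_Q(z):                            # maximize Q(mu) = sum_i w_i q(x_i)  <=>  minimize -Q
    w, x = unpack(z)
    return -float(np.dot(w, q(x)))

def moment(z):                           # E_mu[phi] = sum_i w_i phi(x_i)
    w, x = unpack(z)
    return float(np.dot(w, phi(x)))

bounds = [(1e-6, 1.0)] * (N + 1) + [(0.0, 1.0)] * (N + 1)   # raw weights, then atom locations
res = differential_evolution(neg_Q, bounds,
                             constraints=(NonlinearConstraint(moment, -np.inf, c),),
                             seed=0)
print("Upper bound on P[X >= 0.8]:", -res.fun)  # Markov's inequality gives exactly 0.5 / 0.8 = 0.625
```

A derivative-free global solver is used here because the objective need not be smooth in the atoms and weights; the Mystic framework referenced on the slide relies on similar global optimization strategies.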

Outline
1. Introduction
2. Robust Bounds on Quantiles
6 / 12

A Simple Result

Theorem (quantile-CDF duality). Assume $\mathcal{X} = \mathbb{R}_+$ and let:
- $F_\mu(x) = P_\mu[X \leq x]$ the pointwise CDF evaluation functional, with lower bound $\underline{F}_\mathcal{A}(x) = \inf_{\mu \in \mathcal{A}} F_\mu(x)$,
- $Q_\mu(p) = \inf\{x > 0 : F_\mu(x) \geq p\}$ the order-$p$ quantile, with upper bound $\overline{Q}_\mathcal{A}(p) = \sup_{\mu \in \mathcal{A}} Q_\mu(p)$.
Then:
$$\overline{Q}_\mathcal{A}(p) = \inf\{x > 0 : \underline{F}_\mathcal{A}(x) \geq p\}.$$

Proof. For every $\mu \in \mathcal{A}$, $\{x > 0 : \underline{F}_\mathcal{A}(x) \geq p\} \subset \{x > 0 : F_\mu(x) \geq p\}$, hence $\inf\{x > 0 : \underline{F}_\mathcal{A}(x) \geq p\} \geq Q_\mu(p)$, and therefore $\inf\{x > 0 : \underline{F}_\mathcal{A}(x) \geq p\} \geq \overline{Q}_\mathcal{A}(p)$. Assume the inequality is strict; then there exists $x_0$ with $\overline{Q}_\mathcal{A}(p) < x_0 < \inf\{x > 0 : \underline{F}_\mathcal{A}(x) \geq p\}$. The right inequality gives $\underline{F}_\mathcal{A}(x_0) < p$, so there exists $\mu$ with $F_\mu(x_0) < p$; but the left inequality gives $Q_\mu(p) \leq \overline{Q}_\mathcal{A}(p) < x_0$, hence $F_\mu(x_0) \geq p$, leading to a contradiction.
7 / 12

Application: Inversion of CDF Bounds

Sequential construction of the quantile upper bound:
- Initialization ($t = 0$): $(a_t, b_t) = (0, +\infty)$
- For $t = 1, 2, \ldots$:
  - choose $x_t \in (a_{t-1}, b_{t-1})$,
  - calculate $\underline{F}_\mathcal{A}(x_t)$,
  - if $\underline{F}_\mathcal{A}(x_t) \leq p$, set $a_t := x_t$,
  - if $\underline{F}_\mathcal{A}(x_t) > p$, set $b_t := x_t$.
- Stop when $b_t - a_t < \varepsilon$; $b_t$ is then an $\varepsilon$-approximation of $\overline{Q}_\mathcal{A}(p)$.

Challenges
- How to choose $x_t \in (a_{t-1}, b_{t-1})$?
- Can we guarantee that $\overline{Q}_\mathcal{A}(p)$ is nontrivial ($< \infty$)?
- What if $Q(\mu)$ is an extreme quantile of $Y = G(X)$, with $X \sim \mu$ and $G$ a costly computer model? Can we develop an EGO-like approach?
8 / 12
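A minimal sketch of this loop follows, using simple midpoint bisection for the choice of $x_t$. The routine F_lower standing in for the lower CDF bound, the finite initial bracket b0 (the slide starts from $+\infty$), and the toy check on a singleton class are all assumptions made for illustration.

```python
import math

def quantile_upper_bound(F_lower, p, b0, a0=0.0, eps=1e-3):
    """eps-approximation of the quantile upper bound by inverting the lower CDF bound."""
    a, b = a0, b0                      # bracket: the quantile upper bound lies in (a, b]
    while b - a >= eps:
        x = 0.5 * (a + b)              # simplest choice of x_t: the midpoint
        if F_lower(x) <= p:            # CDF bound at or below p: move the left endpoint
            a = x
        else:                          # CDF bound above p: move the right endpoint
            b = x
    return b

# Toy check on a singleton class A = {Exp(1)}: the bound is just its 95% quantile, about 2.996.
print(quantile_upper_bound(lambda x: 1.0 - math.exp(-x), p=0.95, b0=20.0))
```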

Alternative: Direct Quantile Optimization

$Q_\mu(p)$ is not a measure affine functional. However, the quantile-CDF duality ensures that the main result still applies.

OUQ for quantiles. To find $\underline{Q}_\mathcal{A}(p)$ (resp. $\overline{Q}_\mathcal{A}(p)$): minimize (resp. maximize)
$$Q_\mu(p) = x_{(i^\ast)}, \quad \text{where} \quad i^\ast = \min\Big\{0 \leq i \leq N : \sum_{l=0}^{i} w_{(l)} \geq p\Big\},$$
with respect to $(w_i, x_i)_{0 \leq i \leq N}$, where $x_{(0)} \leq \ldots \leq x_{(N)}$ are the ordered atoms, $w_{(l)}$ the corresponding weights, $w_i > 0$ and $\sum_{i=0}^{N} w_i = 1$,
subject to $\sum_{i=0}^{N} w_i\, \varphi_j(x_i) \leq c_j$ for $j = 1, \ldots, N$.

Main difficulty: the objective function $Q_\mu(p) = Q_p\big((w_i, x_i)_{0 \leq i \leq N}\big)$ is irregular and non convex.
9 / 12
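A minimal sketch of this objective, not from the presentation: the order-$p$ quantile of a discrete measure $\sum_i w_i \delta_{x_i}$ is the smallest ordered atom whose cumulative weight reaches $p$. This is the quantity the optimizer has to minimize or maximize over $(w_i, x_i)$; the example values are assumptions.

```python
import numpy as np

def discrete_quantile(w, x, p):
    """Order-p quantile of the discrete measure sum_i w_i delta_{x_i}."""
    order = np.argsort(x)                    # sort atoms: x_(0) <= ... <= x_(N)
    w_sorted, x_sorted = np.asarray(w)[order], np.asarray(x)[order]
    cum = np.cumsum(w_sorted)                # cumulative weights sum_{l <= i} w_(l)
    i_star = np.searchsorted(cum, p)         # smallest i with cum[i] >= p
    return x_sorted[i_star]

print(discrete_quantile([0.2, 0.5, 0.3], [3.0, 1.0, 2.0], p=0.8))  # -> 2.0
```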

Case study: quantile of a nonlinear transform under a product measure

Problem specification (simplified version):
- $Q_\mu(p) = \sup\{y > 0 : P_\mu[G(X) \geq y] \geq p\}$, where $X = (X_1, \ldots, X_d) \sim \mu$ over $[0, 1]^d$ and $G : [0, 1]^d \to \mathbb{R}_+$ is potentially costly,
- $\mathcal{A} = \{\mu = \otimes_{k=1}^{d} \mu_k : \mathbb{E}_{\mu_k}[X_k] = m_k,\ 1 \leq k \leq d\}$.

Extreme set of product measures. [Owhadi et al., 2013] show that $\overline{Q}(\mathcal{A}) = \overline{Q}(\mathcal{A}_\Delta)$, where:
- $Q(\mu)$ is measure affine (extendable to quantiles by duality with the CDF),
- $\mathcal{A}_\Delta = \{\mu = \otimes_{k=1}^{d} \big(w_k \delta_{x_{k,0}} + (1 - w_k)\delta_{x_{k,1}}\big) : w_k x_{k,0} + (1 - w_k) x_{k,1} = m_k\}$.
10 / 12
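A minimal sketch of evaluating the objective on the extreme set above: each margin is a two-point measure on $[0, 1]$ whose weight is fixed by the prescribed mean $m_k$ (requiring $x_{k,0} \leq m_k \leq x_{k,1}$), and the quantile of $G(X)$ is computed exactly by enumerating the $2^d$ atoms of the product measure. The toy model G, the dimension, and the parameter values are illustrative assumptions; the outer search over the atom locations is not shown.

```python
import itertools
import numpy as np

def product_quantile(G, x0, x1, m, p):
    """Order-p quantile of G(X) when X follows the two-point product measure above."""
    x0, x1, m = map(np.asarray, (x0, x1, m))
    w = (x1 - m) / (x1 - x0)                 # weight on x_{k,0} so that the mean equals m_k
    ys, probs = [], []
    for bits in itertools.product([0, 1], repeat=len(m)):   # the 2^d joint atoms
        bits = np.array(bits)
        atom = np.where(bits == 0, x0, x1)
        prob = float(np.prod(np.where(bits == 0, w, 1.0 - w)))
        ys.append(G(atom))
        probs.append(prob)
    order = np.argsort(ys)
    cum = np.cumsum(np.array(probs)[order])
    return np.array(ys)[order][np.searchsorted(cum, p)]     # quantile of the discrete law of G(X)

G = lambda x: x.sum()                                        # toy "computer model" (assumption)
print(product_quantile(G, x0=[0.0, 0.0], x1=[1.0, 1.0], m=[0.3, 0.6], p=0.9))
```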

To be continued... THANKS FOR YOUR ATTENTION!! 11 / 12

References

McKerns, M., Owhadi, H., Scovel, C., Sullivan, T. J., and Ortiz, M. (2012). The optimal uncertainty algorithm in the Mystic framework. CoRR, abs/

Owhadi, H., Scovel, C., Sullivan, T. J., McKerns, M., and Ortiz, M. (2013). Optimal Uncertainty Quantification. SIAM Rev., 55(2):271-345.

Rios Insua, D. and Ruggeri, F. (2000). Robust Bayesian Analysis. Springer-Verlag.

12 / 12
