Optimization Tools in an Uncertain Environment


Optimization Tools in an Uncertain Environment

Michael C. Ferris, University of Wisconsin, Madison
Uncertainty Workshop, Chicago: July 21, 2008

Michael Ferris (University of Wisconsin), Stochastic optimization, Chicago: July 21, 2008 (slide 1 of 15)

Optimization of a model under uncertainty

Modeler: assumes knowledge of the distribution. Often formulated mathematically as

    min_{x ∈ X} f(x) = E[F(x, ξ)] = ∫ F(x, ξ) p(ξ) dξ

(p is the probability distribution of ξ). Can think of this as optimization with noisy function evaluations.

Traditional stochastic optimization approaches (Robbins/Monro, Kiefer/Wolfowitz):
- often require estimating gradients: IPA, finite differences
- stochastic neighborhood search

Example: Two-stage stochastic LP with recourse

    min_{x ∈ R^n} c^T x + E[Q(x, ξ)]   s.t. Ax = b, x ≥ 0

where the recourse cost is

    Q(x, ξ) = min_y q^T y   s.t. Tx + Wy = h, y ≥ 0

ξ = (q, h, T, W) (some components are random); the expectation is with respect to ξ. x are the first-stage variables, y are the second-stage variables.

Special case: a discrete distribution Ω = {ξ^i : i = 1, 2, ..., K} yields the deterministic equivalent problem.

[Figure: block structure of the deterministic-equivalent constraint matrix, with A in the first block row and T, W blocks for each scenario]
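For a discrete distribution, the deterministic equivalent is a single (large) LP over x and all scenario copies of y. A minimal sketch on a toy newsvendor-style instance (the instance and the SciPy usage are my own illustration, not from the slides): first-stage purchase at unit cost 1, recourse purchase at unit cost 2 to meet each scenario's demand.

```python
import numpy as np
from scipy.optimize import linprog

# Deterministic equivalent over K scenarios:
#   min  x + sum_i p_i * (2 * y_i)
#   s.t. x + y_i >= d_i   (recourse meets any shortfall against demand d_i)
#        x, y_i >= 0
d = np.array([1.0, 2.0, 3.0])          # scenario demands (the random h)
p = np.array([1/3, 1/3, 1/3])          # scenario probabilities
K = len(d)

c = np.concatenate(([1.0], 2.0 * p))   # first-stage cost, expected recourse costs
A_ub = np.hstack([-np.ones((K, 1)), -np.eye(K)])  # row i: -x - y_i <= -d_i
b_ub = -d

res = linprog(c, A_ub=A_ub, b_ub=b_ub)  # bounds default to x, y >= 0
print(res.x[0], res.fun)                # optimal first-stage decision and cost
```

Marginal reasoning checks the answer: raising x by one unit costs 1 and saves 2·Prob(demand > x), so the optimal order is x = 2 with expected cost 8/3.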

Scenario formulation and non-anticipativity

Key idea: replace x with scenario copies x^1, x^2, ..., x^K. Non-anticipativity requires (x^1, x^2, ..., x^K) ∈ L (a subspace), represented by the constraint Hx = 0 for a suitable matrix H.

[Figure: constraint matrix of the scenario formulation, with T, W blocks for each scenario and H blocks for the non-anticipativity constraints]

Separability can be achieved by removing the non-anticipativity conditions from the constraints; computational methods exploit this, essentially by dualization of the non-anticipativity constraints (leading to a Lagrangian of the scenario formulation):
- primal and dual decompositions (Lagrangian relaxation, progressive hedging, etc.)
- L-shaped method (Benders decomposition applied to the deterministic equivalent)
- trust-region methods and/or regularized decomposition

Complications

- Multistage problems: recursive application of the above, scenario trees
  - dynamic programming approaches (see Judd talk)
  - reinforcement learning, neuro-dynamic programming
  - real options
- Stochastic integer programming
- Stochastic variational inequalities / complementarity problems
- Nonlinear (convex or otherwise) recourse models

Sampling methods

But what if the number of scenarios is too big (or the probability distribution is not discrete)? Use sample average approximation (SAA).

Take a sample ξ^1, ..., ξ^N of N realizations of the random vector ξ, viewed as historical data of N observations of ξ, or generated via Monte Carlo sampling. For any x ∈ X, estimate f(x) by averaging the values F(x, ξ^j):

    (SAA)   min_{x ∈ X} f̂_N(x) := (1/N) Σ_{j=1}^N F(x, ξ^j)

- Nice theoretical asymptotic properties
- Can use standard optimization tools to solve the SAA problem
- Implementation uses common random numbers, distributed computation
- Monte Carlo sampling (quasi-Monte Carlo sampling)
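The SAA recipe above can be sketched in a few lines. This toy instance (the quadratic F and the Gaussian ξ are my own illustrative choices) fixes one sample, so the SAA objective is a deterministic function that any standard solver can minimize; the SAA minimizer converges to the true minimizer E[ξ] = 1 as N grows.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

def F(x, xi):
    # Sample response function; F(x, xi) = (x - xi)^2 is an illustrative choice,
    # so f(x) = E[F(x, xi)] is minimized at x = E[xi]
    return (x - xi) ** 2

# Draw one sample xi^1, ..., xi^N and keep it fixed (common random numbers)
N = 10_000
xi = rng.normal(loc=1.0, scale=1.0, size=N)

def f_hat(x):
    # SAA objective: (1/N) * sum_j F(x, xi^j)
    return F(x, xi).mean()

res = minimize_scalar(f_hat)   # any standard optimizer works on f_hat
print(res.x)                   # close to the true minimizer 1.0
```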

Variance reduction

Choose ξ^1, ξ^2, ..., ξ^N carefully (essentially exploiting properties of numerical integration) to reduce the variance of f̂_N(x):
- Latin hypercube sampling
- importance sampling, likelihood ratio methods

Significantly improves performance of optimization codes.
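A minimal sketch of the effect, assuming a one-dimensional uniform input (where Latin hypercube sampling reduces to stratified sampling: one draw per interval [j/N, (j+1)/N), in random order). The test integrand g(u) = u² is my own illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1000

def latin_hypercube(n, rng):
    # One stratified draw per interval [j/n, (j+1)/n), randomly permuted
    return (rng.permutation(n) + rng.random(n)) / n

# Estimate E[g(U)] for U ~ Uniform(0,1), g(u) = u^2 (true value 1/3)
g = lambda u: u ** 2
mc_reps  = [g(rng.random(N)).mean()           for _ in range(200)]
lhs_reps = [g(latin_hypercube(N, rng)).mean() for _ in range(200)]
print(np.var(mc_reps), np.var(lhs_reps))  # stratified variance is far smaller
```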

Example: Robust Linear Programming

Data in the LP are not known with certainty:

    min c^T x   s.t. a_i^T x ≤ b_i, i = 1, 2, ..., m

Suppose the vectors a_i are known to lie in the ellipsoids (no distribution)

    a_i ∈ E_i := {ā_i + P_i u : ||u||_2 ≤ 1}

where P_i ∈ R^{n×n} (and could be singular, or even 0).

Conservative approach: the robust linear program

    min c^T x   s.t. a_i^T x ≤ b_i for all a_i ∈ E_i, i = 1, 2, ..., m

Robust Linear Programming as SOCP

The constraints can be rewritten as:

    b_i ≥ sup {a_i^T x : a_i ∈ E_i}
        = ā_i^T x + sup {u^T P_i^T x : ||u||_2 ≤ 1}
        = ā_i^T x + ||P_i^T x||_2

Thus the robust linear program can be written as

    min c^T x   s.t. ā_i^T x + ||P_i^T x||_2 ≤ b_i, i = 1, 2, ..., m

equivalently

    min c^T x   s.t. (b_i - ā_i^T x, P_i^T x) ∈ C, i = 1, 2, ..., m

where C is the second-order cone. Solution (as an SOCP) by MOSEK or SeDuMi, CVX, etc.
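The key identity, sup over the ellipsoid equals ā_i^T x + ||P_i^T x||_2, can be checked numerically (the random instance below is my own illustration). Since (ā + Pu)^T x = ā^T x + u^T(P^T x), Cauchy–Schwarz gives the closed form, attained at u = P^T x / ||P^T x||_2 on the unit sphere:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
a_bar = rng.normal(size=n)        # nominal constraint vector
P = rng.normal(size=(n, n))       # ellipsoid shape matrix
x = rng.normal(size=n)            # candidate decision

# Closed form: sup over the ellipsoid {a_bar + P u : ||u||_2 <= 1}
worst_closed = a_bar @ x + np.linalg.norm(P.T @ x)

# Monte Carlo check: the sup is attained on the sphere ||u||_2 = 1
z = rng.normal(size=(100_000, n))
u = z / np.linalg.norm(z, axis=1, keepdims=True)
worst_mc = (a_bar @ x + u @ (P.T @ x)).max()
print(worst_closed, worst_mc)  # worst_mc approaches worst_closed from below
```

So enforcing ā^T x + ||P^T x||_2 ≤ b is exactly the robust constraint, with no sampling needed at solve time.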

Example: Simulation Optimization

Computer simulations are used as substitutes to understand or predict the behavior of a complex system when exposed to a variety of realistic, stochastic input scenarios. Simulations are widely applied in epidemiology, engineering design, manufacturing, supply chain management, medical treatment and many other fields.

Optimization applications: calibration, parameter tuning, inverse optimization, PDE-constrained optimization:

    min_{x ∈ X} f(x) = E[F(x, ξ)]

The sample response function F(x, ξ) typically:
- does not have a closed form, thus cannot provide gradient or Hessian information
- is normally computationally expensive
- is affected by uncertain factors in the simulation

Bayesian approach

The underlying objective function f(x) still has to be estimated. Denote the mean of the simulation output for each system as

    µ_i = f(x_i) = E[F(x_i, ξ)]

From a Bayesian perspective, the means are considered as Gaussian random variables whose posterior distributions can be estimated as

    µ_i | X ~ N(µ̂_i, σ̂_i² / N_i)

where µ̂_i is the sample mean and σ̂_i² is the sample variance. The above formulation is one type of posterior distribution.

Instrument existing optimization codes to use this derived distribution information:
- derivative-free optimization, surrogate optimization
- response surface methodology
- evolutionary methods
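A minimal sketch of forming these posteriors from simulation replications (the three-system instance and its true means are my own illustration): each system i gets posterior mean µ̂_i and posterior standard deviation σ̂_i / √N_i, which an optimizer can use to rank systems and decide where more replications are worthwhile.

```python
import numpy as np

rng = np.random.default_rng(3)

# True (unknown) means of three simulated systems; each replication of
# the simulation returns the mean plus unit-variance noise
true_mu = np.array([1.0, 0.5, 2.0])
N_reps = 400
samples = true_mu + rng.normal(size=(N_reps, 3))

mu_hat = samples.mean(axis=0)             # sample means  (posterior means)
var_hat = samples.var(axis=0, ddof=1)     # sample variances
post_sd = np.sqrt(var_hat / N_reps)       # posterior std dev of each mu_i

best = int(np.argmin(mu_hat))             # candidate minimizer over systems
print(best, mu_hat, post_sd)
```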

Example: Chance Constrained Problems

    min_{x ∈ X} f(x)   s.t. Prob(C(x, ξ) > 0) ≤ α

α is some threshold parameter; C is vector-valued.

Joint probabilistic constraint: all constraints satisfied simultaneously, with possible dependence between random variables in different rows; extensive literature.

Linear programs with probabilistic constraints are still largely intractable (except for a few very special cases):
- for a given x ∈ X, the quantity Prob(C(x, ξ) > 0) requires multi-dimensional integration
- the feasible region defined by a probabilistic constraint is not convex

Recent work by Ahmed, Luedtke, Nemhauser and Shapiro.
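Even just evaluating the constraint at one point typically means multi-dimensional integration, which in practice is done by Monte Carlo. A minimal sketch, assuming an illustrative linear constraint C(x, ξ) = ξ^T x - 1 with Gaussian ξ (this special case has the analytic answer 1 - Φ(1/||x||_2) for comparison):

```python
import numpy as np

rng = np.random.default_rng(4)

def violation_prob(x, n_samples=200_000):
    # Monte Carlo estimate of Prob(C(x, xi) > 0) for the illustrative
    # constraint C(x, xi) = xi^T x - 1 with xi ~ N(0, I)
    xi = rng.normal(size=(n_samples, len(x)))
    return (xi @ x > 1.0).mean()

x = np.array([0.3, 0.4])
# xi @ x ~ N(0, ||x||^2), so the true value is 1 - Phi(1/0.5) ~= 0.0228
p = violation_prob(x)
print(p)
```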

Example: Risk Measures

Classical: utility/disutility function u(·):

    min_{x ∈ X} f(x) = E[u(F(x, ξ))]

The modern approach to modeling risk aversion uses the concept of risk measures:
- mean-risk
- semi-deviations
- mean deviations from quantiles, VaR, CVaR

Römisch, Schultz, Rockafellar, Uryasev (in the math programming literature); much more in the mathematical economics and finance literature. Optimization approaches still valid, different objectives.

CVaR constraints: mean excess dose (radiotherapy)

[Figure: loss distribution (frequency vs. loss) showing VaR at confidence level α, the tail of probability 1 - α beyond it, CVaR as the mean of that tail, and the maximum loss; VaR, CVaR, CVaR+ and CVaR- marked]

Move the mean of the tail to the left!
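The picture translates directly into the empirical estimators: VaR_α is the α-quantile of the losses, and CVaR_α is the mean of the losses at or beyond it. A minimal sketch on standard normal losses (my own illustrative choice; for N(0,1), VaR_0.95 = 1.6449 and CVaR_0.95 = φ(1.6449)/0.05 = 2.0627):

```python
import numpy as np

rng = np.random.default_rng(5)

def var_cvar(losses, alpha=0.95):
    # Empirical VaR: the alpha-quantile of the loss distribution;
    # empirical CVaR: the mean loss in the tail at or beyond VaR
    var = np.quantile(losses, alpha)
    cvar = losses[losses >= var].mean()
    return var, cvar

losses = rng.normal(loc=0.0, scale=1.0, size=1_000_000)
var, cvar = var_cvar(losses)
print(var, cvar)
```

Minimizing CVaR (or constraining it) is what "moves the mean of the tail to the left" while leaving the rest of the distribution free.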

So what's my point?

- Modeling and optimization: model building is key!
- Economic models versus scientific/engineering models: philosophical differences in usage
- Many different optimization approaches to treat (model) uncertainties: how much do I know about the distribution of the data?
- Specific models needed for these applications
- Modeling systems (GAMS, AMPL, AIMMS) have had limited impact due to no common input/model format
- Stochastic model implementation and interfaces to these tools are needed