Metamodeling-Based Optimization


1 Metamodeling-Based Optimization. Erdem Acar, Postdoctoral Research Associate, Mississippi State University

2 Outline What is a metamodel? How do we construct a metamodel? Motivation for using metamodels in optimization. Different metamodel types: polynomial response surface, radial basis functions, kriging. Ensemble of metamodels for better accuracy and less volatility. Application problems. Concluding remarks.

3 What is a Metamodel? The design of advanced vehicular systems such as aircraft and automobiles requires high-fidelity analyses (numerically expensive FE or CFD simulations) to ensure a high level of accuracy. Even though computer processing power, memory, and storage space have increased drastically over the last 30 years, Venkataraman and Haftka* (2004) reported that analysis models of acceptable accuracy have continued to require at least six to eight hours of computer time (an overnight run), because the fidelity and complexity of the analysis models demanded by designers have increased in step. There is therefore a growing interest in replacing these slow, expensive, and often noisy simulations with smooth approximate models that produce fast results. These approximate models are referred to as metamodels or surrogate models. * S. Venkataraman and R.T. Haftka, 2004, "Structural optimization complexity: what has Moore's law done for us?", Structural and Multidisciplinary Optimization, Vol. 28, pp.

4 Metamodel construction A metamodel is a mathematical relationship between the design variables and a response, $\hat{y}(x) \approx f(x)$, constructed from a few simulations. Advantages: computationally inexpensive; easy coupling with optimization software; can filter out numerical noise and smooth over slope discontinuities; provides differentiable functions. Construction steps: (1) design of experiments; (2) run simulations at the selected points; (3) construct the metamodel (i.e., find the parameters of the mathematical function); (4) assess the quality of fit.
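
These four steps can be illustrated end to end with a minimal sketch (Python with NumPy assumed; the quadratic test function is a hypothetical stand-in for an expensive simulation):

```python
import numpy as np

rng = np.random.default_rng(0)

# 1. Design of experiments: choose sample locations in the design space
x = np.sort(rng.uniform(0.0, 10.0, size=12))

# 2. Run "simulations" at the selected points (a noisy toy stand-in for FE/CFD)
true_response = lambda x: 40.0 * x - 4.0 * x**2
y = true_response(x) + rng.normal(0.0, 5.0, size=x.shape)

# 3. Construct the metamodel: fit a quadratic polynomial by least squares
metamodel = np.poly1d(np.polyfit(x, y, deg=2))

# 4. Assess the quality of fit at independent test points
x_test = rng.uniform(0.0, 10.0, size=100)
rmse = np.sqrt(np.mean((metamodel(x_test) - true_response(x_test)) ** 2))
print(f"coefficients: {metamodel.c}, test RMSE: {rmse:.2f}")
```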

5 Design of experiments A design of experiments (DOE) is a sampling plan in the design space (i.e., the locations where we conduct simulations). Number of points: as a rule of thumb, a multiple of the number of coefficients of the fitting polynomial; adaptive sampling is an alternative. Considerations: reduce the influence of noise, of bias errors (an insufficient model), and of extrapolation. Types of DOEs: face-centered central composite design (FCCD), Latin hypercube sampling (LHS), orthogonal arrays (OA), D-optimal designs, and combinations (LHS+FCCD).
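
A minimal Latin hypercube sampler (a sketch in Python/NumPy; production DOE tools add space-filling criteria such as maximin distance):

```python
import numpy as np

def latin_hypercube(n_samples, n_vars, seed=None):
    """One point per equal-width stratum in each dimension, strata paired at random."""
    rng = np.random.default_rng(seed)
    # A random position inside each of the n_samples strata, per dimension
    u = (np.arange(n_samples)[:, None] + rng.uniform(size=(n_samples, n_vars))) / n_samples
    # Independently shuffle the stratum order in every dimension
    for j in range(n_vars):
        u[:, j] = u[rng.permutation(n_samples), j]
    return u  # points in [0, 1]^n_vars; scale to the actual design bounds as needed

print(latin_hypercube(10, 3, seed=42))
```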

6 Curse of dimensionality The number of coefficients of a fitting polynomial grows rapidly as the order of the polynomial and the number of design variables $N_v$ increase; for a full polynomial of order $p$ in $N_v$ variables it is $\binom{N_v + p}{p}$.

7 Motivation for the use in optimization Issues with simulations: discretization and convergence problems lead to numerical noise, and the computational cost is high because of the fidelity required by the analyst. Design optimization requires evaluating many designs, so the total cost becomes very high. The optimization process: identify the design variables x, objectives f, and constraints g; select models to evaluate (f, g) for designs x; optimize. Metamodeling-based methods replace the expensive evaluations in this loop.

8 Metamodeling-based optimization concept Define the optimization problem: min $f(x_1, x_2, \ldots, x_n)$ s.t. $g(x) \le 0$, $h(x) = 0$. Select a DOE and evaluate f(x), g(x), h(x) with numerical simulations; construct metamodels mapping $(x_1, x_2, \ldots, x_n)$ to (f, g, h); perform optimization on the metamodels; validate the result. If validation fails, refine the design space and repeat; otherwise stop.

9 Polynomial response surface approximation $\hat{y}(x) = b_0 + \sum_{i=1}^{N_v} b_i x_i + \sum_{i=1}^{N_v} b_{ii} x_i^2 + \sum_{i<j} b_{ij} x_i x_j$, fitted to observations modeled as $y = X\beta + \varepsilon$; the least-squares coefficient estimate is $\hat{b} = (X^T X)^{-1} X^T y$. Assumption: normally distributed, uncorrelated noise. In the presence of noise, the response surface may be more accurate than the observed data. Questions to ask: does the data have noise, and can the chosen polynomial order approximate the behavior? [Figure: sampling data points and the fitted response surface]
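
A sketch of a full quadratic PRS fit (Python with NumPy assumed; the two-variable test response is hypothetical, and `np.linalg.lstsq` replaces the explicit inverse $(X^T X)^{-1}$ for numerical stability):

```python
import numpy as np

def quadratic_design_matrix(X):
    """Columns: 1, x_i, x_i^2, x_i*x_j -- the full quadratic basis."""
    n, nv = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(nv)]
    cols += [X[:, i] ** 2 for i in range(nv)]
    cols += [X[:, i] * X[:, j] for i in range(nv) for j in range(i + 1, nv)]
    return np.column_stack(cols)

rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(20, 2))                 # sampled designs
y = 1 + 2 * X[:, 0] - X[:, 1] + 0.5 * X[:, 0] * X[:, 1]  # hypothetical response
y += rng.normal(0.0, 0.01, size=20)                      # small numerical noise

A = quadratic_design_matrix(X)
b, *_ = np.linalg.lstsq(A, y, rcond=None)                # least-squares estimate of beta
print("coefficients:", np.round(b, 3))                   # recovers 1, 2, -1, 0, 0, 0.5
```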

10 Kriging (KR) $\hat{y}(x) = \sum_{i=1}^{N_v} \beta_i f_i(x) + Z(x)$, a trend model plus a systematic departure $Z(x)$. Named after the South African mining engineer D. G. Krige. Assumptions: the systematic departures $Z(x)$ are correlated and the noise is small. The Gaussian correlation function $C(Z(x), Z(s), \theta) = \exp\left(-\sum_{i=1}^{N_v} \theta_i (x_i - s_i)^2\right)$ is the most popular choice. Computationally expensive for large problems (N > 200). [Figure: sampling data points, linear trend model, systematic departure, and the kriging fit]
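
A minimal ordinary-kriging sketch with the Gaussian correlation function above (Python/NumPy; $\theta$ is fixed by hand here, whereas a real implementation tunes it, e.g., by maximum likelihood, and a constant trend stands in for the general trend model):

```python
import numpy as np

def gaussian_corr(A, B, theta):
    """C(x, s, theta) = exp(-sum_i theta_i * (x_i - s_i)^2), for all pairs of rows."""
    d2 = (A[:, None, :] - B[None, :, :]) ** 2
    return np.exp(-np.sum(theta * d2, axis=2))

def kriging_fit(X, y, theta):
    R = gaussian_corr(X, X, theta) + 1e-10 * np.eye(len(X))  # tiny nugget for stability
    ones = np.ones(len(X))
    mu = ones @ np.linalg.solve(R, y) / (ones @ np.linalg.solve(R, ones))  # constant trend
    alpha = np.linalg.solve(R, y - mu)   # coefficients driving the systematic departure
    return mu, alpha

def kriging_predict(Xnew, X, theta, mu, alpha):
    r = gaussian_corr(Xnew, X, theta)    # correlations between new and sampled points
    return mu + r @ alpha                # trend + systematic departure

rng = np.random.default_rng(2)
X = rng.uniform(0.0, 1.0, size=(15, 2))
y = np.sin(6.0 * X[:, 0]) + X[:, 1] ** 2
theta = np.array([10.0, 10.0])
mu, alpha = kriging_fit(X, y, theta)
print(kriging_predict(X[:3], X, theta, mu, alpha), y[:3])  # interpolates the samples
```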

11 Radial basis functions (RBF) $\hat{f}(x) = \sum_{i=1}^{N} \lambda_i \phi(\| x - x_i \|)$, where the $\phi(r)$ are radially symmetric functions and the coefficients $\lambda$ are found from the linear system $[A]\{\lambda\} = \{f\}$ with $A_{ij} = \phi(\| x_j - x_i \|)$. Common choices: thin-plate spline $\phi(r) = r^2 \log(cr)$; Gaussian $\phi(r) = \exp(-cr^2)$; multiquadric $\phi(r) = \sqrt{r^2 + c^2}$; inverse multiquadric $\phi(r) = 1/\sqrt{r^2 + c^2}$, with $c > 0$ a shape parameter. Found to be good for modeling fast-changing responses; applied successfully to crashworthiness responses. [Figure: $\phi(r)$ for the thin-plate, Gaussian, multiquadric, and inverse multiquadric bases]
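
A sketch of multiquadric RBF interpolation via the $[A]\{\lambda\} = \{f\}$ system (Python/NumPy; the shape parameter c and the test response are assumptions):

```python
import numpy as np

def multiquadric(r, c=1.0):
    return np.sqrt(r**2 + c**2)

def rbf_fit(X, f, c=1.0):
    r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)  # pairwise distances
    A = multiquadric(r, c)                                     # A_ij = phi(||x_j - x_i||)
    return np.linalg.solve(A, f)                               # solve [A]{lambda} = {f}

def rbf_predict(Xnew, X, lam, c=1.0):
    r = np.linalg.norm(Xnew[:, None, :] - X[None, :, :], axis=2)
    return multiquadric(r, c) @ lam

rng = np.random.default_rng(3)
X = rng.uniform(-1.0, 1.0, size=(25, 2))
f = np.cos(3.0 * X[:, 0]) * X[:, 1]                  # fast-changing toy response
lam = rbf_fit(X, f)
print(np.max(np.abs(rbf_predict(X, X, lam) - f)))    # ~0: exact interpolation
```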

12 Some issues with metamodel selection There are many types of metamodels (PRS: polynomial response surface; KRG: kriging; RBF: radial basis functions; GP: Gaussian process), and it is difficult to know which is best for a given problem. The performance of a metamodel depends on the number of samples, the choice of sampling scheme (DOE), and the nature of the problem. Main issue: the best metamodel is problem- and DOE-dependent, and there is uncertainty in the predictions. Since metamodels are inexpensive to construct, can we do better by using several types of metamodels at once? Advantages: protection from choosing the wrong metamodel, and an estimate of the uncertainty in the predictions.

13 Weighted average model A weighted sum of the responses from $N_M$ metamodels: $\hat{y}_{WA}(x) = \sum_{i=1}^{N_M} w_i \hat{y}_i(x)$. The choice of the weights $w_i$ should reflect the confidence in each metamodel: an accurate metamodel receives a high weight.

14 Goel et al.'s parametric weighting strategy A parametric weighting strategy based on the error $E_i$ of each surrogate: $w_i^* = (E_i + \alpha E_{avg})^{\beta}$ with $E_{avg} = \frac{1}{N_{SM}} \sum_{i=1}^{N_{SM}} E_i$, normalized as $w_i = w_i^* / \sum_j w_j^*$, where $\beta < 0$ and $\alpha < 1$; here $\alpha = 0.05$. The averaging term protects against putting all the weight on one surrogate that happens to model the data points well. With $E_i$ = PRESS, this gives a PRESS-based weighted surrogate (PWS) combining PRS, kriging, and RBNN: $\hat{y}_{pws}(x) = w_{prs}\hat{y}_{prs}(x) + w_{krg}\hat{y}_{krg}(x) + w_{rbnn}\hat{y}_{rbnn}(x)$.
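
A sketch of the parametric weights (the value $\alpha = 0.05$ is from the slide; $\beta = -1$ is the value recommended in Goel et al.'s paper and is an assumption here, as are the example PRESS errors):

```python
import numpy as np

def goel_weights(press_errors, alpha=0.05, beta=-1.0):
    """w_i* = (E_i + alpha * E_avg)^beta, normalized so the weights sum to one."""
    E = np.asarray(press_errors, dtype=float)
    w_star = (E + alpha * E.mean()) ** beta   # small error -> large raw weight
    return w_star / w_star.sum()

# Hypothetical PRESS errors for PRS, kriging, and RBNN
w = goel_weights([0.30, 0.10, 0.20])
print(w)  # the most accurate metamodel (kriging here) gets the highest weight
```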

15 Acar et al.'s weighting strategy Using a parametric weighting strategy is not optimal: the choice of the parameters is experience-based and can change from one problem to another. Instead, we determine the weights by solving $\min_w \; \mathrm{Err}\{\hat{y}_{WA}(x), y_{act}(x)\}$ s.t. $\sum_{i=1}^{N_M} w_i = 1$, where $\hat{y}_{WA}(x) = \sum_{i=1}^{N_M} w_i \hat{y}_i(x)$. Example problems are provided later on.
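
A sketch of this weight-selection problem solved numerically (SciPy's SLSQP assumed; the cross-validation predictions standing in for $y_{act}$ are synthetic):

```python
import numpy as np
from scipy.optimize import minimize

def optimal_weights(Y_pred, y_true):
    """Minimize the RMSE of the weighted ensemble subject to sum(w) = 1.

    Y_pred: (n_points, n_models) cross-validation predictions of each metamodel
    y_true: (n_points,) observed responses at the same points
    """
    n_models = Y_pred.shape[1]
    err = lambda w: np.sqrt(np.mean((Y_pred @ w - y_true) ** 2))
    res = minimize(err, x0=np.full(n_models, 1.0 / n_models),
                   constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
                   method="SLSQP")
    return res.x

# Hypothetical cross-validation predictions of three metamodels
rng = np.random.default_rng(4)
y = rng.normal(size=30)
Y_pred = y[:, None] + rng.normal(0.0, [0.1, 0.3, 0.2], size=(30, 3))
print(np.round(optimal_weights(Y_pred, y), 3))  # the least-noisy model dominates
```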

16 Accuracy of metamodels How do we determine whether surrogate models are accurate enough to be used for analysis and optimization? The error at test points should be small compared to the response, but this check is limited by the cost of simulations. Cross-validation uses the data itself to estimate errors, e.g., the leave-one-out error or PRESS. For polynomials, we also have the adjusted coefficient of determination $R^2_{adj}$ and the root mean square error at the data points. Pointwise measures such as the prediction variance and the mean square error give an estimate of the approximation error.
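
A generic leave-one-out PRESS computation (a sketch; `fit` and `predict` are placeholders matching the signatures of the RBF helpers sketched earlier):

```python
import numpy as np

def press(X, y, fit, predict):
    """Leave-one-out PRESS: refit without point i, predict point i, sum squared errors."""
    n = len(y)
    errors = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        model = fit(X[mask], y[mask])
        y_hat = predict(X[i:i + 1], X[mask], model)
        errors[i] = float(y_hat[0]) - y[i]
    return float(np.sum(errors ** 2))

# Example with the RBF interpolator sketched earlier (names assumed from that sketch):
# press_rbf = press(X, f, fit=rbf_fit, predict=rbf_predict)
```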

17 Major issues in MBO The success of MBO depends on the accuracy of the surrogate models, $f(x) \approx \hat{f}(x)$. Sources of uncertainty in surrogate model predictions: noise in the data; the accuracy of the numerical models and simulations; the sampling strategy; the choice of surrogate model (e.g., kriging vs. polynomial response surface); and the error prediction capability.

18 Application problems (1) Automotive design problem: design for crashworthiness; design of the side rails of an automobile. (2) Redesign of an Al 356 cast control arm: the baseline design is based on experience; redesign using optimization techniques. (3) Ensemble of metamodels: application to benchmark functions and to problem (1).

19 (1) Automotive design problem

20 Optimization problem and design variables Minimize Mass(Y) such that the intrusion distances(Y) $\le$ the distance allowables and the accelerations(Y) $\le$ the acceleration allowables, with Y = {DV1-4, thickness}.

21 Metamodels

22 Optimization results Single-objective and multiobjective optimization problems were considered, with a compound objective of the form $f_c(X,Y) = \sum_{k=1}^{6} w_k \left[ \frac{A_k(X,Y) - A_{tk}(X,Y)}{A_{tk}(X,Y)} \right]^2$.

23 (2) Optimization of an Al 356 cast control arm Sizing and shape optimization of the control arm of a Corvette, made of Al 356 and designed for panic-brake and pothole conditions. A total of 13 design variables (Y1-13) control the size and shape; GENESIS is used to generate the perturbed mesh.

24 Optimization problem Min W(Y) s.t. $g(Y) \le 0$, $Y_L \le Y \le Y_U$: design for minimum weight. The choice of material model determines the constraints: with a multiscale material model, $g(Y) = D(Y) - D_{cr} \le 0$; with a plasticity model, $g(Y) = \sigma_{vm}(Y) - \sigma_{cr} \le 0$. Calculating D(Y) or $\sigma_{vm}(Y)$ requires expensive FEA, so using metamodels for W(Y) and g(Y) alleviates the computational burden.

25 Metamodels PRS, RBF, GP, KR, FFNN, SVR

26 Optimization results Min W(Y) s.t. $g(Y) \le 0$, $Y_L \le Y \le Y_U$. The optimization problem is solved with the MATLAB fmincon function, which uses sequential quadratic programming (SQP). With the multiscale material model: normalized weight = , max von Mises = MPa, damage (pred.) = 0.01, damage (actual) = . Without the multiscale material model: normalized weight = , max von Mises (pred.) = MPa, max von Mises (actual) = MPa, damage = .
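
An analogous sketch with open tools, where SciPy's SLSQP plays the role of fmincon's SQP and cheap analytic stand-ins replace the W(Y) and g(Y) metamodels (both stand-ins are hypothetical):

```python
import numpy as np
from scipy.optimize import minimize

# Stand-ins for the metamodels W(Y) and g(Y) = D(Y) - D_cr (hypothetical forms)
W_hat = lambda Y: Y[0] + 2.0 * Y[1]        # weight metamodel
g_hat = lambda Y: 0.02 - Y[0] * Y[1]       # damage margin: feasible when g_hat <= 0

# SciPy's inequality convention is fun(Y) >= 0, hence the sign flip on g_hat
res = minimize(W_hat, x0=np.array([0.5, 0.5]),
               method="SLSQP",
               bounds=[(0.1, 1.0), (0.1, 1.0)],   # Y_L <= Y <= Y_U
               constraints=[{"type": "ineq", "fun": lambda Y: -g_hat(Y)}])
print(res.x, W_hat(res.x), g_hat(res.x))          # constraint active at the optimum
```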

27 (3) Ensemble of metamodels In complex engineering problems there is more than one response of interest. Example: in crashworthiness problems, the cost (or structural weight), the intrusion distances and accelerations at different locations, and the loads transmitted to the passengers. Different types of metamodels are best for different responses: PRS is best for mass, RBF is best for the floor-pan displacement at FFI, and SVR is best for the floor-pan displacement at OFI. As new information arrives, the relative accuracy of the metamodels can change. Hence, relying on a single metamodel is risky!

28 Evaluation of accuracy of metamodels Normalized errors of the individual metamodels (PRS, RBF, KR, GP, SVR) and of three ensembles are compared on the Branin-Hoo and Hartman benchmark functions and on the CRASH problem (ACC_DS_FFI response). E1: ensemble based on simple averaging; E2: parametric ensemble proposed by Goel et al. (2006); E3: our proposed ensemble. [Table/figure: normalized error values for each metamodel and ensemble]

29 Ensemble of metamodels: more results [Figures: normalized errors of PRS, RBF, KR, GP, SVR, E1, E2, and E3 on the CRASH ACC_DS_FFI response and on the Branin-Hoo function]

30 Concluding remarks The motivation for the use of metamodels in a design optimization framework was given; different metamodel types and construction procedures were discussed; the concept of an ensemble of metamodels was presented to reduce the volatility of metamodel predictions while increasing accuracy; several application problems illustrated the MBO approach; and the accuracy and efficiency of MBO were demonstrated.

31 Thank you!

32 Gaussian Process (GP) Main assumption: the joint probability distribution of the responses at the N sampled points, $f_N = \{f(x^n)\}_{n=1}^{N}$, is Gaussian: $P(f_N \mid C_N, X_N) = \frac{1}{\sqrt{(2\pi)^N |C_N|}} \exp\left[-\frac{1}{2}(f_N - \mu)^T C_N^{-1} (f_N - \mu)\right]$, where $C_N$ is the covariance matrix with elements $C_{ij}$. The prediction at the (N+1)th point is $\hat{f}(x) = k^T C_N^{-1} f_N$ with $k = [C_{1,N+1}, \ldots, C_{N,N+1}]$. In interpolation mode, $C_{ij} = \theta_1 \exp\left[-\frac{1}{2} \sum_{l=1}^{L} \frac{(x_l^i - x_l^j)^2}{r_l^2}\right] + \theta_2$; in regression mode, $C_{ij} = \theta_1 \exp\left[-\frac{1}{2} \sum_{l=1}^{L} \frac{(x_l^i - x_l^j)^2}{r_l^2}\right] + \theta_2 + \theta_3 \delta_{ij}$, where $\theta_3$ eliminates noise. The hyperparameters $(\theta_i, r_l)$ are obtained via optimization, e.g., by maximizing the marginal likelihood $L = -\frac{1}{2} \log |C_N| - \frac{1}{2} f_N^T C_N^{-1} f_N - \frac{N}{2} \log 2\pi$. Good for modeling nonlinear responses, and good for eliminating noise in regression mode.
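
A sketch of GP prediction with the covariance above in regression mode (Python/NumPy; the hyperparameters are fixed by hand here rather than obtained by maximizing the likelihood):

```python
import numpy as np

def cov(A, B, theta1, r, theta2):
    """C_ij = theta1 * exp(-0.5 * sum_l (x_l^i - x_l^j)^2 / r_l^2) + theta2."""
    d2 = (A[:, None, :] - B[None, :, :]) ** 2
    return theta1 * np.exp(-0.5 * np.sum(d2 / r**2, axis=2)) + theta2

def gp_predict(Xnew, X, f, theta1, r, theta2, theta3):
    C = cov(X, X, theta1, r, theta2) + theta3 * np.eye(len(X))  # theta3: noise term
    k = cov(Xnew, X, theta1, r, theta2)                         # k = [C_{1,N+1}, ...]
    return k @ np.linalg.solve(C, f)                            # f_hat = k^T C_N^{-1} f_N

rng = np.random.default_rng(5)
X = rng.uniform(0.0, 1.0, size=(30, 1))
f = np.sin(4.0 * X[:, 0]) + rng.normal(0.0, 0.05, size=30)      # noisy nonlinear response

f_hat = gp_predict(np.array([[0.5]]), X, f,
                   theta1=1.0, r=np.array([0.3]), theta2=0.1, theta3=0.01)
print(f_hat, np.sin(2.0))                                       # prediction vs. noise-free value
```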

33 Support Vector Regression (SVR) In linear SVR the prediction is $f(x) = \langle w, x \rangle + b$, where $\langle \cdot, \cdot \rangle$ is the dot product. We want the prediction to be as flat as possible: $\min \frac{1}{2}\| w \|^2$ s.t. $y_i - \langle w, x_i \rangle - b \le \varepsilon$ and $\langle w, x_i \rangle + b - y_i \le \varepsilon$. The soft formulation adds slack variables: $\min \frac{1}{2}\| w \|^2 + C \sum_{i=1}^{l} (\xi_i + \xi_i^*)$ s.t. $y_i - \langle w, x_i \rangle - b \le \varepsilon + \xi_i$, $\langle w, x_i \rangle + b - y_i \le \varepsilon + \xi_i^*$, $\xi_i, \xi_i^* \ge 0$. Writing the Lagrangian, writing the KKT conditions, and substituting from the KKT conditions into the Lagrangian gives the dual form: $\max -\frac{1}{2} \sum_{i,j=1}^{l} (\alpha_i - \alpha_i^*)(\alpha_j - \alpha_j^*) \langle x_i, x_j \rangle - \varepsilon \sum_{i=1}^{l} (\alpha_i + \alpha_i^*) + \sum_{i=1}^{l} y_i (\alpha_i - \alpha_i^*)$ s.t. $\sum_{i=1}^{l} (\alpha_i - \alpha_i^*) = 0$ and $\alpha_i, \alpha_i^* \in [0, C]$, with the prediction $f(x) = \sum_{i=1}^{l} (\alpha_i - \alpha_i^*) \langle x_i, x \rangle + b$. In nonlinear regression, the dot product is replaced with a kernel function (e.g., the Gaussian kernel). Based on the generalized portrait algorithm developed in Russia (Vapnik and Lerner, 1963; Vapnik and Chervonenkis, 1963, 1974).
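
Rather than re-deriving the dual, a sketch using scikit-learn's SVR (assuming that library is available; its RBF kernel is the Gaussian-kernel substitution described above, and the test response is hypothetical):

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(6)
X = rng.uniform(-1.0, 1.0, size=(60, 2))   # inputs scaled to [-1, 1]
y = np.exp(-np.sum(X**2, axis=1)) + rng.normal(0.0, 0.02, size=60)

# epsilon: width of the insensitive tube; C: penalty on the slack variables xi
model = SVR(kernel="rbf", C=10.0, epsilon=0.05).fit(X, y)
print(model.predict(X[:3]), y[:3])
```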

34 Metamodeling of Nonlinear and Noisy Functions When the critical response is noisy, optimization using the exact simulation results cannot be pursued; metamodeling techniques help to wipe out the noise (e.g., polynomial response surface approximations). However, when the response is also nonlinear, more advanced metamodeling techniques need to be used. In the control arm problem, damage shows exactly this behavior: it is a function of many nonlinear and interacting terms, its calculation via FEA makes it prone to numerical noise due to the effect of mesh distortion, and some design variables are found to have a nonlinear and noisy influence on it. Six different metamodels were investigated for modeling such nonlinear and noisy functions: polynomial response surface, radial basis functions, kriging, Gaussian process, feed-forward neural networks, and support vector regression. The Gaussian process was found to be the best: GP has a regression mode, alongside its interpolation mode, that filters the noise. The best metamodel still has a 29% error; however, at the optimum, the damage prediction is more accurate and also conservative: damage(pred.) = 0.01, damage(actual) = . [Figures: nonlinear and noisy behavior of damage versus design variables; % error of each metamodel]

35 Journal papers submitted or in the pipeline We used the metamodeling-based optimization concept in the following papers: 1. Rais-Rohani, M., Solanki, K., Acar, E., Eamon, C., "Reliability-Based Design Optimization of Automotive Structures under Crash Loads". 2. Solanki, K., Acar, E., Rais-Rohani, M., Eamon, C., and Horstemeyer, M.F., "Reliability-Based Structural Optimization Using a Multiscale Material Model". 3. Acar, E., Solanki, K., Rais-Rohani, M., "Metamodeling of Nonlinear and Noisy Functions". 4. Acar, E., and Rais-Rohani, M., "Enhanced Surrogate Modeling via Optimum Ensemble of Metamodels". 5. Acar, E., and Solanki, K., "Improving Accuracy of Vehicle Crashworthiness Response Predictions Using Ensemble of Metamodels". Acknowledgements: Dr. Rais-Rohani (Aerospace Engineering Department, Mississippi State University); Kiran Solanki (Center for Advanced Vehicular Systems, Mississippi State University); Dr. Eamon, Ali, Mohammad, Bulakorn (CE, AE, Mississippi State University).

36 (3) Probabilistic Product-Process Design Optimization Objective: the efficiency of a product depends on its macrostructural design variables as well as on its microstructural features; the goal is to combine shape and sizing optimization with multiscale analysis for reduced weight, increased energy absorption, reduced acceleration, and manufacturability (a reduced rejection rate). Example problem: design of a stamped side rail. Solution procedure: define the design variables DVs and DVm and the random variables RV; run a design of experiments over (DVs, DVm, RV); perform ABAQUS stamping analysis, extract the manufacturing responses (e.g., springback, localized thinning), and construct the metamodel Fm(DVs, DVm, RV); update the element damage indices, perform ABAQUS crash analysis, extract the structural responses (e.g., weight, energy absorption), and construct the metamodels Fs(DVs, DVm, RV) and W(DVs, DVm); then solve the probabilistic optimization problem. The multiobjective formulation trades the structural objective f1(x) against the manufacturing objective f2(x) along a Pareto frontier. Accomplishments: the product is optimized along with the process, covering structural responses (e.g., weight, crashworthiness) and manufacturability (e.g., springback, localized thinning); ABAQUS finite element models have been generated; and I/O collaboration between different software packages has been established (e.g., the NESSUS-ABAQUS interaction).
