AASS, Örebro University


1 Mobile Robotics Achim J. Lilienthal Mobile Robotics and Olfaction Lab, AASS, Örebro University 1

2 Contents 1. Gas-Sensitive Mobile Robots in the Real World 2. Gas Dispersal in Natural Environments 3. Gas Quantification with MOX Sensors in an OSS o Gaussian Processes (GP)» Weight-Space View Bayesian Linear Regression» Weight-Space View Bayesian Non-Linear Regression» Function-Space View» Examples, Drawing Samples from a GP» Predictions with Noise-Free Observations» Predictions with Noisy Observations 3. Gas Quantification with MOX Sensors in an OSS Part 2 4. Gas Dispersal Simulation 5. Literature 2

3 1 Gas-Sensitive Mobile Robots in the Real World 3

4 1. Gas-Sensitive Mobile Robots Basic Idea o to combine autonomous robots with... o gas sensing technology ("electronic nose") and o... possibly other relevant sensors [figure: robot + e-nose + other sensors = gas-sensitive mobile robot] 4

5 1. Gas-Sensitive Mobile Robots Covered in this Lecture Chemical Sensing o under water o airborne (land, air) Airborne Chemical Sensing with Mobile Robots o first publications at the beginning of the 1990s 5

6 1. Gas-Sensitive Mobile Robots Basic Idea o autonomous robots + gas sensors + other relevant sensors 6

7 1. Gas-Sensitive Mobile Robots Basic Idea o autonomous robots + gas sensors + other relevant sensors Applications: "Smelling" Robots for... o pollution monitoring o security applications (detecting leaks, for example) o non-military surveillance, search and rescue o automatic humanitarian demining,... Basic Research: Learn More... o about the physics of gas distribution o about smell-related biological solutions (homing, communication,...) 7

8 1. Gas-Sensitive Mobile Robots Subtasks in Mobile Robot Olfaction o gas detection (gas finding)» detecting an increased concentration of a target gas 8

9 1. Gas-Sensitive Mobile Robots Subtasks in Mobile Robot Olfaction o gas detection (gas finding) o "odour" discrimination and concentration estimation» sensor calibration» pattern recognition 9

10 1. Gas-Sensitive Mobile Robots Subtasks in Mobile Robot Olfaction o gas detection (gas finding) o "odour" discrimination and concentration estimation o gas source tracking» following the cues from the sensed gas distribution (possibly also using other sensor modalities) towards the source [Lilienthal et al. 2006] 10

11 1. Gas-Sensitive Mobile Robots Subtasks in Mobile Robot Olfaction o gas detection (gas finding) o "odour" discrimination and concentration estimation o gas source tracking o gas source declaration» determining the certainty that the source has been found 11

12 1. Gas-Sensitive Mobile Robots Subtasks in Mobile Robot Olfaction o gas detection (gas finding) o "odour" discrimination and concentration estimation o gas source tracking o gas source declaration» detection, tracking and declaration together: gas source localisation 12

13 1. Gas-Sensitive Mobile Robots Subtasks in Mobile Robot Olfaction o gas detection (gas finding) o "odour" discrimination and concentration estimation o gas source tracking o gas source declaration o trail guidance» trail following [Lilienthal et al. 2006] 13

14 1. Gas-Sensitive Mobile Robots Subtasks in Mobile Robot Olfaction o gas detection (gas finding) o "odour" discrimination and concentration estimation o gas source tracking o gas source declaration o trail guidance» trail following» trail avoiding strategies [Lilienthal et al. 2006] 14

15 1. Gas-Sensitive Mobile Robots Subtasks in Mobile Robot Olfaction o gas detection (gas finding) o "odour" discrimination and concentration estimation o gas source tracking o gas source declaration o trail guidance» trail following» trail avoiding strategies o gas distribution modelling (gas distribution mapping) 15

16 1. Gas-Sensitive Mobile Robots From "Electronic Nose" to "Mobile Nose" o space, power, weight restrictions o varying environmental conditions (temperature, humidity,...) o open sampling system» direct exposure of gas sensors to the environment» less controlled gas sampling» typically continuous sampling 16

17 1. Gas-Sensitive Mobile Robots Differences to Other Sensors in Mobile Robotics o point measurement» sensitive sensor surface is typically small (often < 1 cm²) 17

18 1. Gas-Sensitive Mobile Robots Differences to Other Sensors in Mobile Robotics o point measurement» sensitive sensor surface is typically small (often < 1 cm²)» effective sampling region also small and approx. spherical (fan) 18

19 1. Gas-Sensitive Mobile Robots Differences to Other Sensors in Mobile Robotics o point measurement o sensor dynamics» long response time, very long recovery time 19

20 1. Gas-Sensitive Mobile Robots Differences to Other Sensors in Mobile Robotics o point measurement o sensor dynamics o calibration» complicated "sensor response ↔ concentration" relation» dependent on other variables (temperature, humidity,...)» has to consider sensor dynamics» variation between individual sensors» long-term drift 20

21 1. Gas-Sensitive Mobile Robots Differences to Other Sensors in Mobile Robotics o point measurement o sensor dynamics o calibration o very dynamic reality» constantly changing chaotic concentration field» real-time gas distribution mapping» changes at different time-scales: rapid fluctuations; slow changes of the overall structure of the average distribution 21

22 2 Gas Dispersal in Natural Environments 22

23 2. Gas Dispersal in Natural Environments Chaotic Gas Dispersal o diffusion [Smyth and Moum 2001] 23

24 2. Gas Dispersal in Natural Environments Chaotic Gas Dispersal o diffusion [Smyth and Moum 2001] 24

25 2. Gas Dispersal in Natural Environments Chaotic Gas Dispersal o diffusion o advective transport o turbulent transport [Smyth and Moum 2001] 25

26 2. Gas Dispersal in Natural Environments Chaotic Gas Dispersal o diffusion o advective transport o turbulent transport [Smyth and Moum 2001] 26

27 2. Gas Dispersal in Natural Environments Chaotic Gas Dispersal o diffusion o advective transport o turbulent transport video courtesy of Hiroshi Ishida 27

28 2. Gas Dispersal in Natural Environments Chaotic Gas Dispersal data collected by Yuichiro Fukazawa, Marco Trincavelli, Yuta Wada and Hiroshi Ishida 29

29 2. Gas Dispersal in Natural Environments Chaotic Gas Dispersal data collected by Yuichiro Fukazawa, Marco Trincavelli, Yuta Wada and Hiroshi Ishida 31

30 2. Gas Dispersal in Natural Environments Chaotic Gas Dispersal data collected by Yuichiro Fukazawa, Marco Trincavelli, Yuta Wada and Hiroshi Ishida 32

31 2. Gas Dispersal in Natural Environments Chaotic Gas Dispersal data collected by Yuichiro Fukazawa, Marco Trincavelli, Yuta Wada and Hiroshi Ishida 33

32 2. Gas Dispersal in Natural Environments Turbulent Flow o caused by shear forces» shear forces e.g. through obstacles o shear forces generate torque → vortices» strong viscous forces can prevent vortices from forming (rotational energy → heat) o Reynolds number describes the balance between viscous and shear forces 35

33 2. Gas Dispersal in Natural Environments Turbulent Flow o caused by shear forces» shear forces e.g. through obstacles o shear forces generate torque → vortices» strong viscous forces can prevent vortices from forming (rotational energy → heat) o Reynolds number describes the balance between viscous and shear forces Re = V·L / ν 36

34 2. Gas Dispersal in Natural Environments Turbulent Flow o caused by shear forces» shear forces e.g. through obstacles o shear forces generate torque → vortices» strong viscous forces can prevent vortices from forming (rotational energy → heat) o Reynolds number describes the balance between viscous and shear forces Re = V·L / ν (V: rms fluctuation of the velocity at one point) 37

35 2. Gas Dispersal in Natural Environments Turbulent Flow o caused by shear forces» shear forces e.g. through obstacles o shear forces generate torque → vortices» strong viscous forces can prevent vortices from forming (rotational energy → heat) o Reynolds number describes the balance between viscous and shear forces Re = V·L / ν (L: integral scale = correlation length of the velocity field; relates to dimensions of available space) 38

36 2. Gas Dispersal in Natural Environments Turbulent Flow o caused by shear forces» shear forces e.g. through obstacles o shear forces generate torque → vortices» strong viscous forces can prevent vortices from forming (rotational energy → heat) o Reynolds number describes the balance between viscous and shear forces Re = V·L / ν (ν: kinematic viscosity, ≈ 1.5 × 10⁻⁵ m²/s for air at 20 °C) 39

37 2. Gas Dispersal in Natural Environments Turbulent Flow o caused by shear forces» shear forces e.g. through obstacles o shear forces generate torque → vortices» strong viscous forces can prevent vortices from forming (rotational energy → heat) o Reynolds number describes the balance between viscous and shear forces Re = V·L / ν» laminar flow for Re < 2000» turbulent flow for Re > 10000» laminar or turbulent in between 40
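
The Reynolds-number computation above is easy to sketch in code. A minimal Python example, assuming the thresholds quoted on the slide and the kinematic viscosity of air at 20 °C (≈ 1.5 × 10⁻⁵ m²/s); the function names and example values are illustrative, not part of the slides:

```python
def reynolds_number(v_rms, length_scale, nu=1.5e-5):
    """Re = V * L / nu, with V the rms velocity fluctuation [m/s],
    L the integral length scale [m] and nu the kinematic viscosity
    [m^2/s] (default: roughly air at 20 degrees C)."""
    return v_rms * length_scale / nu

def flow_regime(re):
    # Regime thresholds as quoted on the slide.
    if re < 2000:
        return "laminar"
    if re > 10000:
        return "turbulent"
    return "laminar or turbulent (transition region)"

# Illustrative example: 0.1 m/s velocity fluctuations, 1 m of available space.
re = reynolds_number(v_rms=0.1, length_scale=1.0)
print(re, flow_regime(re))  # ~6667 -> "laminar or turbulent (transition region)"
```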

38 2. Gas Dispersal in Natural Environments Turbulent Flow o caused by shear forces» shear forces e.g. through obstacles o shear forces generate torque → vortices» strong viscous forces can prevent vortices from forming (rotational energy → heat) o energy dissipates from larger to smaller eddies o smallest eddy size given by the Kolmogorov scale 41

39 2. Gas Dispersal in Natural Environments Turbulent Flow o caused by shear forces» shear forces e.g. through obstacles o shear forces generate torque → vortices» strong viscous forces can prevent vortices from forming (rotational energy → heat) o energy dissipates from larger to smaller eddies o smallest eddy size given by the Kolmogorov scale L_K = (ν³ / ε)^(1/4) (ε: rate at which energy is passed from larger to smaller vortices; in simple cases ε ≈ V³ / L) 42
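
Combining the two formulas gives a rough estimate of the smallest eddies. A sketch under the slide's simplification ε ≈ V³/L; values are again illustrative:

```python
def kolmogorov_scale(v_rms, length_scale, nu=1.5e-5):
    """Smallest eddy size L_K = (nu**3 / eps)**(1/4), approximating the
    energy dissipation rate as eps ~ V**3 / L (simple cases, as on the
    slide)."""
    eps = v_rms**3 / length_scale
    return (nu**3 / eps) ** 0.25

# Same illustrative indoor airflow as above: L_K is on the order of mm.
print(kolmogorov_scale(v_rms=0.1, length_scale=1.0))  # ~1.4e-3 m
```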

40 2. Gas Dispersal in Natural Environments Turbulent Flow o caused by shear forces» shear forces e.g. through obstacles o shear forces generate torque → vortices» strong viscous forces can prevent vortices from forming (rotational energy → heat) o energy dissipates from larger to smaller eddies o smallest eddy size given by the Kolmogorov scale o lower limit of vortex size» layer of laminar flow next to surfaces of solid objects» chemical compounds can only pass through this laminar layer by diffusion 43

41 2. Gas Dispersal in Natural Environments Turbulent Flow Characteristics [Roberts, Webster 2000] o turbulent flow is chaotic/unpredictable» the instantaneous velocity/concentration field is generally insufficient to predict the field some time later o high degree of vortical motion» large-scale eddies cause a meandering dispersal» small-scale eddies stretch and twist the gas distribution, resulting in a complicated patchy structure o turbulent transport is much faster than molecular diffusion» gaseous ethanol at 25 °C and 1 atm: diffusion constant: ... cm²/s, diffusion velocity: 20.7 cm/h 44

42 3 Gas Quantification with an Array of MOX Sensors in an Open Sampling System 46

43 3. Gas Quantification with MOX Sensors in Open Sampling Configuration Gas Quantification in an Open Sampling System o challenge 1: highly dynamic characteristic of turbulent airflow Gas Quantification with MOX Sensors o challenge 2: slow dynamics of the MOX sensors [figures: traditional three-phase sampling vs. continuous (open) sampling with a mobile robot] 47

44 3. Gas Quantification with MOX Sensors in Open Sampling Configuration Gas Quantification in an Open Sampling System o challenge 1: highly dynamic characteristic of turbulent airflow Gas Quantification with MOX Sensors o challenge 2: slow dynamics of the MOX sensors» steady state virtually never reached (!)» we must use transients 48

45 3. Gas Quantification with MOX Sensors in Open Sampling Configuration Gas Quantification in an Open Sampling System o challenge 1: highly dynamic characteristic of turbulent airflow Gas Quantification with MOX Sensors o challenge 2: slow dynamics of the MOX sensors» steady state virtually never reached (!)» we must use transients» we cannot use a deterministic calibration function that requires steady-state information (g_i: steady-state conductance of sensor i, c(t): gas concentration, A_i, α_i: parameters of an exponential model) 49
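
The calibration function itself did not survive transcription; a common steady-state MOX model consistent with the legend (parameters A_i, α_i) is the power-law form sketched below. This is an assumption about the slide's exact formula, not a definitive reconstruction:

```python
import numpy as np

def steady_state_conductance(c, A_i, alpha_i):
    """Assumed power-law steady-state model: g_i(c) = A_i * c**alpha_i.
    It maps concentration to conductance only once the sensor has
    settled, a state an open sampling system virtually never reaches,
    which is why a deterministic calibration function cannot be used."""
    return A_i * np.power(c, alpha_i)
```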

46 3. Gas Quantification with MOX Sensors in Open Sampling Configuration A Probabilistic Approach to MOX-OSS Gas Quantification o MOX-OSS: MOX sensors in an Open Sampling System o key idea: relate measurements with a posterior over concentrations find p(c | r_1,..., r_N) 50

47 3. Gas Quantification with MOX Sensors in Open Sampling Configuration A Probabilistic Approach to MOX-OSS Gas Quantification o MOX-OSS: MOX sensors in an Open Sampling System o key idea: relate measurements with a posterior over concentrations» more general (consider non-chemosensor measurements) find p(c | r_1,..., r_N, T, r_H) 51

48 3. Gas Quantification with MOX Sensors in Open Sampling Configuration A Probabilistic Approach to MOX-OSS Gas Quantification o MOX-OSS: MOX sensors in an Open Sampling System o key idea: relate measurements with a posterior over concentrations» more general (consider last k+1 readings) find p(c(t_n) | r^1_{n-k:n},..., r^N_{n-k:n}, T_{n-k:n}, r^H_{n-k:n}) 52

49 3. Gas Quantification with MOX Sensors in Open Sampling Configuration A Probabilistic Approach to MOX-OSS Gas Quantification o MOX-OSS: MOX sensors in an Open Sampling System o key idea: relate measurements with a posterior over concentrations» here find p(c(t_n) | r^1_{n-k:n},..., r^N_{n-k:n}, T_{n-k:n}, r^H_{n-k:n}) 53

50 3. Gas Quantification with MOX Sensors in Open Sampling Configuration A Probabilistic Approach to MOX-OSS Gas Quantification o MOX-OSS: MOX sensors in an Open Sampling System o key idea: relate measurements with a posterior over concentrations» has hardly been done so far!» previous works with MOX-OSS typically neglect the problem!! find p(c(t_n) | r^1_{n-k:n},..., r^N_{n-k:n}, T_{n-k:n}, r^H_{n-k:n}) 54

51 3. Gas Quantification with MOX Sensors in Open Sampling Configuration A Probabilistic Approach to MOX-OSS Gas Quantification o MOX-OSS: MOX sensors in an Open Sampling System o key idea: relate measurements with a posterior over concentrations o approach using Gaussian Processes [Monroy et al. 2012]» uncertainty in the MOX conductance → concentration mapping due to gas transport mechanisms, inherent sensor dynamics, and environmental factors such as temperature or humidity (not explicitly addressed here) 55

52 3. Gas Quantification with MOX Sensors in Open Sampling Configuration A Probabilistic Approach to MOX-OSS Gas Quantification o MOX-OSS: MOX sensors in an Open Sampling System o key idea: relate measurements with a posterior over concentrations o approach using Gaussian Processes [Monroy et al. 2012] o use PID readings as ground truth» we know there is only one compound present (+ sensor responds quickly)» assumption: MOX sensors and PID are close enough so that they are exposed to the same concentrations data collected by Yuichiro Fukazawa, Marco Trincavelli, Yuta Wada and Hiroshi Ishida 56

53 3. Gas Quantification with MOX Sensors in Open Sampling Configuration A Probabilistic Approach to MOX-OSS Gas Quantification o MOX-OSS: MOX sensors in an Open Sampling System o key idea: relate measurements with a posterior over concentrations o approach using Gaussian Processes [Monroy et al. 2012] o use PID readings as ground truth o specific research questions» which sensors contribute most?» how much do we gain from using multiple sensors?» how much do we gain from using previous sensor readings? 57

54 3. Gas Quantification with MOX Sensors in Open Sampling Configuration Related Work o calibration on steady-state values» multivariate linear regression methods: Principal Component Regression (PCR), Partial Least Squares Regression (PLSR)» non-linear methods: Artificial Neural Networks (ANN)» kernel methods: Support Vector Regression (SVR) o estimation of steady-state values based on transients C. Di Natale, S. Marco, F. Davide, A. D'Amico. Sensor-array calibration time reduction by dynamic modelling. Sensors and Actuators B: Chemical 25 (1-3) (1995) o continuous recalibration on integral values» hourly regression using an ANN» use hourly average of conventional monitoring station with spectrometer as ground truth (compare to hourly average of sensor response) S. De Vito, E. Massera, M. Piga, L. Martinotto, G. Di Francia. On field calibration of an electronic nose for benzene estimation in an urban pollution monitoring scenario. Sensors and Actuators B: Chemical 129 (2) (2008)

55 3. Gas Quantification with MOX Sensors in Open Sampling Configuration Related Work in MRO? o authors typically work on conductance directly o only [Ishida 1998] proposes using sensor calibration based on steady-state values as a rough approximation for an OSS o no method considers the transient nature of the signal o no probabilistic method! [Ishida 1998] H. Ishida, T. Nakamoto, T. Moriizumi. Remote sensing of gas/odor source location and concentration distribution using mobile system. Sensors and Actuators B: Chemical 49 (1-2) (1998)

56 3. Gas Quantification with MOX Sensors in Open Sampling Configuration GP Approach to "Probabilistic Calibration" for MOX-OSS o signal conditioning» compensate for long-term drift: r_i(t) = R_i(t) / R_i(0)» alt 1: learn GP on r_i(t)» alt 2: learn GP on log(r_i(t)) 61
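
A minimal sketch of this conditioning step in Python; the helper name is hypothetical:

```python
import numpy as np

def condition_signal(R, log_transform=False):
    """Compensate long-term drift by normalizing with the initial value,
    r_i(t) = R_i(t) / R_i(0); optionally return log(r_i(t)) (alt 2)."""
    r = np.asarray(R, dtype=float)
    r = r / r[0]
    return np.log(r) if log_transform else r
```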

57 3. Gas Quantification with MOX Sensors in Open Sampling Configuration GP Approach to "Probabilistic Calibration" for MOX-OSS o signal conditioning o Gaussian Process (GP) learning 62
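
What such a "probabilistic calibration" could look like with an off-the-shelf GP library is sketched below. This is not the exact setup of [Monroy et al. 2012]; the feature layout (conditioned readings of N sensors as inputs, PID readings as ground-truth targets) and all data here are placeholders:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.random((200, 3))                        # placeholder: conditioned MOX features
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=200)  # placeholder: PID ground truth

kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# The GP yields a full posterior over concentration, not a point estimate:
c_mean, c_std = gp.predict(X[:5], return_std=True)
```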

58 + Gaussian Processes (GP) following [Rasmussen, Williams 2006] using own slides refined by Marco Trincavelli 63

59 + Gaussian Processes What is a Gaussian Process? o stochastic process: generalization of a probability distribution to functions» a probability distribution describes a finite-dimensional random variable» a stochastic process describes a distribution over functions o a particular stochastic process» GPs are a particularly tractable stochastic process o a Bayesian version of an SVM o an infinitely large neural network o particularly effective method for placing prior distributions over the space of functions or over the space of model parameters» space of model parameters → weight-space view» space of functions → function-space view 64
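
What "a distribution over functions" means becomes concrete by drawing samples: under a GP, any finite set of function values is jointly Gaussian. A minimal NumPy sketch with a squared-exponential covariance (all names illustrative):

```python
import numpy as np

def sq_exp_kernel(x1, x2, ell=1.0):
    """Squared-exponential covariance between two sets of scalar inputs."""
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

x = np.linspace(-5.0, 5.0, 100)
K = sq_exp_kernel(x, x) + 1e-8 * np.eye(len(x))   # jitter for numerical stability
# Three sample functions drawn from the zero-mean GP prior:
samples = np.random.default_rng(1).multivariate_normal(np.zeros(len(x)), K, size=3)
```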

60 + Gaussian Processes What is a Gaussian Process? 65

61 + Gaussian Processes (GP) Weight-Space View Bayesian Linear Regression 66

62 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o standard linear model f(x) = x^T w, y = f(x) + ε, ε ~ N(0, σ_ε²)» additive, i.i.d. Gaussian noise 67

63 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o standard linear model f(x) = x^T w, y = f(x) + ε, ε ~ N(0, σ_ε²)» non-vector notation f(x) = w_0 + w_1 x_1 +... + w_D x_D 68

64 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o standard linear model f(x) = x^T w, y = f(x) + ε, ε ~ N(0, σ_ε²)» non-vector notation (D = 1) f(x) = w_0 + w_1 x_1 69

65 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o standard linear model f(x) = x^T w, y = f(x) + ε, ε ~ N(0, σ_ε²)» non-vector notation (D = 1) f(x) = w_0 + w_1 x_1 70

66 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o standard linear model f(x) = x^T w, y = f(x) + ε, ε ~ N(0, σ_ε²)» non-vector notation (D = 1) f(x) = w_0 + w_1 x_1 71

67 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o likelihood of observations (normally distributed error) p(y | X, w) = N(X^T w, σ_ε² I)» given weights w and training data y» X: design matrix = collection of D-dimensional input vectors (as column vectors) in the training set 72

68 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o introduce Gaussian prior over weights w ~ N(0, Σ_p) 73

69 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o introduce Gaussian prior over weights w ~ N(0, Σ_p)» example (D = 1) f(x) = w_0 + w_1 x_1 74

70 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o introduce Gaussian prior over weights w ~ N(0, Σ_p)» example (D = 1) f(x) = w_0 + w_1 x_1, p(w) = N(0, I) [plot of the prior over (w_0, w_1)] 75

71 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o compute posterior over weights using Bayes rule 76

72 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o compute posterior over weights using Bayes rule» likelihood of training data p(y | X, w) = N(X^T w, σ_ε² I) 79

73 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o compute posterior over weights using Bayes rule p(w | X, y) ∝ exp(-1/(2σ_ε²) (y - X^T w)^T (y - X^T w)) · exp(-½ w^T Σ_p⁻¹ w) 81

74 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o compute posterior over weights using Bayes rule p(w | X, y) ∝ p(y | X, w) p(w), p(w | X, y) ∝ exp(-1/(2σ_ε²) (y - X^T w)^T (y - X^T w)) · exp(-½ w^T Σ_p⁻¹ w)... p(w | X, y) = N(σ_ε⁻² A⁻¹ X y, A⁻¹), A = σ_ε⁻² X X^T + Σ_p⁻¹» posterior over weights is again Gaussian 82
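
The posterior formula translates directly into a few lines of NumPy. A sketch following the slide's convention of X as the D × n design matrix with inputs as columns; the names and example data are illustrative:

```python
import numpy as np

def weight_posterior(X, y, Sigma_p, sigma_eps):
    """p(w | X, y) = N(sigma_eps**-2 * A**-1 X y, A**-1),
    with A = sigma_eps**-2 * X X^T + Sigma_p**-1."""
    A = X @ X.T / sigma_eps**2 + np.linalg.inv(Sigma_p)
    A_inv = np.linalg.inv(A)
    return A_inv @ X @ y / sigma_eps**2, A_inv

# Example: 3 noisy observations of f(x) = w0 + w1*x, as in the slides.
x1 = np.array([-1.0, 0.5, 2.0])
X = np.vstack([np.ones_like(x1), x1])     # bias feature and x, inputs as columns
y = -0.3 + 0.7 * x1 + 0.1 * np.random.default_rng(2).normal(size=3)
w_mean, w_cov = weight_posterior(X, y, Sigma_p=np.eye(2), sigma_eps=0.1)
```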

75 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o Gaussian prior over weights w ~ N(0, Σ_p)» example (D = 1) f(x) = w_0 + w_1 x_1, p(w) = N(0, I) [plot of the prior over (w_0, w_1)] 83

76 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o compute posterior over weights using Bayes rule» example: we get 3 observations 84

77 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o compute posterior over weights using Bayes rule» example: we get 3 observations» calculate observation likelihood assuming ε ~ N(0, σ_ε²) = N(0, 1): p(y | X, w) = N(X^T w, σ_ε² I) [plot over (w_0, w_1)] 85

78 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o compute posterior over weights using Bayes rule» example: we get 3 observations» calculate observation likelihood assuming» compute posterior over weights p(w | X, y) ~ N(σ_ε⁻² A⁻¹ X y, A⁻¹), A = σ_ε⁻² X X^T + Σ_p⁻¹ [plot over (w_0, w_1)] 86

79 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o compute posterior over weights using Bayes rule» example: we get 3 observations» calculate observation likelihood assuming» compute posterior over weights [three plots over (w_0, w_1): Prior, Likelihood, Posterior] 87

80 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o compute posterior over weights using Bayes rule» example: we get 3 observations» calculate observation likelihood assuming» compute posterior over weights [three plots over (w_0, w_1): Prior, Likelihood, Posterior; intercept reduced (≈ 0)] 88

81 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o compute posterior over weights using Bayes rule» example: we get 3 observations» calculate observation likelihood assuming» compute posterior over weights [three plots over (w_0, w_1): Prior, Likelihood, Posterior; slope left unchanged] 89

82 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o predictions for a query point» predictive distribution = average predictions f* at x* over all parameter values (weights), weighted by their posterior: p(f* | x*, X, y) = ∫ p(f* | x*, w) p(w | X, y) dw 90

83 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o predictions for a query point» = average predictions f* at x* over all parameter values (weights), weighted by their posterior» predictive distribution is again Gaussian f* | x*, X, y ~ N(σ_ε⁻² x*^T A⁻¹ X y, x*^T A⁻¹ x*), A = σ_ε⁻² X X^T + Σ_p⁻¹ 91
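
The same quantities give the predictive distribution at a query point x*. A sketch reusing the conventions of the posterior snippet above (names illustrative):

```python
import numpy as np

def predictive_distribution(x_star, X, y, Sigma_p, sigma_eps):
    """f* | x*, X, y ~ N(sigma_eps**-2 * x*^T A**-1 X y, x*^T A**-1 x*),
    with A = sigma_eps**-2 * X X^T + Sigma_p**-1."""
    A = X @ X.T / sigma_eps**2 + np.linalg.inv(Sigma_p)
    A_inv = np.linalg.inv(A)
    mean = x_star @ A_inv @ X @ y / sigma_eps**2
    var = x_star @ A_inv @ x_star
    return mean, var
```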

84 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o predictions for a query point» = average predictions f* at x* over all parameter values (weights), weighted by their posterior» posterior distribution over weights → predictive distribution over functions [plots: weight posterior over (w_0, w_1) and predictive distribution] 92

85 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o problem: linear model has limited expressiveness f(x) = x^T w, y = f(x) + ε, ε ~ N(0, σ_ε²)» non-vector notation (D = 1) f(x) = w_0 + w_1 x_1 93

86 + Gaussian Processes (GP) Weight-Space View Bayesian Non-Linear Regression 94

87 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o problem: linear model has limited expressiveness o apply Kernel trick 95

88 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o apply Kernel trick» project inputs into some high dimensional space using a set of basis functions and apply the linear model in the high dimensional space x ↦ φ(x), φ: R^D → R^N 96

89 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o apply Kernel trick» project inputs into some high dimensional space using a set of basis functions and apply the linear model in the high dimensional space x ↦ φ(x), φ: R^D → R^N! the model is still linear in w f(x) = x^T w → f(x) = φ(x)^T w 97

90 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o apply Kernel trick» project inputs into some high dimensional space using a set of basis functions and apply the linear model in the high dimensional space x ↦ φ(x), φ: R^D → R^N! the model is still linear in w f(x) = x^T w → f(x) = φ(x)^T w! N can be infinite φ(x) = (1, x, x²,...) 98
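
A simple feature map of this kind, truncated to a finite degree; the weight-space machinery above applies unchanged to φ(x) (the helper name is hypothetical):

```python
import numpy as np

def poly_features(x, degree=3):
    """phi(x) = (1, x, x**2, ..., x**degree), stacked as columns so the
    result can replace the design matrix X in the linear model."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    return np.vstack([x**d for d in range(degree + 1)])

Phi = poly_features(np.array([-1.0, 0.5, 2.0]))   # shape (degree+1, n)
```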

91 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o apply Kernel trick» project inputs into some high dimensional space using a set of basis functions and apply the linear model in the high dimensional space f(x) = x^T w → f(x) = φ(x)^T w o apply linear model in feature space (to make predictions) 99

92 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o apply Kernel trick» project inputs into some high dimensional space using a set of basis functions and apply the linear model in the high dimensional space f(x) = x^T w → f(x) = φ(x)^T w o apply linear model in feature space (to make predictions) o... predictive mean in feature space o... predictive covariance in feature space 105

93 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o predictive mean and covariance in feature space 106

94 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o predictive mean and covariance in feature space o identify dot products of the form φ(x_1)^T Σ_p φ(x_2) 107

95 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o predictive mean and covariance in feature space o identify dot products of the form φ(x_1)^T Σ_p φ(x_2)» since Σ_p is positive definite, we can find a Cholesky decomposition Σ_p = L^T L 108

96 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o predictive mean and covariance in feature space o identify dot products of the form φ(x_1)^T Σ_p φ(x_2)» since Σ_p is positive definite, we can find a Cholesky decomposition Σ_p = L^T L» and define a kernel (or covariance function) as k(x_1, x_2) = φ(x_1)^T Σ_p φ(x_2) 109

97 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o predictive mean and covariance in feature space o the feature space φ appears only within dot products of the form φ(x_1)^T Σ_p φ(x_2) o and therefore the feature space φ can be removed using a kernel 110

98 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior o predictive mean and covariance in feature space: μ = k(x*, X) [k(X, X) + σ_ε² I]⁻¹ y, Σ = k(x*, x*) - k(x*, X) [k(X, X) + σ_ε² I]⁻¹ k(X, x*), p(f* | x*, X, y) = N(μ, Σ) 111
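
In kernelized form the predictions need only the covariance function. A self-contained NumPy sketch of the three equations above (the plain matrix inverse is used for clarity; a Cholesky solve would be the numerically preferred route; all names and data are illustrative):

```python
import numpy as np

def sq_exp(a, b, ell=1.0):
    """Squared-exponential covariance between sets of scalar inputs."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def gp_predict(x_star, X, y, kernel, sigma_eps):
    """mu    = k(x*, X) [k(X, X) + sigma_eps**2 I]^-1 y
       Sigma = k(x*, x*) - k(x*, X) [k(X, X) + sigma_eps**2 I]^-1 k(X, x*)"""
    K_inv = np.linalg.inv(kernel(X, X) + sigma_eps**2 * np.eye(len(X)))
    K_s = kernel(x_star, X)
    mu = K_s @ K_inv @ y
    Sigma = kernel(x_star, x_star) - K_s @ K_inv @ K_s.T
    return mu, Sigma

X = np.array([-2.0, 0.0, 1.5]); y = np.sin(X)
mu, Sigma = gp_predict(np.linspace(-3, 3, 50), X, y, sq_exp, sigma_eps=0.1)
```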

99 + Gaussian Processes, Weight-Space View Bayesian Linear Regression with a Gaussian Prior, Summary o define standard linear model (step 1) o likelihood of observations in standard linear model (step 2) o introduce Gaussian prior over weights (step 3) o compute weight posterior using Bayes rule (step 4) o integrate predictions over weights to get predictive distribution (step 5) o apply Kernel trick (step 6) 113
