Local Probabilistic Models: Continuous Variable CPDs


Local Probabilistic Models: Continuous Variable CPDs
Sargur Srihari, srihari@cedar.buffalo.edu

Topics
1. Simple discretization loses continuity
2. Continuous variable CPDs
3. Linear Gaussian model: example of car movement
4. Case study: robot motion and sensors
5. Hybrid models
6. CLG networks
7. Thermostat example

Continuous Variables
Some variables are best modeled as continuous, e.g., position, velocity, temperature, pressure. We cannot use a table representation for them. Could we circumvent the issue by discretizing all continuous variables? Discretization will not do: an accurate model needs a fine discretization. Applying a PGM to a robot navigation task with 15 cm granularity in the x and y coordinates gives each coordinate on the order of a thousand values, and more than one million discretized values for the robot's position. CPDs of this magnitude are outside the range of most systems (see the sketch below).
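
As a rough illustration of the blow-up, here is a minimal sketch of the count of discretized states, assuming a hypothetical 150 m x 150 m workspace together with the 15 cm granularity mentioned above (the workspace size is an assumption, not from the slides):

```python
# Rough count of discrete states for a robot-position variable, assuming a
# hypothetical 150 m x 150 m workspace (not from the slides), discretized
# at 15 cm granularity in x and y.

workspace_m = 150.0      # assumed side length of the workspace in meters
cell_m = 0.15            # 15 cm grid cells

cells_per_axis = int(workspace_m / cell_m)    # 1000 values per coordinate
position_states = cells_per_axis ** 2         # joint (x, y) states

print(cells_per_axis)    # 1000
print(position_states)   # 1000000 discretized values for the position
```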

Discretization Loses Continuity
When we discretize a variable we lose much of the structure that characterizes it. The million values of the robot's position cannot be assigned arbitrary probabilities: basic continuity assumptions imply relationships between the probabilities of nearby discretized values. Such constraints are hard to capture in a discrete distribution, where there is no notion that two values are close to each other.

CPD as a Distribution
Nothing in the formulation of a Bayesian network requires restricting attention to discrete variables. The only requirement is that the CPD P(X | Pa_X) represent, for every assignment of values to Pa_X, a distribution over X. Here X is continuous, and some of X's parents may also be continuous.

Purely Continuous Case
There are many possible models one could use. The most commonly used parametric form for continuous density functions is the Gaussian; we now see how it is used in the context of a Bayesian network.

Both Parent and Child are Continuous
How do we represent the dependency of a continuous variable Y on a continuous parent X? A simple solution is to model Y as a Gaussian whose parameters depend on X. The common choice is to let the mean of Y be a linear function of X, for example P(Y | x) ~ N(-2x + 0.9; 1). This dependence is called a linear Gaussian model, and it extends to multiple continuous parents. (Network: X is the parent of Y.)
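
A minimal sketch of this particular CPD, sampling Y for a given value of x (the function name is illustrative; numpy's scale parameter is a standard deviation, so we pass the square root of the variance):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_y_given_x(x, n=1):
    """Sample from P(Y | x) = N(-2x + 0.9; variance 1), as in the example."""
    mean = -2.0 * x + 0.9
    return rng.normal(loc=mean, scale=np.sqrt(1.0), size=n)

print(sample_y_given_x(0.5, n=3))   # three draws centered near -0.1
```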

Linear Gaussian Example: Car Motion
A car moves along a straight line. Its position at the t-th second is X_t (say at meter #510) and its velocity is V_t (meters/sec). Then X_{t+1} = X_t + V_t. If V_t = 15 meters/sec, then X_{t+1} = #525. If there is stochasticity in the motion, X_{t+1} ~ N(525; 5), a Gaussian with variance 5 (standard deviation about 2.2 meters). (Network: X_t and V_t are parents of X_{t+1}.)
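
A minimal simulation of this motion model over a few time steps, using the variance of 5 from the slide (keeping the velocity fixed at 15 m/s is an assumption carried over from the example):

```python
import numpy as np

rng = np.random.default_rng(1)

x = 510.0     # position at time t, in meters (from the example)
v = 15.0      # velocity in meters/sec (from the example)
var = 5.0     # motion noise variance from the slide

for t in range(3):
    # Linear Gaussian transition: X_{t+1} ~ N(X_t + V_t; var)
    x = rng.normal(loc=x + v, scale=np.sqrt(var))
    print(f"t={t + 1}: x = {x:.1f} m")
```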

Definition of the Linear Gaussian Model
Let Y be a continuous variable with continuous parents X_1, ..., X_k. We say that Y has a linear Gaussian model with parameters β_0, ..., β_k and σ² if
P(Y | x_1, ..., x_k) ~ N(β_0 + β_1 x_1 + ... + β_k x_k ; σ²),
or, in vector notation, P(Y | x) ~ N(β_0 + β^T x ; σ²).
Equivalently, Y is a linear function of the variables X_1, ..., X_k with added Gaussian noise of mean 0 and variance σ²:
Y = β_0 + β_1 X_1 + ... + β_k X_k + ε, where ε is a Gaussian random variable with mean 0 and variance σ².
(Network: X_1, X_2, ..., X_k are parents of Y.)
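
A minimal sketch of a generic linear Gaussian CPD as defined above (the class and method names are illustrative, not from the slides):

```python
import numpy as np

class LinearGaussianCPD:
    """P(Y | x) ~ N(beta0 + beta^T x ; sigma^2), as defined on the slide."""

    def __init__(self, beta0, betas, sigma2):
        self.beta0 = float(beta0)
        self.betas = np.asarray(betas, dtype=float)   # one coefficient per parent
        self.sigma2 = float(sigma2)

    def mean(self, x):
        return self.beta0 + self.betas @ np.asarray(x, dtype=float)

    def sample(self, x, rng):
        # Y = beta0 + beta^T x + eps, with eps ~ N(0, sigma^2)
        return self.mean(x) + rng.normal(0.0, np.sqrt(self.sigma2))

# Example: the car-motion CPD X_{t+1} ~ N(1*X_t + 1*V_t ; 5)
cpd = LinearGaussianCPD(beta0=0.0, betas=[1.0, 1.0], sigma2=5.0)
rng = np.random.default_rng(2)
print(cpd.sample([510.0, 15.0], rng))   # one draw of the next position
```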

Properties of Linear Gaussian
[Figure: a network in which Y has several continuous parents X_1, X_2, ...] A key property: if every CPD in a network over continuous variables is a linear Gaussian, the network defines a joint multivariate Gaussian distribution over all the variables.

Case Study: Robot Motion & Sensors
Robot localization: the robot must keep track of its location as it moves in an environment, obtaining sensor readings as it moves. There are two local CPDs, both continuous:
1. Robot dynamics model P(L' | L, A): the distribution of the position L' at the next time step given the current position L and the action taken A.
2. Robot sensor model P(D | L): the distribution over the observed sensor reading D, the distance between the robot and the nearest obstacle, at the current time given the current location L.
(Network: L and A are parents of L'; L is the parent of D.)

Robot Dynamics Model: P(L' | L, A)
Specifies the distribution of the position L' at the next time step given the current position L and the action taken A. The robot location L is given by (X, Y, θ): the X, Y coordinates and the angular orientation θ. The action A specifies a distance to travel and a rotation from the current θ. P(L' | L, A) is a product of two independent Gaussians, P(θ' | θ, A) and P(X', Y' | X, Y, A), as in the sketch below.
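
A minimal sketch of sampling from such a dynamics model, keeping the two factors independent given (L, A); the noise standard deviations and the turn-then-move convention are assumptions for illustration, not values from the slides:

```python
import numpy as np

def sample_next_pose(x, y, theta, travel, rotation, rng,
                     sigma_rot=0.05, sigma_trans=0.10):
    """Sample L' = (x', y', theta') from P(L' | L, A) with A = (travel, rotation).

    The two factors P(theta' | theta, A) and P(X', Y' | X, Y, A) are sampled
    independently; the noise standard deviations are assumed values.
    """
    theta_cmd = theta + rotation                   # commanded heading, fixed by L and A
    theta_new = rng.normal(theta_cmd, sigma_rot)   # P(theta' | theta, A)
    d = rng.normal(travel, sigma_trans)            # noisy traveled distance
    x_new = x + d * np.cos(theta_cmd)              # P(X', Y' | X, Y, A)
    y_new = y + d * np.sin(theta_cmd)
    return x_new, y_new, theta_new

rng = np.random.default_rng(3)
print(sample_next_pose(0.0, 0.0, 0.0, travel=1.0, rotation=np.pi / 4, rng=rng))
```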

Example: Robot Dynamics Model
[Figure: distributions of the next location L' for two different actions A, starting from the current location L. Both are banana-shaped distributions.]

Robot Sensor Model: P(D | L)
The robot obtains sensor readings that depend on its location; D is the distance between the robot and the nearest obstacle. Typical sensors:
- Contact sensors: bumpers
- Internal sensors: accelerometers (spring-mounted masses), gyroscopes (spinning mass, laser light), compasses and inclinometers (earth's magnetic field, gravity)
- Proximity sensors: sonar (time of flight), radar (phase and frequency), laser range-finders (triangulation, time of flight, phase), infrared (intensity)
- Visual sensors: cameras
- Satellite-based sensors: GPS

Distance Sensor Model (given a map m)
D is the set of distances between the robot and the obstacles along K sensor directions (laser, sonar): D = {d_1, d_2, ..., d_K}. The individual distances are independent given the robot location and the map. For each beam, P_m(d_i | L) = N(o_L ; σ²), where m is the map, L is the location, and o_L is the obstacle distance predicted by the map from location L.
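
A minimal sketch of the per-beam likelihood under this model, assuming the map supplies an expected distance o_L for each beam; the variable names and the σ value are illustrative assumptions:

```python
import numpy as np
from scipy.stats import norm

def beam_likelihood(d, o_L, sigma=0.1):
    """P_m(d_i | L): Gaussian around the map-predicted distance o_L.
    sigma is an assumed value."""
    return norm.pdf(d, loc=o_L, scale=sigma)

def sensor_likelihood(readings, expected, sigma=0.1):
    """P_m(D | L): beams are independent given the location and map,
    so the joint likelihood is the product of per-beam terms."""
    return np.prod([beam_likelihood(d, o, sigma)
                    for d, o in zip(readings, expected)])

# Example with assumed numbers: readings close to the map predictions
print(sensor_likelihood(readings=[2.31, 1.02], expected=[2.30, 1.00]))
```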

Two Models: Dynamics and Sensors
[Figure: four panels. (1) Robot dynamics model P(L' | L, A). (2) Sensor model for a map obstacle, P_m(d_i | L), a Gaussian centered at the obstacle location o_L = 230 cm. (3) Sensor model for an unknown obstacle, P_u(d_i | L), an exponential distribution giving higher probability to closer distances D. (4) Combined sensor model P(d_i | L), combining P_m and P_u.]
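
A minimal sketch of one way to combine the two per-beam models, as a weighted mixture of the map-based Gaussian and an exponential term for unknown obstacles; the mixture weight, rate, and σ are assumptions, not values from the slides:

```python
import numpy as np
from scipy.stats import norm, expon

def combined_beam_likelihood(d, o_L, sigma=0.1, rate=0.5, w_map=0.8):
    """P(d_i | L) as a weighted mixture of the map-obstacle model P_m
    (Gaussian around o_L) and the unknown-obstacle model P_u
    (exponential, favoring shorter distances). Weights are assumed."""
    p_m = norm.pdf(d, loc=o_L, scale=sigma)     # P_m(d_i | L)
    p_u = expon.pdf(d, scale=1.0 / rate)        # P_u(d_i | L)
    return w_map * p_m + (1.0 - w_map) * p_u

print(combined_beam_likelihood(d=1.5, o_L=2.3))  # reading shorter than the map predicts
```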

Probabilistic Robotics
Odometry uses internal sensors (compass, gyroscope) and proximity sensors (sonar, radar). Odometry information is inherently noisy; modeling motion under this noise is called probabilistic kinematics.

Hybrid Models
Hybrid models incorporate both discrete and continuous variables. Case 1: continuous child X. If we ignore the discrete parents, the CPD of X can be represented as a linear Gaussian of its continuous parents. The simplest way to make a continuous variable X depend on a discrete variable U is to define a different set of linear Gaussian parameters for every value of the discrete parent.

Conditional Linear Gaussian (CLG)
CLG CPD: X is continuous, U = {U_1, ..., U_m} are its discrete parents, and Y = {Y_1, ..., Y_k} are its continuous parents. For every assignment u to the discrete parents we have k+1 coefficients a_{u,0}, ..., a_{u,k} and a variance σ_u² such that
P(X | u, y) = N(a_{u,0} + Σ_{i=1..k} a_{u,i} y_i ; σ_u²).
CLG network: every discrete variable has only discrete parents and every continuous variable has a CLG CPD; a continuous variable cannot have discrete children. The distribution over the continuous variables is a mixture, i.e., a weighted average of Gaussians.
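
A minimal sketch of a CLG CPD, indexing a separate linear Gaussian by the value of the discrete parent (the class name and parameter values are illustrative, not from the slides):

```python
import numpy as np

class CLG_CPD:
    """P(X | u, y) = N(a_{u,0} + sum_i a_{u,i} y_i ; sigma_u^2):
    one linear Gaussian per assignment u of the discrete parents."""

    def __init__(self, params):
        # params: dict mapping u -> (a0, [a1, ..., ak], sigma2)
        self.params = params

    def sample(self, u, y, rng):
        a0, a, sigma2 = self.params[u]
        mean = a0 + np.dot(a, y)
        return rng.normal(mean, np.sqrt(sigma2))

# Example with assumed numbers: the discrete parent u in {0, 1} switches regimes
cpd = CLG_CPD({0: (1.0, [0.5], 0.2), 1: (-3.0, [2.0], 1.5)})
rng = np.random.default_rng(4)
print(cpd.sample(u=0, y=[2.0], rng=rng), cpd.sample(u=1, y=[2.0], rng=rng))
```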

Discrete Child with a Continuous Parent
Here Y is continuous and X is binary, with Val(X) = {x^0, x^1} and P(x^0 | y) + P(x^1 | y) = 1. For example, Y is the temperature in Fahrenheit and X is the thermostat turning the heater on. A threshold model:
P(x^1 | y) = 0.9 if y ≤ 65, and 0.05 otherwise; P(x^0 | y) = 1 - P(x^1 | y).
The threshold model has an abrupt change in probability with Y, jumping from 0.9 to 0.05 at the threshold. We can use the logistic model to fix this: P(x^1 | Y) = sigmoid(w_0 + w_1 Y). (Network: Y is the parent of X.)
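
A minimal sketch comparing the two CPDs; the logistic weights below are assumed values chosen so the probability of the heater being on decreases smoothly around 65 °F, not numbers from the slides:

```python
import numpy as np

def p_heater_on_threshold(y):
    """Threshold CPD: P(x^1 | y) jumps abruptly at 65 degrees F."""
    return 0.9 if y <= 65 else 0.05

def p_heater_on_logistic(y, w0=32.5, w1=-0.5):
    """Logistic CPD: P(x^1 | y) = sigmoid(w0 + w1 * y).
    w0, w1 are assumed values giving a smooth transition near 65 F."""
    return 1.0 / (1.0 + np.exp(-(w0 + w1 * y)))

for temp in (55, 64, 66, 75):
    print(temp, p_heater_on_threshold(temp), round(p_heater_on_logistic(temp), 3))
```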

Generalized Linear Model for a Thermostat
Now the sensor C has three values: low, medium, high. X is a continuous parent and C is a three-valued discrete child, with CPD P(C | X) given by a generalized linear model (e.g., a softmax over linear functions of X). [Figure: the three curves P(C | X) as functions of the continuous X.]
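
As an illustration of one such generalized linear CPD, here is a minimal softmax (multinomial logistic) sketch for the three-valued sensor; the (bias, slope) weights are assumed values, not from the slides:

```python
import numpy as np

def p_sensor_given_x(x, w=((6.0, -0.12), (0.0, 0.0), (-10.0, 0.12))):
    """P(C | x) for C in {low, medium, high} via a softmax over linear
    functions of x. The (bias, slope) pairs are assumed values chosen so
    'low' dominates at small x, 'high' at large x."""
    scores = np.array([b + s * x for b, s in w])
    exp = np.exp(scores - scores.max())      # numerically stabilized softmax
    return exp / exp.sum()

for temp in (40.0, 70.0, 100.0):
    low, med, high = p_sensor_given_x(temp)
    print(temp, round(low, 2), round(med, 2), round(high, 2))
```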