ONR Mine Warfare Autonomy Virtual Program Review September 7, 2017


1 Information-driven Guidance and Control for Adaptive Target Detection and Classification. Silvia Ferrari, Pingping Zhu, and Bo Fu, Mechanical and Aerospace Engineering, Cornell University. ONR Mine Warfare Autonomy Virtual Program Review, September 7, 2017

2 Outline: Introduction; MCM Motivation; Directional Information Gain; UUV-Sonar Imaging Frames of Reference; UUV-Sonar Feature Extraction (CNN-SVM ATR); UUV-Sonar Bayesian Modeling; UUV-Sonar Information Value Learning; Conclusions and Q&A

3 Vehicle Path Planning and Control. Traditional paradigm: proprioceptive and exteroceptive sensor outputs are used as feedback to the vehicle in support of vehicle navigation objectives. [Block diagram: Guidance → Control → Vehicle → Output, with the output fed back to guidance.]

4 Sensor Navigation and Control. New paradigm: the vehicle is used to gather information (sensor output) in support of sensing objectives, such as target acquisition or detection, classification, localization, and tracking (DCLT). [Block diagram: Guidance → Control → Sensor → Output, with the output fed back to guidance.] Research challenges: representing sensor objectives in closed form (computational geometry, information theory); environmental and target feedback under significant uncertainties (Bayesian updates); information-driven guidance and control (couplings between sensor measurements and vehicle dynamics).

5 MCM Motivation and Application

6 Classification-driven Path Planning

7 Influence of UUV Position and Orientation. Forward-looking sonar; side-scan sonar. The sensor FOV, denoted by $S \subset \mathbb{R}^3$, is defined as a compact subset of the workspace W in which the robot can obtain sonar measurements. Motivation: sonar is directional, and the information gain depends on the sonar FOV geometry, position, and orientation relative to the target.

8 Surveillance Region and Oceanic Currents. LISC; real CODAR-measured current field (100 naut mi, NJ coast) [COOL, Rutgers University]

9 Directional Information Gain

10 Visibility Problem. For a polygonal obstacle B in a workspace W, consider a sensor on A observing a target T. Determine the obstacle-generated cone K. Then, for a sensor FOV S, we seek the shadow region D such that the visibility region is $R_S$. Solution: construct a polygon with the sensor at point P, $S = \{P, V_1, \ldots, V_N\}$, where $V_1, \ldots, V_N$ are the obstacle vertices, and compute its convex hull: $CS = \mathrm{conv}(S) = \{\sum_i k_i x_i : k_i \ge 0,\ \sum_i k_i = 1\}$. Visibility region for one target: $R = CS \setminus B$ and $R_S = (S \setminus K) \cup ((S \cap K) \setminus (D \cup B))$.
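To make the construction concrete, here is a minimal sketch of the convex-hull step and the one-target visibility region $R = CS \setminus B$, clipped to the FOV. The use of shapely, the helper name, and the toy geometry are illustrative assumptions; the full cone/shadow decomposition of $R_S$ from the slide is not reproduced here.

```python
# Sketch only: shapely is an assumed dependency, not from the slides.
from shapely.geometry import Polygon, Point, MultiPoint

def visibility_region(sensor_p, fov, obstacle):
    """Visible part of the FOV for one obstacle: FOV ∩ (conv({P} ∪ V) \ B).

    sensor_p : (x, y) sensor position P
    fov      : shapely Polygon for the sensor FOV, S
    obstacle : shapely Polygon for the obstacle, B
    """
    # S = {P, V1, ..., VN}: sensor point plus obstacle vertices
    pts = [sensor_p] + list(obstacle.exterior.coords)
    # CS = conv(S): convex hull of the point set
    cs = MultiPoint(pts).convex_hull
    # R = CS \ B, then clip to the FOV
    return fov.intersection(cs.difference(obstacle))

fov = Point(0.0, 0.0).buffer(10.0)                  # toy circular FOV
box = Polygon([(2, -1), (4, -1), (4, 1), (2, 1)])   # polygonal obstacle B
print(visibility_region((0.0, 0.0), fov, box).area)
```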

11 Visibility Region in Closed Form. Obstacle cone in closed form: construct unit vectors $k_i$, $k_j$; find the boundary unit vectors $k_1 = \overrightarrow{PV_p}/\|\overrightarrow{PV_p}\|$ and $k_2 = \overrightarrow{PV_q}/\|\overrightarrow{PV_q}\|$; compute the obstacle cone $K = \mathrm{cone}(\hat{k}_1, \hat{k}_2) = \{x : x = r + c_1\hat{k}_1 + c_2\hat{k}_2,\ c_1, c_2 \ge 0\}$ and the shadow region $D = (S \cap K) \setminus CS$. Visibility region for multiple targets (i) and obstacles (j): $R^{(i)} = CS^{(i)} \setminus B^{(i)}$; $D^{(i)} = (S \cap K^{(i)}) \setminus CS^{(i)}$; $R_S = \bigcup_i R_S^{(i)} = \bigcup_i \{(S \setminus K^{(i)}) \cup ((S \cap K^{(i)}) \setminus (D^{(i)} \cup B^{(i)}))\}$.
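A hedged sketch of the boundary-unit-vector step: given the sensor point P and the obstacle vertices, pick the two vertex directions with extreme bearing. The function name and the assumption that the cone subtends less than $\pi$ are illustrative, not from the slides.

```python
import numpy as np

def obstacle_cone_vectors(p, vertices):
    """Boundary unit vectors k1, k2 of the obstacle cone K seen from P.

    Assumes the cone is narrower than pi, so extreme bearings about the
    mean direction identify the boundary vertices V_p and V_q.
    """
    p = np.asarray(p, dtype=float)
    dirs = np.asarray(vertices, dtype=float) - p
    units = dirs / np.linalg.norm(dirs, axis=1, keepdims=True)  # PV_i / |PV_i|
    mean = units.mean(axis=0)
    mean /= np.linalg.norm(mean)
    # signed bearing of each unit vector relative to the mean direction
    ang = np.arctan2(mean[0] * units[:, 1] - mean[1] * units[:, 0],
                     units @ mean)
    return units[np.argmin(ang)], units[np.argmax(ang)]  # k1, k2

k1, k2 = obstacle_cone_vectors((0, 0), [(2, -1), (4, -1), (4, 1), (2, 1)])
```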

12 Example: Art Gallery Problem. [Plots, axes in distance (meters): sensor at a single location vs. sensor at multiple locations, showing obstacle B, target T, shadow region D, obstacle cone K, and visibility region $R_S$.]

13 UUV-Sonar Directional Information Gain

14 UUV Kinematics, Frames of Reference. [Diagram: world frame $F_W$ with origin $O_W$, x axis East, y axis North; vehicle frame $F_A$ with origin $O_A$ at heading $\theta_{uuv}$.] Assume constant speed v. UUV states: $\mathbf{x}(t) = [x_{uuv}(t), y_{uuv}(t), \theta_{uuv}(t)]^T$. UUV dynamics: $\dot{x}_{uuv} = v\cos(\theta_{uuv})$, $\dot{y}_{uuv} = v\sin(\theta_{uuv})$.
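As a sanity check, these constant-speed unicycle kinematics can be integrated with a simple Euler step. The heading-rate input omega and all numeric values are illustrative assumptions; the slide states only the kinematics, not a control channel.

```python
import numpy as np

def step(x, omega, v=1.5, dt=0.1):
    """One Euler step of the UUV kinematics from the slide.

    x = [x_uuv, y_uuv, theta_uuv]; v is the (constant) speed and
    omega a hypothetical heading-rate input.
    """
    x_uuv, y_uuv, th = x
    return np.array([x_uuv + v * np.cos(th) * dt,
                     y_uuv + v * np.sin(th) * dt,
                     th + omega * dt])

x = np.zeros(3)
for _ in range(100):      # straight run at constant heading
    x = step(x, omega=0.0)
print(x)                  # ~15 m traveled East
```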

15 UUV Frames Relative to Sonar Image. [Plot, axes in distance (meters): UUV at $\mathbf{x}(t_1)$ and at $\mathbf{x}(t_2)$, $t_2 > t_1$, with heading direction; target T at range r; vehicle frame $F_A = F_A(\mathbf{x}(t))$ with origin $O_A$; sonar FOV = FOV($\mathbf{x}(t)$).]

16 Vehicle Frame and Image Frame. Consider the image segmentation as the target geometry T. Then the actual target geometry, orientation, and shape are viewed as hidden random features. [Diagram, axes in distance (meters): vehicle frame $F_A$ with origin $O_A$, UUV heading $\theta_{uuv}$ relative to North; image frame $F_T$ with origin $O_T$; $T = T(\mathbf{x})$ is the target image region.]

17 Image Frame and Target Frame. [Diagram in world frame $F_W$ (origin $O_W$, x East, y North): image frame $F_T$ at orientation $\theta_{uuv}$ and target frame $F_{T'}$ at orientation $\theta_T$, each with origin $O_T$ and axes $x_T$, $y_T$.] Aspect angle: $\theta = \theta_{uuv} - \theta_T$.

18 Image and Target Distance Vectors. [Diagram in world frame $F_W$ (origin $O_W$, x East, y North): vehicle frame $F_A$ (origin $O_A$), image frame $F_T$ (origin $O_T$), and target frame $F_{T'}$, with image distance vector $r_i$ and offset $\Delta r$.] True distance: $r = r_i + \Delta r = [(x_T - x), (y_T - y)]^T$.

19 UUV-Sonar Feature Extraction

20 Automatic Target Recognition and Detection. Training phase: train the SVM classifier with sonar images $I_s$ and true classifications $Y_T$. Testing phase: matched filter for image segmentation and CNN+SVM for target classification.

21 CNN for Sonar Image Feature Extraction. [Architecture: input image → convolution + ReLU → pooling → convolution + ReLU → pooling → fully-connected (FC) layer; $I_s \mapsto z \in \mathbb{R}^{4096}$.] Automatic segmentation is performed via a matched filter, which provides better performance than Markov random fields (MRFs). Deep-learning features are extracted from the segmentation by a pre-trained AlexNet; the AlexNet CNN provides better performance than other feature-extraction techniques such as HOG and LBP.
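A minimal sketch of this CNN-SVM pipeline, assuming PyTorch/torchvision for the pre-trained AlexNet and scikit-learn for the SVM. The matched-filter segmentation and the dataset handling are omitted, and the SVM kernel choice is an assumption, not the authors' configuration.

```python
import torch
import torchvision.models as models
import torchvision.transforms as T
from sklearn.svm import SVC

# Pre-trained AlexNet as a fixed feature extractor: dropping the last
# FC layer leaves a 4096-d output, matching z in R^4096 on the slide.
alexnet = models.alexnet(weights=models.AlexNet_Weights.DEFAULT)
alexnet.classifier = alexnet.classifier[:-1]
alexnet.eval()

preprocess = T.Compose([
    T.Grayscale(num_output_channels=3),  # sonar chips are single-channel
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def features(pil_image):
    """z in R^4096 for one segmented sonar chip (PIL image)."""
    with torch.no_grad():
        return alexnet(preprocess(pil_image).unsqueeze(0)).squeeze(0).numpy()

# svm = SVC(kernel='rbf').fit(Z_train, y_train)  # Z_train: stacked feature rows
```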

22 CNN Performance: Classification Accuracy. Legend: TP true positive, FP false positive, TN true negative, FN false negative; $G_i$: image group i. Classification accuracy: $CA = (n_{TP} + n_{TN})/(n_{TP} + n_{FN} + n_{TN} + n_{FP})$. Probability of detection (matched-filter performance): 88.31%. * Zhu, P., Isaacs, J., Fu, B., and Ferrari, S., "Deep Learning Feature Extraction for Target Recognition and Classification in Underwater Sonar Images," 56th IEEE Conference on Decision and Control, Melbourne, Australia (accepted).

23 CNN Performance: True Positive Rate. Legend: TP true positive, FP false positive, TN true negative, FN false negative; $G_i$: image group i. True positive rate: $TPR = n_{TP}/(n_{TP} + n_{FN})$. Probability of detection (matched-filter performance): 88.31%. * Zhu, P., Isaacs, J., Fu, B., and Ferrari, S., "Deep Learning Feature Extraction for Target Recognition and Classification in Underwater Sonar Images," 56th IEEE Conference on Decision and Control, Melbourne, Australia (accepted).

24 CNN Performance: False Alarms. Legend: TP true positive, FP false positive, TN true negative, FN false negative; $G_i$: image group i. False positive rate: $FPR = n_{FP}/(n_{FP} + n_{TN})$. Probability of detection (matched-filter performance): 88.31%. * Zhu, P., Isaacs, J., Fu, B., and Ferrari, S., "Deep Learning Feature Extraction for Target Recognition and Classification in Underwater Sonar Images," 56th IEEE Conference on Decision and Control, Melbourne, Australia (accepted).
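The three metrics on slides 22-24 can be computed from counts in one place; this helper mirrors the reconstructed definitions (function and variable names are illustrative).

```python
import numpy as np

def rates(y_true, y_pred):
    """CA, TPR, FPR from binary labels, matching the slide definitions."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    ca = (tp + tn) / (tp + tn + fp + fn)   # classification accuracy
    tpr = tp / (tp + fn)                   # true positive rate
    fpr = fp / (fp + tn)                   # false positive rate
    return ca, tpr, fpr

print(rates([1, 1, 0, 0, 1], [1, 0, 0, 1, 1]))
```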

25 UUV-Sonar Information Value

26 UUV-Sonar Information Value. [Plots in world frame $F_W$, axes in distance (m): minefield with 520 targets, 20 mine objects, with example sonar images.] *Courtesy of Jason Isaacs, NSWC Panama City, FL

27 Key Variable Description.
$X_{uuv}$: vehicle position; $\theta_{uuv}$: vehicle orientation
$X_T$: target position; $\theta_T$: target orientation
$r = [d, \theta]$: d relative distance, $\theta$ relative orientation
$Y_T$: true classification; $\hat{Y}$: estimated classification
$F = \{S, H, W, L\}$: features, assumed constant (S object shape, L object length, W object width, H object height)
$I_s$: segmented image; z: CNN output features
M: sensor mode; E: noise level

28 Key Variables and Causal Relationships. [Flowchart linking: target features, image generation, CNN feature extraction and SVM classification, information value function learning, and vehicle path planning.]

29 Information Value Function Learning. Assume the information gain can be modeled as the expected entropy reduction (EER): $EER(\hat{y}_{pri}, \hat{F}, \hat{r}_{pri}, \hat{r}_{post}) = H_{pri}(\hat{y}_{pri}, \hat{F}, \hat{r}_{pri}) - \frac{1}{N}\sum_{i=1}^{N} H_{post}(\hat{y}_{pri}, \hat{y}^{(i)}_{post}, \hat{F}, \hat{r}_{pri}, \hat{r}_{post})$. Example: prior image location $\hat{r}_{pri} = \{[0,75), [5\pi/4, 7\pi/4)\}$; posterior image location $\hat{r}_{post} = \{[75,150], [-\pi/4, \pi/4)\}$; the posterior outcomes $\hat{y}^{(i)}_{post}$ ($\hat{y} = 1$ or $\hat{y} = 0$) are given in the training data. Discretization: $r = [d, \theta]$; $d \in \{[0,75), [75,150]\}$; $\theta \in \{[-\pi/4, \pi/4), [\pi/4, 3\pi/4), [3\pi/4, 5\pi/4), [5\pi/4, 7\pi/4]\}$.
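Because the classification variable $Y_T$ is binary, the EER reduces to a difference of Bernoulli entropies. A small sketch under that assumption (function names illustrative):

```python
import numpy as np

def bernoulli_entropy(p):
    """Shannon entropy (bits) of a Bernoulli with P(Y_T = 1) = p."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def eer(p_pri, p_post_list):
    """EER = H_pri - mean_i H_post^(i), per the slide's definition.

    p_pri: prior P(Y_T=1 | y_pri, F, r_pri); p_post_list: posterior
    probabilities for each training outcome y_post^(i).
    """
    return bernoulli_entropy(p_pri) - np.mean(
        [bernoulli_entropy(p) for p in p_post_list])

print(eer(0.5, [0.9, 0.8, 0.95]))  # ~0.51 bits: an informative viewpoint
```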

30 Information Value Function Learning. Because continuous probabilities (PDFs) are difficult to learn, the random variables are first discretized. The conditional probability table (CPT) is estimated from the dataset: $P(\hat{Y} \mid Y_T, S, L, W, H, r)$. The entropy H is calculated from the estimated conditional probabilities. [Bayesian network model, features $F = \{S, H, W, L\}$.] The information value function (EER) is learned from the dataset images and features.

31 Bayesian Network Model. Training data set: $D = \{(s_i, l_i, w_i, h_i, d_i, \theta_i, y_{T,i}, \hat{y}_i)\}_{i=1}^{n}$. Random variables: $S, L, W, H, D, \Theta, Y_T$, and $\hat{Y}$; realizations: $s_i, l_i, w_i, h_i, d_i, \theta_i, y_{T,i}$, and $\hat{y}_i$. [Bayesian network model, features $F = \{S, H, W, L\}$.] Probability parameters learned from the dataset: prior $P(Y_T \mid \hat{F}, \hat{r})$ and likelihood $P(\hat{y} \mid Y_T, \hat{F}, \hat{r})$. Discretization: $S \in \{$Cylindrical, Rectangular$\}$; $V = (LWH)^{1/3} \in \{[0, 0.14), [0.14, 0.3), [0.3, 1.1), [1.1, 1.7]\}$; $r = [d, \theta]$; $d \in \{[0,75), [75,150]\}$; $\theta \in \{[-\pi/4, \pi/4), [\pi/4, 3\pi/4), [3\pi/4, 5\pi/4), [5\pi/4, 7\pi/4]\}$.

32 BN Parameter Learning. [Bayesian network model, features $F = \{S, H, W, L\}$.] Notation: i variable node index; j parent-configuration index; k value index of the variable; $r_i$ number of values of variable i; $N_{ijk}$ count in the data set. Bayesian parameter estimation: the parameters are random variables. Multinomial likelihood of the training data set: $P(D \mid \theta) = \prod_{ijk} \theta_{ijk}^{N_{ijk}}$. Dirichlet prior of the parameters, with hyperparameters $\alpha_{ijk}$: $P(\theta_{ij} \mid \alpha) = \mathrm{Dirichlet}(\theta_{ij}; \alpha_{ij1}, \ldots, \alpha_{ijr_i}) \propto \prod_{k=1}^{r_i} \theta_{ijk}^{\alpha_{ijk}-1}$, where $\theta_{ij} = [\theta_{ij1}, \ldots, \theta_{ijr_i}]$. Posterior of the parameters: $P(\theta_{ij} \mid D, \alpha) \propto \prod_k \theta_{ijk}^{N_{ijk} + \alpha_{ijk} - 1}$. MAP estimate: $\hat{\theta}^{MAP}_{ijk} = \dfrac{N_{ijk} + \alpha_{ijk} - 1}{\sum_{k'}(N_{ijk'} + \alpha_{ijk'} - 1)}$.
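Once the counts $N_{ijk}$ and hyperparameters $\alpha_{ijk}$ are tabulated, the MAP estimate is a one-liner. A toy sketch for a single parent configuration (assumes $\alpha_{ijk} \ge 1$ so the MAP formula is well defined; names are illustrative):

```python
import numpy as np

def map_cpt(counts, alpha):
    """MAP estimate of one CPT column theta_ij* from counts N_ijk and
    Dirichlet hyperparameters alpha_ijk, per the slide's formula."""
    counts = np.asarray(counts, dtype=float)
    alpha = np.asarray(alpha, dtype=float)      # require alpha >= 1
    num = counts + alpha - 1.0
    return num / num.sum()

# e.g. a binary variable (r_i = 2) with 5 and 3 observations and a
# uniform Dirichlet(2, 2) prior:
print(map_cpt([5, 3], [2, 2]))                  # -> [0.6, 0.4]
```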

33 Posterior Probability Learning. Posterior probabilities are calculated via Bayes' rule. Given image $I_{pri}$: $P(Y_T = 1 \mid \hat{y}_{pri}, \hat{F}, \hat{r}_{pri}) = \dfrac{P(Y_T = 1 \mid \hat{F}, \hat{r}_{pri})\, P(\hat{y}_{pri} \mid Y_T = 1, \hat{F}, \hat{r}_{pri})}{P(\hat{y}_{pri} \mid \hat{F}, \hat{r}_{pri})}$. Given images $I_{pri}$ and $I_{post}$: $P(Y_T = 1 \mid \hat{y}_{pri}, \hat{y}_{post}, \hat{F}, \hat{r}_{pri}, \hat{r}_{post}) = \dfrac{P(Y_T = 1 \mid \hat{F}, \hat{r}_{pri})\, P(\hat{y}_{pri} \mid Y_T = 1, \hat{F}, \hat{r}_{pri})\, P(\hat{y}_{post} \mid Y_T = 1, \hat{F}, \hat{r}_{post})}{P(\hat{y}_{pri}, \hat{y}_{post} \mid \hat{F}, \hat{r}_{pri}, \hat{r}_{post})}$. Instantiations such as $P(\hat{y}_{pri} \mid \hat{F}, \hat{r}_{pri})$ and $P(\hat{y}_{pri}, \hat{y}_{post} \mid \hat{F}, \hat{r}_{pri}, \hat{r}_{post})$ can also be calculated.
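For binary $Y_T$, the first Bayes-rule update amounts to the arithmetic below; the prior and likelihood values would be read off the learned CPTs (the numbers here are illustrative only).

```python
def posterior(prior, lik1, lik0):
    """P(Y_T=1 | y_hat, F, r) via Bayes' rule.

    prior : P(Y_T=1 | F, r) from the learned prior CPT
    lik1  : P(y_hat | Y_T=1, F, r) from the learned likelihood CPT
    lik0  : P(y_hat | Y_T=0, F, r)
    """
    evidence = prior * lik1 + (1.0 - prior) * lik0   # P(y_hat | F, r)
    return prior * lik1 / evidence

print(posterior(0.3, 0.9, 0.2))   # prior 0.3 lifted to ~0.66 by a detection
```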

34 Results

35 Influence of UUV Position on Classification, Dataset A: Low Frequency. Probability of correct classification (total): cylindrical object 91.11%, rectangular object 83.02%. [Plots: probability of correct classification vs. relative distance d (m) and vs. relative orientation (rad).]

36 Influence of UUV Position on Classification, Dataset A: High Frequency. Probability of correct classification (total): cylindrical object 87.04%, rectangular object 86.27%. [Plots: probability of correct classification vs. relative distance d (m) and vs. relative orientation (rad).]

37 Influence of UUV Position on Classification, Dataset B: Low Frequency. Probability of correct classification (total): cylindrical object 70.97%, rectangular object 43.48%. [Plots: probability of correct classification vs. relative distance d (m) and vs. relative orientation (rad).]

38 Influence of UUV Position on Classification, Dataset B: High Frequency. Probability of correct classification (total): cylindrical object 77.03%, rectangular object 86.23%. [Plots: probability of correct classification vs. relative distance d (m) and vs. relative orientation (rad).]

39 Information Value Function Learning (recap). The information gain is modeled as the expected entropy reduction: $EER(\hat{y}_{pri}, \hat{F}, \hat{r}_{pri}, \hat{r}_{post}) = H_{pri}(\hat{y}_{pri}, \hat{F}, \hat{r}_{pri}) - \frac{1}{N}\sum_{i=1}^{N} H_{post}(\hat{y}_{pri}, \hat{y}^{(i)}_{post}, \hat{F}, \hat{r}_{pri}, \hat{r}_{post})$, with prior image location $\hat{r}_{pri} = \{[0,75), [5\pi/4, 7\pi/4)\}$, posterior image location $\hat{r}_{post} = \{[75,150], [-\pi/4, \pi/4)\}$, posterior outcomes $\hat{y}^{(i)}_{post}$ given in the training data, and the discretization $d \in \{[0,75), [75,150]\}$, $\theta \in \{[-\pi/4, \pi/4), [\pi/4, 3\pi/4), [3\pi/4, 5\pi/4), [5\pi/4, 7\pi/4]\}$.

40 Results: Learned Information Gain. Target #2: $y_T = 1$, s = rectangular, l = 1.25 m, w = 0.61 m, h = 0.47 m. [EER maps, axes in distance (meters), with target frame $F_T$ at $O_T$ and vehicle frame $F_A$ at $O_A$: correct prior classification ($\hat{y}_{pri} = 1$) vs. incorrect prior classification ($\hat{y}_{pri} = 0$).]

41 Results: Learned Information Gain. Target #230: $y_T = 0$, s = circular, l = 0.05 m, w = 0.20 m, h = 0.27 m. [EER maps, axes in distance (meters), with target frame $F_T$ at $O_T$ and vehicle frame $F_A$ at $O_A$: correct prior classification ($\hat{y}_{pri} = 0$) vs. incorrect prior classification ($\hat{y}_{pri} = 1$).]

42 CNN-SVM Learning with Limited Data. [Plot: probability of correct classification vs. percentage of data used for training.]
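A hedged sketch of how such a limited-data curve could be generated: retrain the SVM stage on stratified subsets of growing size and score each model on a fixed held-out split. All names and split choices are assumptions, not the authors' evaluation protocol.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

def learning_curve(Z, y, fractions=(0.1, 0.25, 0.5, 0.75, 1.0), seed=0):
    """Correct-classification rate of the SVM stage vs. training fraction.

    Z: CNN feature matrix (n_samples x 4096); y: binary labels.
    """
    Z_tr, Z_te, y_tr, y_te = train_test_split(
        Z, y, test_size=0.3, stratify=y, random_state=seed)
    scores = []
    for f in fractions:
        if f < 1.0:
            # stratified subsample keeps both classes in every subset
            Z_sub, _, y_sub, _ = train_test_split(
                Z_tr, y_tr, train_size=f, stratify=y_tr, random_state=seed)
        else:
            Z_sub, y_sub = Z_tr, y_tr
        scores.append(SVC().fit(Z_sub, y_sub).score(Z_te, y_te))
    return scores
```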

43 MCM Unmanned Autonomy Evaluation Simulator (MUAES)

44 Conclusions. MCM information-driven path planning and control: directional information gain; automatic CNN-SVM target recognition and detection; new reference frames and problem formulation; Bayesian model of the UUV-sonar sensing problem; information gain function learning; MUAES visualization and testing. Ongoing and future work: information-driven MCM acquisition and classification; MCM path planning and control; validation in MUAES sonar imaging.

45 Acknowledgments. Collaborators: Jason Isaacs, Ph.D., Naval Surface Warfare Center, Panama City, FL; Drew Lucas, Ph.D., Naval Surface Warfare Center, Panama City, FL; Matt Bays, Ph.D., Naval Surface Warfare Center, Panama City, FL; Pingping Zhu, Ph.D., LISC, MEMS, Duke; Bo Fu, Ph.D., LISC, MAE, Cornell. This work was funded by ONR Code 321OE, MCM Autonomy Program.
