Probabilistic image processing and Bayesian network
1 Probabilistic image processing and Bayesian network. Kazuyuki Tanaka, Graduate School of Information Sciences, Tohoku University. kazu@smapip.is.tohoku.ac.jp. References: K. Tanaka: Statistical-mechanical approach to image processing (Topical Review), J. Phys. A, vol. 35, pp. R81-R150 (2002); K. Tanaka, H. Shouno, M. Okada and D. M. Titterington: Accuracy of the Bethe approximation for hyperparameter estimation in probabilistic image processing, J. Phys. A, vol. 37 (2004). 28 November 2005, CISJ2005.
2 Bayesian Network and Belief Propagation. Probabilistic Information Processing: Bayes Formula, Bayesian Network, Probabilistic Model, Belief Propagation. J. Pearl: Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference (Morgan Kaufmann, 1988); C. Berrou and A. Glavieux: Near optimum error correcting coding and decoding: Turbo-codes, IEEE Trans. Comm., vol. 44 (1996).
3 Contents: 1. Introduction; 2. Belief Propagation; 3. Bayesian Image Analysis and Gaussian Graphical Model; 4. Concluding Remarks.
4 Belief Propagation. How should we treat the calculation of the summation over the $2^N$ configurations,

$$\sum_{x_1=0}^{1}\sum_{x_2=0}^{1}\cdots\sum_{x_N=0}^{1} W(x_1, x_2, \ldots, x_N)\,?$$

It is very hard to calculate exactly except in some special cases, which motivates both the formulation of approximate algorithms and the study of their accuracy.
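As a concrete illustration of why this summation is hard, the brute-force enumeration below sums over all $2^N$ configurations of a small chain model; the pairwise coupling J and the model itself are illustrative assumptions, not taken from the slides.

```python
import itertools

import numpy as np

# Illustrative chain model: W(x1..xN) = exp(J * number of agreeing
# neighbour pairs).  J is an assumed constant, not from the slides.
N = 10
J = 1.0

def weight(x):
    """Unnormalized weight W(x1, ..., xN) of one configuration."""
    return np.exp(J * sum(x[i] == x[i + 1] for i in range(len(x) - 1)))

# The nested sums over x1 = 0,1; ...; xN = 0,1 contain 2^N terms.
Z = sum(weight(x) for x in itertools.product((0, 1), repeat=N))
print(f"summed 2^{N} = {2 ** N} terms")
```

Already at N = 30 this enumeration would exceed a billion terms, which is the point of the slide: the exact sum is only feasible in special cases (such as loop-free graphs, where it factorizes).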
5 Tractable Model. Probabilistic models with no loop are tractable because the summations factorize: for a chain A-B-C-D,

$$P(a) \propto \sum_{b}\sum_{c}\sum_{d} W_{AB}(a,b)\,W_{BC}(b,c)\,W_{CD}(c,d) = \sum_{b} W_{AB}(a,b) \sum_{c} W_{BC}(b,c) \sum_{d} W_{CD}(c,d),$$

so each summation is taken in turn over a single variable. Probabilistic models with loops do not factorize in this way and are not tractable.
6 Probabilistic model on a graph with no loop. For a star-shaped graph in which node y is connected to nodes a, b, c, d,

$$W(a,b,c,d,y) = W_A(a,y)\,W_B(b,y)\,W_C(c,y)\,W_D(d,y),$$

the marginal probability of node y factorizes as

$$P(y) \propto \Big(\sum_a W_A(a,y)\Big)\Big(\sum_b W_B(b,y)\Big)\Big(\sum_c W_C(c,y)\Big)\Big(\sum_d W_D(d,y)\Big).$$
7 Probabilistic model on a graph with no loop. The marginal probability of a node can be expressed as the product of the messages from all of its neighbouring nodes, for example

$$P(y) \propto M_4(y)\,M_5(y)\,M_6(y).$$

The message from node i to node j is in turn expressed through the messages from all the neighbouring nodes of node i except the one from node j:

$$M_{i \to j}(y) = \sum_{x} W(x, y) \prod_{k \in \partial i \setminus \{j\}} M_{k \to i}(x).$$
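On a loop-free graph these message recursions reproduce the exact marginals. The sketch below runs them on a binary chain and checks the result against the brute-force $2^N$ summation; the unary weight phi and pairwise weight W are illustrative choices.

```python
import itertools

import numpy as np

# Belief propagation on a loop-free chain of N binary nodes.
# phi(x_i) and W(x_i, x_j) are illustrative, not from the slides.
N = 6
phi = np.array([1.0, 0.6])                         # node weight phi(x_i)
W = np.exp(0.8 * np.array([[1, -1], [-1, 1.0]]))   # edge weight W(x_i, x_j)

# Forward and backward message recursions (normalized for stability).
fwd = [np.ones(2)]
for _ in range(N - 1):
    m = W.T @ (phi * fwd[-1])
    fwd.append(m / m.sum())
bwd = [np.ones(2)]
for _ in range(N - 1):
    m = W @ (phi * bwd[-1])
    bwd.append(m / m.sum())
bwd = bwd[::-1]

def bp_marginal(k):
    """Marginal at node k: node weight times all incoming messages."""
    b = phi * fwd[k] * bwd[k]
    return b / b.sum()

def brute_marginal(k):
    """Exact marginal by summing all 2^N configurations."""
    p = np.zeros(2)
    for x in itertools.product((0, 1), repeat=N):
        w = np.prod(phi[list(x)]) * np.prod(
            [W[x[i], x[i + 1]] for i in range(N - 1)])
        p[x[k]] += w
    return p / p.sum()

print(bp_marginal(2), brute_marginal(2))
```

Each message is the result of one single-variable summation, so the whole computation costs O(N) instead of O(2^N).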
8 Probabilistic Model on a Graph with Loops.

$$P(x_1, \ldots, x_N) = \frac{1}{Z} \prod_{\{i,j\} \in \mathcal{N}} W_{ij}(x_i, x_j).$$

Marginal probabilities:

$$P(x_i) = \sum_{\{x_k:\,k \neq i\}} P(x_1, \ldots, x_N), \qquad P(x_i, x_j) = \sum_{\{x_k:\,k \neq i,\,j\}} P(x_1, \ldots, x_N).$$
9 Belief Propagation. The one-node and two-node beliefs are expressed as products of incoming messages,

$$Q_i(x_i) = \frac{1}{Z_i} \prod_{k \in \partial i} M_{k \to i}(x_i), \qquad Q_{ij}(x_i, x_j) = \frac{1}{Z_{ij}}\, W_{ij}(x_i, x_j) \prod_{k \in \partial i \setminus \{j\}} M_{k \to i}(x_i) \prod_{l \in \partial j \setminus \{i\}} M_{l \to j}(x_j),$$

and the consistency condition $Q_i(\xi) = \sum_{\zeta} Q_{ij}(\xi, \zeta)$ yields the message update rule

$$M_{i \to j}(x_j) \propto \sum_{x_i} W_{ij}(x_i, x_j) \prod_{k \in \partial i \setminus \{j\}} M_{k \to i}(x_i).$$
10 Message Passing Rule of Belief Propagation. With the normalization written out, a message update reads

$$M(\xi) = \frac{\sum_{\zeta} W(\zeta, \xi)\, M_1(\zeta)\, M_2(\zeta)\, M_3(\zeta)}{\sum_{\xi'} \sum_{\zeta} W(\zeta, \xi')\, M_1(\zeta)\, M_2(\zeta)\, M_3(\zeta)}.$$

Collecting all such updates gives a set of fixed point equations for the messages, $\vec{M} = \Phi(\vec{M})$.
11 Fixed Point Equation and Iterative Method. Fixed point equation: $\vec{M}^{*} = \Phi(\vec{M}^{*})$. Iterative method: starting from an initial guess $\vec{M}_0$, set

$$\vec{M}_{r+1} = \Phi(\vec{M}_r), \qquad r = 0, 1, 2, \ldots$$

Graphically, the iterates $M_0, M_1, M_2, \ldots$ converge to the fixed point $M^{*}$ at the intersection of $y = \Phi(x)$ with $y = x$.
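The same fixed-point iteration M <- Phi(M) can be run on a graph that does contain a loop (loopy belief propagation), where the result is approximate rather than exact. A minimal sketch on a 4-node cycle with illustrative weights:

```python
import numpy as np

# Loopy belief propagation on a 4-node cycle, run as a fixed-point
# iteration of the message update rule.  phi and W are illustrative.
phi = np.array([1.0, 0.6])
W = np.exp(0.5 * np.array([[1, -1], [-1, 1.0]]))   # symmetric edge weight
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
nbrs = {i: [] for i in range(4)}
for i, j in edges:
    nbrs[i].append(j)
    nbrs[j].append(i)

# One message per directed edge, initialized uniformly.
msg = {(i, j): np.ones(2) for i in nbrs for j in nbrs[i]}
for sweep in range(200):
    delta = 0.0
    for (i, j) in list(msg):
        prod = phi.copy()
        for k in nbrs[i]:
            if k != j:
                prod = prod * msg[(k, i)]
        m = W.T @ prod          # W is symmetric, so orientation is moot
        m = m / m.sum()
        delta = max(delta, np.abs(m - msg[(i, j)]).max())
        msg[(i, j)] = m
    if delta < 1e-9:            # converged to the fixed point M*
        break

def belief(i):
    """Approximate marginal at node i from the converged messages."""
    b = phi.copy()
    for k in nbrs[i]:
        b = b * msg[(k, i)]
    return b / b.sum()

print(sweep, belief(0))
```

On this small cycle the iteration converges in a few sweeps and the beliefs land close to, but not exactly on, the true marginals, which is the accuracy question the later slides quantify.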
12 Contents: 1. Introduction; 2. Belief Propagation; 3. Bayesian Image Analysis and Gaussian Graphical Model; 4. Concluding Remarks.
13 Bayesian Image Analysis. Transmission with noise: Original Image -> Degraded Image. By the Bayes formula,

$$\Pr\{\text{Original Image} \mid \text{Degraded Image}\} = \frac{\Pr\{\text{Degraded Image} \mid \text{Original Image}\}\;\Pr\{\text{Original Image}\}}{\Pr\{\text{Degraded Image}\}},$$

that is, the a posteriori probability is the degradation process times the a priori probability, divided by the marginal likelihood.
14 Bayesian Image Analysis: Degradation Process. Each degraded pixel is the original pixel plus additive white Gaussian noise, $y_i = x_i + n_i$ with $n_i \sim N(0, \sigma^2)$ for $i \in \Omega$, so

$$P(\mathbf{y} \mid \mathbf{x}) = \prod_{i \in \Omega} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left(-\frac{(y_i - x_i)^2}{2\sigma^2}\right).$$
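The degradation process is straightforward to simulate. In the sketch below, a toy two-level 1-D "original image" and the value of sigma are illustrative assumptions:

```python
import numpy as np

# Additive white Gaussian noise degradation: y_i = x_i + n_i,
# n_i ~ N(0, sigma^2).  Signal and sigma are illustrative choices.
rng = np.random.default_rng(0)
sigma = 40.0
x = np.where(np.arange(256) < 128, 100.0, 180.0)   # toy 1-D original
n = rng.normal(0.0, sigma, size=x.shape)           # white Gaussian noise
y = x + n                                          # degraded signal
print(f"empirical noise std = {n.std():.1f}")
```

The restoration problem of the following slides is to recover x from y given this likelihood.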
15 Bayesian Image Analysis: A Priori Probability.

$$P(\mathbf{x}) = \frac{1}{Z(\alpha)} \exp\!\left(-\frac{\alpha}{2} \sum_{\{i,j\} \in \mathcal{N}} (x_i - x_j)^2\right).$$

Do patterns generated from this prior look similar to standard images?
16 Bayesian Image Analysis: A Posteriori Probability.

$$P(\mathbf{x} \mid \mathbf{y}) = \frac{1}{Z_{\mathrm{POS}}} \exp\!\left(-\frac{1}{2\sigma^2} \sum_{i \in \Omega} (y_i - x_i)^2 - \frac{\alpha}{2} \sum_{\{i,j\} \in \mathcal{N}} (x_i - x_j)^2\right).$$

This a posteriori probability is a Gaussian graphical model.
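Because this posterior is a multivariate Gaussian, its mean (which coincides with its maximum) is obtained from one linear solve, $\hat{\mathbf{x}} = (I + \alpha\sigma^2 C)^{-1}\mathbf{y}$ with $C$ the graph Laplacian of the neighbour structure. A minimal 1-D chain sketch; alpha, sigma, and the test signal are illustrative assumptions:

```python
import numpy as np

# Posterior-mean restoration for the Gaussian graphical model on a
# 1-D chain: solve (I + alpha * sigma^2 * C) x_hat = y, where C is the
# chain-graph Laplacian.  All constants here are illustrative.
rng = np.random.default_rng(1)
N, alpha, sigma = 128, 0.2, 30.0
x = np.where(np.arange(N) < N // 2, 100.0, 180.0)   # toy original
y = x + rng.normal(0.0, sigma, N)                   # degraded signal

C = (np.diag(np.r_[1.0, 2.0 * np.ones(N - 2), 1.0])
     - np.diag(np.ones(N - 1), 1) - np.diag(np.ones(N - 1), -1))

x_hat = np.linalg.solve(np.eye(N) + alpha * sigma**2 * C, y)
mse_deg = np.mean((y - x) ** 2)
mse_rest = np.mean((x_hat - x) ** 2)
print(f"MSE degraded {mse_deg:.0f} -> restored {mse_rest:.0f}")
```

This exact linear solve is what the "exact results" slide later compares belief propagation against.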
17 Bayesian Image Analysis. Original image $\mathbf{x} = \{x_i,\ i \in \Omega\}$ and degraded image $\mathbf{y} = \{y_i,\ i \in \Omega\}$, where $\Omega$ is the set of pixels. The a posteriori probability is $P(\mathbf{x} \mid \mathbf{y}) \propto P(\mathbf{y} \mid \mathbf{x})\, P(\mathbf{x})$, and the restored image is the posterior mean,

$$\hat{x}_i = \int x_i\, P(\mathbf{x} \mid \mathbf{y})\, d\mathbf{x}.$$
18 Hyperparameter Determination by Maximization of Marginal Likelihood. The hyperparameters are chosen as

$$(\hat{\alpha}, \hat{\sigma}) = \arg\max_{\alpha, \sigma} P(\mathbf{y} \mid \alpha, \sigma), \qquad P(\mathbf{y} \mid \alpha, \sigma) = \int P(\mathbf{y} \mid \mathbf{x}, \sigma)\, P(\mathbf{x} \mid \alpha)\, d\mathbf{x},$$

where the marginal likelihood is obtained by marginalizing the original image $\mathbf{x} = \{x_i,\ i \in \Omega\}$ out of the joint distribution with the degraded image $\mathbf{y} = \{y_i,\ i \in \Omega\}$. The restored image is then $\hat{x}_i = \int x_i\, P(\mathbf{x} \mid \mathbf{y}, \hat{\alpha}, \hat{\sigma})\, d\mathbf{x}$.
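Since both the prior and the noise are Gaussian, the marginal likelihood is itself a Gaussian density and can be evaluated in closed form. The sketch below maximizes it over a small grid; the ridge that makes the chain prior proper, the grid, and the test signal are illustrative assumptions:

```python
import numpy as np

# Hyperparameter selection by maximizing the closed-form marginal
# likelihood p(y | alpha, sigma) of the Gaussian model on a grid.
rng = np.random.default_rng(2)
N = 100
x = np.cumsum(rng.normal(0.0, 1.0, N))          # smooth "original" signal
y = x + rng.normal(0.0, 2.0, N)                 # degraded with sigma = 2

L = (np.diag(np.r_[1.0, 2.0 * np.ones(N - 2), 1.0])
     - np.diag(np.ones(N - 1), 1) - np.diag(np.ones(N - 1), -1)
     + 1e-6 * np.eye(N))                        # chain Laplacian + ridge

def log_marginal(alpha, sigma):
    """ln p(y | alpha, sigma) for y ~ N(0, sigma^2 I + (alpha L)^-1)."""
    cov = sigma**2 * np.eye(N) + np.linalg.inv(alpha * L)
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (N * np.log(2 * np.pi) + logdet
                   + y @ np.linalg.solve(cov, y))

grid = [(a, s) for a in (0.25, 0.5, 1.0, 2.0) for s in (1.0, 2.0, 4.0)]
best = max(grid, key=lambda p: log_marginal(*p))
print("grid maximizer (alpha, sigma):", best)
```

A grid search is the bluntest possible maximizer; the next slides replace it with the EM algorithm, which climbs the same marginal likelihood iteratively.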
19 Maximization of Marginal Likelihood by EM (Expectation Maximization) Algorithm. The degraded image $\mathbf{y}$ is the incomplete data, with marginal likelihood $P(\mathbf{y} \mid \alpha, \sigma) = \int P(\mathbf{y} \mid \mathbf{x}, \sigma)\, P(\mathbf{x} \mid \alpha)\, d\mathbf{x}$. Define the Q-function

$$Q(\alpha', \sigma' \mid \alpha, \sigma) = \int P(\mathbf{x} \mid \mathbf{y}, \alpha, \sigma)\, \ln\!\big[P(\mathbf{y} \mid \mathbf{x}, \sigma')\, P(\mathbf{x} \mid \alpha')\big]\, d\mathbf{x}.$$

The stationarity conditions $\partial Q/\partial \alpha' = \partial Q/\partial \sigma' = 0$ at $\alpha' = \alpha$, $\sigma' = \sigma$ are equivalent to the stationarity conditions of the marginal likelihood, $\partial P(\mathbf{y} \mid \alpha, \sigma)/\partial \alpha = \partial P(\mathbf{y} \mid \alpha, \sigma)/\partial \sigma = 0$.
20 Maximization of Marginal Likelihood by EM (Expectation Maximization) Algorithm. Marginal likelihood: $P(\mathbf{y} \mid \alpha, \sigma) = \int P(\mathbf{y} \mid \mathbf{x}, \sigma)\, P(\mathbf{x} \mid \alpha)\, d\mathbf{x}$. Q-function: $Q(\alpha', \sigma' \mid \alpha, \sigma) = \int P(\mathbf{x} \mid \mathbf{y}, \alpha, \sigma)\, \ln P(\mathbf{x}, \mathbf{y} \mid \alpha', \sigma')\, d\mathbf{x}$. EM algorithm: iterate the following EM-steps until convergence. E-step: compute $Q(\alpha', \sigma' \mid \alpha(t), \sigma(t))$. M-step: $(\alpha(t+1), \sigma(t+1)) = \arg\max_{\alpha', \sigma'} Q(\alpha', \sigma' \mid \alpha(t), \sigma(t))$. A. P. Dempster, N. M. Laird and D. B. Rubin: Maximum likelihood from incomplete data via the EM algorithm, J. Roy. Stat. Soc. B, vol. 39 (1977).
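For the Gaussian model both EM steps are available in closed form. The sketch below runs them on a 1-D chain, using the exact Gaussian E-step in place of belief propagation; the ridge that makes the prior proper, the random-walk test signal, and all constants are illustrative assumptions:

```python
import numpy as np

# EM iteration for the Gaussian graphical model on a 1-D chain.
# Prior precision alpha * L, noise variance s2; exact Gaussian E-step.
rng = np.random.default_rng(1)
N = 200
x_true = np.cumsum(rng.normal(0.0, 1.0, N))     # random-walk "original"
sigma_true = 2.0
y = x_true + rng.normal(0.0, sigma_true, N)     # incomplete data

L = (np.diag(np.r_[1.0, 2.0 * np.ones(N - 2), 1.0])
     - np.diag(np.ones(N - 1), 1) - np.diag(np.ones(N - 1), -1)
     + 1e-6 * np.eye(N))                        # chain Laplacian + ridge

alpha, s2 = 1.0, 1.0                            # initial hyperparameters
for _ in range(50):
    # E-step: posterior p(x | y, alpha, sigma) is Gaussian
    S = np.linalg.inv(alpha * L + np.eye(N) / s2)   # posterior covariance
    m = S @ y / s2                                  # posterior mean
    # M-step: closed-form maximizers of the Q-function
    s2 = (np.sum((y - m) ** 2) + np.trace(S)) / N
    alpha = N / (m @ L @ m + np.trace(L @ S))

print(f"EM estimates: sigma = {np.sqrt(s2):.2f}, alpha = {alpha:.2f}")
```

Each sweep increases the marginal likelihood, so the iteration climbs toward the same maximizer as the grid search, without ever evaluating the marginal likelihood explicitly.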
21 One-Dimensional Signal. Plots against pixel index i: the original signal $x_i$, the degraded signal $y_i = x_i + n_i$, and the signal $\hat{x}_i$ estimated by the EM algorithm.
22 Image Restoration by Gaussian Graphical Model. Original image, degraded image, and the restoration produced by the EM algorithm with belief propagation, each annotated with its MSE.
23 Exact Results of Gaussian Graphical Model. Writing the a posteriori probability as

$$P(\mathbf{x} \mid \mathbf{y}) = \frac{1}{Z_{\mathrm{POS}}} \exp\!\left(-\frac{1}{2\sigma^2}\, \|\mathbf{y} - \mathbf{x}\|^2 - \frac{\alpha}{2}\, \mathbf{x}^{\mathrm{T}} C\, \mathbf{x}\right),$$

where $C$ is the matrix of the quadratic form $\sum_{\{i,j\} \in \mathcal{N}} (x_i - x_j)^2 = \mathbf{x}^{\mathrm{T}} C\, \mathbf{x}$, the multi-dimensional Gauss integral formula gives the posterior mean and the marginal likelihood in closed form:

$$\hat{\mathbf{x}} = \big(I + \alpha \sigma^2 C\big)^{-1} \mathbf{y}, \qquad (\hat{\alpha}, \hat{\sigma}) = \arg\max_{\alpha, \sigma} P(\mathbf{y} \mid \alpha, \sigma),$$

with $P(\mathbf{y} \mid \alpha, \sigma)$ expressed through determinants involving $I + \alpha \sigma^2 C$.
24 Comparison of Belief Propagation with Exact Results in Gaussian Graphical Model. For noise level $\sigma = 40$, the hyperparameters $(\hat{\alpha}, \hat{\sigma}) = \arg\max \ln P(\mathbf{y} \mid \alpha, \sigma)$ and the resulting mean squared error, $\mathrm{MSE} = \frac{1}{|\Omega|}\sum_{i \in \Omega} (\hat{x}_i - x_i)^2$, are tabulated for belief propagation and for the exact calculation on two test images.
25 Image Restoration by Gaussian Graphical Model. Original image and degraded image, with restorations by belief propagation, by the exact calculation, and by conventional lowpass, Wiener and median filters, each annotated with its MSE, $\mathrm{MSE} = \frac{1}{|\Omega|}\sum_{i \in \Omega} (\hat{x}_i - x_i)^2$.
26 Image Restoration by Gaussian Graphical Model. A second test image: original and degraded images, with restorations by belief propagation, by the exact calculation, and by conventional lowpass, Wiener and median filters, each annotated with its MSE.
27 Extension of Belief Propagation: Generalized Belief Propagation. J. S. Yedidia, W. T. Freeman and Y. Weiss: Constructing free-energy approximations and generalized belief propagation algorithms, IEEE Transactions on Information Theory, vol. 51 (2005). Generalized belief propagation is equivalent to the cluster variation method in statistical mechanics: R. Kikuchi: A theory of cooperative phenomena, Phys. Rev., vol. 81 (1951); T. Morita: Cluster variation method of cooperative phenomena and its generalization I, J. Phys. Soc. Jpn., vol. 12 (1957).
28 Image Restoration by Gaussian Graphical Model. For $\sigma = 40$, the estimates $(\hat{\alpha}, \hat{\sigma}) = \arg\max \ln P(\mathbf{y} \mid \alpha, \sigma)$ and the MSE, $\mathrm{MSE} = \frac{1}{|\Omega|}\sum_{i \in \Omega} (\hat{x}_i - x_i)^2$, are compared across belief propagation, generalized belief propagation, and the exact calculation on two test images; generalized belief propagation is the closer of the two approximations to the exact result.
29 Image Restoration by Gaussian Graphical Model and Conventional Filters ($\sigma = 40$). MSE comparison, with $\mathrm{MSE} = \frac{1}{|\Omega|}\sum (\hat{x} - x)^2$: belief propagation, generalized belief propagation, and the exact calculation against lowpass, median and Wiener filters with 3x3 and 5x5 windows. Shown: the 3x3 lowpass, 5x5 median and 5x5 Wiener outputs.
30 Image Restoration by Gaussian Graphical Model and Conventional Filters ($\sigma = 40$). The same MSE comparison for a second test image: belief propagation, generalized belief propagation, and the exact calculation against lowpass, median and Wiener filters with 3x3 and 5x5 windows. Shown: the 5x5 lowpass, 5x5 median and 5x5 Wiener outputs.
31 Contents: 1. Introduction; 2. Belief Propagation; 3. Bayesian Image Analysis and Gaussian Graphical Model; 4. Concluding Remarks.
32 Summary. Formulation of belief propagation. Accuracy of belief propagation in Bayesian image analysis by means of the Gaussian graphical model (comparison between belief propagation and the exact calculation).
More informationLearning Conditional Probabilities from Incomplete Data: An Experimental Comparison Marco Ramoni Knowledge Media Institute Paola Sebastiani Statistics
Learning Conditional Probabilities from Incomplete Data: An Experimental Comparison Marco Ramoni Knowledge Media Institute Paola Sebastiani Statistics Department Abstract This paper compares three methods
More informationApproximating the Partition Function by Deleting and then Correcting for Model Edges (Extended Abstract)
Approximating the Partition Function by Deleting and then Correcting for Model Edges (Extended Abstract) Arthur Choi and Adnan Darwiche Computer Science Department University of California, Los Angeles
More informationDoes Better Inference mean Better Learning?
Does Better Inference mean Better Learning? Andrew E. Gelfand, Rina Dechter & Alexander Ihler Department of Computer Science University of California, Irvine {agelfand,dechter,ihler}@ics.uci.edu Abstract
More informationBayesian Networks: Construction, Inference, Learning and Causal Interpretation. Volker Tresp Summer 2016
Bayesian Networks: Construction, Inference, Learning and Causal Interpretation Volker Tresp Summer 2016 1 Introduction So far we were mostly concerned with supervised learning: we predicted one or several
More informationAP Calculus BC Final Exam Preparatory Materials December 2016
AP Calculus BC Final Eam Preparatory Materials December 06 Your first semester final eam will consist of both multiple choice and free response questions, similar to the AP Eam The following practice problems
More informationName: MA 160 Dr. Katiraie (100 points) Test #3 Spring 2013
Name: MA 160 Dr. Katiraie (100 points) Test #3 Spring 2013 Show all of your work on the test paper. All of the problems must be solved symbolically using Calculus. You may use your calculator to confirm
More informationBayesian Approach 2. CSC412 Probabilistic Learning & Reasoning
CSC412 Probabilistic Learning & Reasoning Lecture 12: Bayesian Parameter Estimation February 27, 2006 Sam Roweis Bayesian Approach 2 The Bayesian programme (after Rev. Thomas Bayes) treats all unnown quantities
More informationColor Scheme. swright/pcmi/ M. Figueiredo and S. Wright () Inference and Optimization PCMI, July / 14
Color Scheme www.cs.wisc.edu/ swright/pcmi/ M. Figueiredo and S. Wright () Inference and Optimization PCMI, July 2016 1 / 14 Statistical Inference via Optimization Many problems in statistical inference
More informationComment on Article by Scutari
Bayesian Analysis (2013) 8, Number 3, pp. 543 548 Comment on Article by Scutari Hao Wang Scutari s paper studies properties of the distribution of graphs ppgq. This is an interesting angle because it differs
More information