Fast approximate curve evolution


Fast approximate curve evolution
James Malcolm, Yogesh Rathi, Anthony Yezzi, Allen Tannenbaum
Georgia Institute of Technology, Atlanta, GA
Brigham and Women's Hospital, Boston, MA

Introduction
- Curve evolution is a robust technique for variational image segmentation
- Active contours, level set methods
- The numerics involved make its use in high-performance systems computationally prohibitive, e.g. upwind derivatives:

  g(x,y)\,|\nabla\phi| \approx \Big( \big[\max(g(x,y),0)\,\phi_x^+ + \min(g(x,y),0)\,\phi_x^-\big]^2 + \big[\max(g(x,y),0)\,\phi_y^+ + \min(g(x,y),0)\,\phi_y^-\big]^2 \Big)^{1/2}

- We propose an approximate curve evolution scheme that is simple, fast, and accurate.
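For comparison, the full upwind scheme above can be sketched with NumPy as follows. This is a minimal sketch, not the paper's implementation; the function name is illustrative and a grid spacing of 1 is assumed, with one-sided values at the image border.

```python
import numpy as np

def upwind_grad_mag(phi, g):
    """Upwind approximation of g(x,y)|grad phi| on a regular grid (spacing 1).

    Forward or backward differences are selected by the sign of the speed g,
    mirroring the scheme on the slide. Border cells reuse the edge value.
    """
    # One-sided differences, edge-padded so shapes match phi.
    fx = np.diff(phi, axis=1, append=phi[:, -1:])   # forward  d/dx
    bx = np.diff(phi, axis=1, prepend=phi[:, :1])   # backward d/dx
    fy = np.diff(phi, axis=0, append=phi[-1:, :])   # forward  d/dy
    by = np.diff(phi, axis=0, prepend=phi[:1, :])   # backward d/dy

    gp = np.maximum(g, 0.0)   # positive speeds take the forward difference
    gm = np.minimum(g, 0.0)   # negative speeds take the backward difference
    return np.sqrt((gp * fx + gm * bx) ** 2 + (gp * fy + gm * by) ** 2)
```

This per-pixel branching on the sign of g is exactly the bookkeeping the proposed scheme avoids.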

Energy minimization
- Formulate image segmentation as an energy minimization problem: E(x) = E_data(x) + E_smoothness(x)
- Find the curve that optimally separates object from background: E(C) = E_data(C) + E_smoothness(C)
- Equivalently, use a signed distance function φ: E(φ) = E_data(φ) + E_smoothness(φ)
- Iteratively deform the curve to minimize the energy:

  \nabla E(\phi) = (g(\phi) + \kappa)\,|\nabla\phi|

Numerical implementation
- Signed distance function adhering to |∇φ| = 1
- Nonlinear partial differential equation
- Stability requires upwinding, finite differencing schemes, forward Euler updates, regularization, etc.

Related work
Various techniques have been proposed, each with tradeoffs:
- Perform numerical computations only in a narrow band around the interface (Adalsteinsson and Sethian 1995; Whitaker 1998)
  - Reduced domain, but still full numerics
- Binary representation (Gibou and Fedkiw 2005)
  - Removes much of the numerics, but the force is computed on the entire domain
- Discrete representation and list switching (Shi and Karl 2005)
  - Removes the numerics, but requires interpolation off the interface and computing the force twice

Continuous v. Discrete
- Continuous representation
- Discrete approximation
The discrete approximation only uses: φ = −1 inside, φ = 0 on the interface, φ = +1 outside.
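The three-valued field can be built directly from a binary mask. A pure-Python sketch under one reasonable convention (inside pixels that border the outside are marked as interface; the function name is illustrative):

```python
def discrete_phi(mask):
    """Build the three-valued field from a binary mask (True = inside):
    phi = -1 inside, 0 on interface pixels, +1 outside.

    An inside pixel is labeled interface if any of its 4-neighbors is
    outside the region (or outside the image).
    """
    h, w = len(mask), len(mask[0])
    phi = [[1] * w for _ in range(h)]          # default: outside
    for i in range(h):
        for j in range(w):
            if mask[i][j]:
                nbrs = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
                boundary = any(
                    not (0 <= a < h and 0 <= b < w) or not mask[a][b]
                    for a, b in nbrs
                )
                phi[i][j] = 0 if boundary else -1
    return phi
```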

Assumption: subpixel error makes little difference at the macro level. (Figure: discrete regions labeled φ = −1 inside and φ = +1 outside.)

Algorithm overview
1. Based on the energy, compute the force only along the interface
2. Points are propagated according to the force
3. The interface is cleaned up
4. Regional statistics are updated

Each iteration has two phases: dilation and contraction.
This work uses the discrete signed distance function: φ = −1 inside, φ = 0 on the interface, φ = +1 outside.

Main loop
for each iteration do
  {Contraction}
  Callback: compute force
  Restrict to contraction (only allow positive forces)
  Propagate, cleanup
  Callback: move points in and out
  {Dilation}
  Callback: compute force
  Restrict to dilation (only allow negative forces)
  Propagate, cleanup
  Callback: move points in and out
end for
Note: callbacks are energy specific.
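The control flow above can be sketched in Python. Only the two-phase structure comes from the deck; all callable names and signatures here are illustrative assumptions:

```python
def evolve(interface, compute_force, propagate, cleanup,
           move_in, move_out, iterations):
    """Skeleton of the two-phase main loop.

    Each iteration runs a contraction pass (positive forces only) and a
    dilation pass (negative forces only). The five callbacks are
    energy-specific and supplied by the caller.
    """
    for _ in range(iterations):
        for phase in ("contraction", "dilation"):
            force = compute_force(interface)          # {point: signed force}
            if phase == "contraction":
                force = {p: f for p, f in force.items() if f > 0}
            else:
                force = {p: f for p, f in force.items() if f < 0}
            interface = propagate(interface, force)   # unit-distance moves
            interface = cleanup(interface)            # keep interface minimal
            move_in(interface)    # callback: statistics for entering points
            move_out(interface)   # callback: statistics for leaving points
    return interface
```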

Compute force
- Based on the chosen energy, a force is computed at each point along the curve
- Only the sign of this force matters
- The discrete representation is suitable for approximate first-order derivatives

Movement of points
Points only move in four directions: up, down, left, right.

Points move a unit distance in the direction indicated by the force:
- Dilation
- Contraction
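On the discrete grid, a unit-distance move amounts to one-pixel growth or shrinkage of the region at its boundary. A set-based sketch of this idea, not the paper's code (the function name and the single shared force sign are simplifying assumptions):

```python
def propagate(inside, force_sign):
    """One unit-distance propagation step on a set of inside pixels.

    A negative force (dilation) adds the 4-neighbors of boundary pixels to
    the region; a positive force (contraction) removes boundary pixels.
    """
    def nbrs(p):
        i, j = p
        return [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]

    boundary = {p for p in inside if any(n not in inside for n in nbrs(p))}
    if force_sign < 0:                   # dilation: grow outward one pixel
        grown = set(inside)
        for p in boundary:
            grown.update(nbrs(p))
        return grown
    return inside - boundary             # contraction: shrink inward one pixel
```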

Maintaining a minimal interface
- Drop points that only touch one side of the interface
- We then only need to check up/down/left/right neighbors for movement decisions
- Prevents artifacts from developing
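The cleanup rule above can be sketched as a set filter. This is an illustrative interpretation, assuming the interface, inside, and outside pixels are held in separate sets:

```python
def cleanup(interface, inside, outside):
    """Keep only interface points that touch both sides.

    A point whose 4-neighbors are all inside (or all outside) no longer
    separates the regions and is dropped, keeping the interface minimal.
    """
    def nbrs(p):
        i, j = p
        return [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]

    return {p for p in interface
            if any(n in inside for n in nbrs(p))
            and any(n in outside for n in nbrs(p))}
```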

Incorporating an energy
Arbitrary energies are defined by three functions:
- computeforce(): compute the positive/negative energy gradient at each point along the curve, e.g. \nabla E(C) = g(C)\,N
- movein(), moveout(): update regional statistics based on the specified points moving across the interface
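This plugin structure can be written down as a small interface. The class and method signatures below are assumptions for illustration; only the three function names come from the slides:

```python
class Energy:
    """Interface an energy must implement to plug into the evolution.

    compute_force gives the signed gradient at each interface point;
    move_in / move_out update region statistics as pixels cross the
    interface, so no full-image pass is ever needed.
    """

    def compute_force(self, points):
        """Return {point: signed force} for the given interface points."""
        raise NotImplementedError

    def move_in(self, points):
        """Fold pixels entering the region into the statistics."""
        raise NotImplementedError

    def move_out(self, points):
        """Remove pixels leaving the region from the statistics."""
        raise NotImplementedError
```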

Why the two-phase approach?
- Without subpixel resolution, the curve can oscillate along an object boundary
- Oscillation produces jagged edges
- The contour remains roughly in the same position

Recap
for each iteration do
  {Contraction}
  Callback: compute force
  Restrict to contraction (only allow positive forces)
  Propagate, cleanup
  Callback: move points in and out
  {Dilation}
  Callback: compute force
  Restrict to dilation (only allow negative forces)
  Propagate, cleanup
  Callback: move points in and out
end for

Example energies
This work demonstrates two statistical intensity-based energies from the literature:
1. Separating regions represented by their mean intensity
2. Separating regions represented by their full density
We can ignore δ(φ) since we operate along the interface.

Mean intensity separation
- Characterize both regions by their average intensity
- The energy favors regions with low variance about the mean intensity (cartoon model) (Chan and Vese 2001; Yezzi et al. 1999)
- Energy and gradient:

  E = \int (I(x) - v)^2 H(\phi(x)) + (I(x) - u)^2 H(-\phi(x)) \, dx

  \nabla E = \delta(\phi) \left[ (I(x) - v)^2 - (I(x) - u)^2 \right]

- Recursively update the means as points move in and out
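The recursive mean update can be sketched as follows: when one pixel of intensity v crosses the interface, the region mean is adjusted in O(1) instead of being recomputed over the whole region. The function name and signature are illustrative:

```python
def update_mean(mean, count, value, entering):
    """Incrementally update a region's mean intensity as one pixel enters
    (entering=True) or leaves (entering=False) the region.

    Returns (new_mean, new_count). Keeping means current this way lets the
    force be evaluated along the interface with no full-image pass.
    """
    if entering:
        count += 1
        mean += (value - mean) / count
    else:
        count -= 1
        if count == 0:
            return 0.0, 0            # region emptied; mean is undefined
        mean -= (value - mean) / count
    return mean, count
```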

Full density separation
- Characterize both regions by their full intensity distribution
- Minimize the similarity between regions, judged via the Bhattacharyya measure (Zhang and Freedman 2003; Rathi et al. 2006)
- Energy:

  E = d^2(p, q) = \int \sqrt{p(z)\,q(z)} \, dz

- Simplified gradient:

  \nabla E = \frac{d^2(p,q)}{2} \left( \frac{1}{A_p} - \frac{1}{A_q} \right) + \frac{\delta(\phi)}{2} \int \left( \frac{1}{A_q} \sqrt{\frac{p(z)}{q(z)}} - \frac{1}{A_p} \sqrt{\frac{q(z)}{p(z)}} \right) dz
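In practice the densities are discrete intensity histograms, so the energy integral becomes a sum over bins. A minimal sketch (the function name is illustrative; inputs are assumed to be normalized histograms of equal length):

```python
import math

def bhattacharyya(p, q):
    """Bhattacharyya coefficient between two discrete intensity
    distributions: the energy E above with the integral replaced by a
    sum over histogram bins. Identical distributions give 1.0,
    non-overlapping ones give 0.0.
    """
    return sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))
```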

Extensions
- Smoothing (Gibou and Fedkiw 2005; Shi and Karl 2005): alternate evolution and smoothing via a Gaussian kernel
- Single phase for faster evolution, at the cost of jagged edges
- Monotonic front propagation

Speeds depend on initialization and image size
- Benchmarked at speeds on the order of milliseconds per iteration
- Convergence often in 10 or fewer iterations due to unit propagation

Mean intensity (result figures)

Full density (result figures)

Approximation (result figures)

Volumetric (result figures)

Conclusion
- Discrete approximation of the signed distance function
- Simple curve mechanics
- High performance on uniprocessors

Questions?


More information

NONLINEAR CLASSIFICATION AND REGRESSION. J. Elder CSE 4404/5327 Introduction to Machine Learning and Pattern Recognition

NONLINEAR CLASSIFICATION AND REGRESSION. J. Elder CSE 4404/5327 Introduction to Machine Learning and Pattern Recognition NONLINEAR CLASSIFICATION AND REGRESSION Nonlinear Classification and Regression: Outline 2 Multi-Layer Perceptrons The Back-Propagation Learning Algorithm Generalized Linear Models Radial Basis Function

More information

Numerical Data Fitting in Dynamical Systems

Numerical Data Fitting in Dynamical Systems Numerical Data Fitting in Dynamical Systems A Practical Introduction with Applications and Software by Klaus Schittkowski Department of Mathematics, University of Bayreuth, Bayreuth, Germany * * KLUWER

More information

The Fast Sweeping Method. Hongkai Zhao. Department of Mathematics University of California, Irvine

The Fast Sweeping Method. Hongkai Zhao. Department of Mathematics University of California, Irvine The Fast Sweeping Method Hongkai Zhao Department of Mathematics University of California, Irvine Highlights of fast sweeping mehtod Simple (A Gauss-Seidel type of iterative method). Optimal complexity.

More information

Motion Estimation (I) Ce Liu Microsoft Research New England

Motion Estimation (I) Ce Liu Microsoft Research New England Motion Estimation (I) Ce Liu celiu@microsoft.com Microsoft Research New England We live in a moving world Perceiving, understanding and predicting motion is an important part of our daily lives Motion

More information

CS 6501: Deep Learning for Computer Graphics. Basics of Neural Networks. Connelly Barnes

CS 6501: Deep Learning for Computer Graphics. Basics of Neural Networks. Connelly Barnes CS 6501: Deep Learning for Computer Graphics Basics of Neural Networks Connelly Barnes Overview Simple neural networks Perceptron Feedforward neural networks Multilayer perceptron and properties Autoencoders

More information

Computer Vision Group Prof. Daniel Cremers. 11. Sampling Methods: Markov Chain Monte Carlo

Computer Vision Group Prof. Daniel Cremers. 11. Sampling Methods: Markov Chain Monte Carlo Group Prof. Daniel Cremers 11. Sampling Methods: Markov Chain Monte Carlo Markov Chain Monte Carlo In high-dimensional spaces, rejection sampling and importance sampling are very inefficient An alternative

More information

Parallelizing large scale time domain electromagnetic inverse problem

Parallelizing large scale time domain electromagnetic inverse problem Parallelizing large scale time domain electromagnetic inverse problems Eldad Haber with: D. Oldenburg & R. Shekhtman + Emory University, Atlanta, GA + The University of British Columbia, Vancouver, BC,

More information

Final Exam, Machine Learning, Spring 2009

Final Exam, Machine Learning, Spring 2009 Name: Andrew ID: Final Exam, 10701 Machine Learning, Spring 2009 - The exam is open-book, open-notes, no electronics other than calculators. - The maximum possible score on this exam is 100. You have 3

More information

ECE 471/571 - Lecture 17. Types of NN. History. Back Propagation. Recurrent (feedback during operation) Feedforward

ECE 471/571 - Lecture 17. Types of NN. History. Back Propagation. Recurrent (feedback during operation) Feedforward ECE 47/57 - Lecture 7 Back Propagation Types of NN Recurrent (feedback during operation) n Hopfield n Kohonen n Associative memory Feedforward n No feedback during operation or testing (only during determination

More information

Generalized Newton-Type Method for Energy Formulations in Image Processing

Generalized Newton-Type Method for Energy Formulations in Image Processing Generalized Newton-Type Method for Energy Formulations in Image Processing Leah Bar and Guillermo Sapiro Department of Electrical and Computer Engineering University of Minnesota Outline Optimization in

More information

Chapter 8: Generalization and Function Approximation

Chapter 8: Generalization and Function Approximation Chapter 8: Generalization and Function Approximation Objectives of this chapter: Look at how experience with a limited part of the state set be used to produce good behavior over a much larger part. Overview

More information

Computer Vision Group Prof. Daniel Cremers. 10a. Markov Chain Monte Carlo

Computer Vision Group Prof. Daniel Cremers. 10a. Markov Chain Monte Carlo Group Prof. Daniel Cremers 10a. Markov Chain Monte Carlo Markov Chain Monte Carlo In high-dimensional spaces, rejection sampling and importance sampling are very inefficient An alternative is Markov Chain

More information

Outline Introduction Edge Detection A t c i ti ve C Contours

Outline Introduction Edge Detection A t c i ti ve C Contours Edge Detection and Active Contours Elsa Angelini Department TSI, Telecom ParisTech elsa.angelini@telecom-paristech.fr 2008 Outline Introduction Edge Detection Active Contours Introduction The segmentation

More information

Bayesian decision theory Introduction to Pattern Recognition. Lectures 4 and 5: Bayesian decision theory

Bayesian decision theory Introduction to Pattern Recognition. Lectures 4 and 5: Bayesian decision theory Bayesian decision theory 8001652 Introduction to Pattern Recognition. Lectures 4 and 5: Bayesian decision theory Jussi Tohka jussi.tohka@tut.fi Institute of Signal Processing Tampere University of Technology

More information

BOTTLENOSE dolphins, Tursiops truncatus, and African

BOTTLENOSE dolphins, Tursiops truncatus, and African JOURNAL OF L A TEX CLASS FILES, VOL., NO. 1, JANUARY 7 1 Efficient Foraging Strategies in Multi-Agent Systems through Curve Evolutions Musad Haque, Amir Rahmani, Magnus Egerstedt, and Anthony Yezzi Abstract

More information

Numerical Learning Algorithms

Numerical Learning Algorithms Numerical Learning Algorithms Example SVM for Separable Examples.......................... Example SVM for Nonseparable Examples....................... 4 Example Gaussian Kernel SVM...............................

More information

Introduction Dual Representations Kernel Design RBF Linear Reg. GP Regression GP Classification Summary. Kernel Methods. Henrik I Christensen

Introduction Dual Representations Kernel Design RBF Linear Reg. GP Regression GP Classification Summary. Kernel Methods. Henrik I Christensen Kernel Methods Henrik I Christensen Robotics & Intelligent Machines @ GT Georgia Institute of Technology, Atlanta, GA 30332-0280 hic@cc.gatech.edu Henrik I Christensen (RIM@GT) Kernel Methods 1 / 37 Outline

More information

11 More Regression; Newton s Method; ROC Curves

11 More Regression; Newton s Method; ROC Curves More Regression; Newton s Method; ROC Curves 59 11 More Regression; Newton s Method; ROC Curves LEAST-SQUARES POLYNOMIAL REGRESSION Replace each X i with feature vector e.g. (X i ) = [X 2 i1 X i1 X i2

More information

Results: MCMC Dancers, q=10, n=500

Results: MCMC Dancers, q=10, n=500 Motivation Sampling Methods for Bayesian Inference How to track many INTERACTING targets? A Tutorial Frank Dellaert Results: MCMC Dancers, q=10, n=500 1 Probabilistic Topological Maps Results Real-Time

More information

Computational statistics

Computational statistics Computational statistics Lecture 3: Neural networks Thierry Denœux 5 March, 2016 Neural networks A class of learning methods that was developed separately in different fields statistics and artificial

More information

Artificial Neural Networks

Artificial Neural Networks Artificial Neural Networks Stephan Dreiseitl University of Applied Sciences Upper Austria at Hagenberg Harvard-MIT Division of Health Sciences and Technology HST.951J: Medical Decision Support Knowledge

More information

9 Classification. 9.1 Linear Classifiers

9 Classification. 9.1 Linear Classifiers 9 Classification This topic returns to prediction. Unlike linear regression where we were predicting a numeric value, in this case we are predicting a class: winner or loser, yes or no, rich or poor, positive

More information

Advection / Hyperbolic PDEs. PHY 604: Computational Methods in Physics and Astrophysics II

Advection / Hyperbolic PDEs. PHY 604: Computational Methods in Physics and Astrophysics II Advection / Hyperbolic PDEs Notes In addition to the slides and code examples, my notes on PDEs with the finite-volume method are up online: https://github.com/open-astrophysics-bookshelf/numerical_exercises

More information

CBE495 LECTURE IV MODEL PREDICTIVE CONTROL

CBE495 LECTURE IV MODEL PREDICTIVE CONTROL What is Model Predictive Control (MPC)? CBE495 LECTURE IV MODEL PREDICTIVE CONTROL Professor Dae Ryook Yang Fall 2013 Dept. of Chemical and Biological Engineering Korea University * Some parts are from

More information

Lecture 25: November 27

Lecture 25: November 27 10-725: Optimization Fall 2012 Lecture 25: November 27 Lecturer: Ryan Tibshirani Scribes: Matt Wytock, Supreeth Achar Note: LaTeX template courtesy of UC Berkeley EECS dept. Disclaimer: These notes have

More information

Supplementary Figure 1: Scheme of the RFT. (a) At first, we separate two quadratures of the field (denoted by and ); (b) then, each quadrature

Supplementary Figure 1: Scheme of the RFT. (a) At first, we separate two quadratures of the field (denoted by and ); (b) then, each quadrature Supplementary Figure 1: Scheme of the RFT. (a At first, we separate two quadratures of the field (denoted by and ; (b then, each quadrature undergoes a nonlinear transformation, which results in the sine

More information

Multi-layer Neural Networks

Multi-layer Neural Networks Multi-layer Neural Networks Steve Renals Informatics 2B Learning and Data Lecture 13 8 March 2011 Informatics 2B: Learning and Data Lecture 13 Multi-layer Neural Networks 1 Overview Multi-layer neural

More information

Slide05 Haykin Chapter 5: Radial-Basis Function Networks

Slide05 Haykin Chapter 5: Radial-Basis Function Networks Slide5 Haykin Chapter 5: Radial-Basis Function Networks CPSC 636-6 Instructor: Yoonsuck Choe Spring Learning in MLP Supervised learning in multilayer perceptrons: Recursive technique of stochastic approximation,

More information

Dynamic System Identification using HDMR-Bayesian Technique

Dynamic System Identification using HDMR-Bayesian Technique Dynamic System Identification using HDMR-Bayesian Technique *Shereena O A 1) and Dr. B N Rao 2) 1), 2) Department of Civil Engineering, IIT Madras, Chennai 600036, Tamil Nadu, India 1) ce14d020@smail.iitm.ac.in

More information

Computer Intensive Methods in Mathematical Statistics

Computer Intensive Methods in Mathematical Statistics Computer Intensive Methods in Mathematical Statistics Department of mathematics johawes@kth.se Lecture 16 Advanced topics in computational statistics 18 May 2017 Computer Intensive Methods (1) Plan of

More information

Variational Inference (11/04/13)

Variational Inference (11/04/13) STA561: Probabilistic machine learning Variational Inference (11/04/13) Lecturer: Barbara Engelhardt Scribes: Matt Dickenson, Alireza Samany, Tracy Schifeling 1 Introduction In this lecture we will further

More information

Parallel-in-Time Optimization with the General-Purpose XBraid Package

Parallel-in-Time Optimization with the General-Purpose XBraid Package Parallel-in-Time Optimization with the General-Purpose XBraid Package Stefanie Günther, Jacob Schroder, Robert Falgout, Nicolas Gauger 7th Workshop on Parallel-in-Time methods Wednesday, May 3 rd, 2018

More information

Neural Networks, Computation Graphs. CMSC 470 Marine Carpuat

Neural Networks, Computation Graphs. CMSC 470 Marine Carpuat Neural Networks, Computation Graphs CMSC 470 Marine Carpuat Binary Classification with a Multi-layer Perceptron φ A = 1 φ site = 1 φ located = 1 φ Maizuru = 1 φ, = 2 φ in = 1 φ Kyoto = 1 φ priest = 0 φ

More information

Lecture 4: Perceptrons and Multilayer Perceptrons

Lecture 4: Perceptrons and Multilayer Perceptrons Lecture 4: Perceptrons and Multilayer Perceptrons Cognitive Systems II - Machine Learning SS 2005 Part I: Basic Approaches of Concept Learning Perceptrons, Artificial Neuronal Networks Lecture 4: Perceptrons

More information

(0, 0), (1, ), (2, ), (3, ), (4, ), (5, ), (6, ).

(0, 0), (1, ), (2, ), (3, ), (4, ), (5, ), (6, ). 1 Interpolation: The method of constructing new data points within the range of a finite set of known data points That is if (x i, y i ), i = 1, N are known, with y i the dependent variable and x i [x

More information

IN THE past two decades, active contour models (ACMs,

IN THE past two decades, active contour models (ACMs, 258 IEEE TRANSACTIONS ON IMAGE PROCESSING, VOL. 22, NO. 1, JANUARY 213 Reinitialization-Free Level Set Evolution via Reaction Diffusion Kaihua Zhang, Lei Zhang, Member, IEEE, Huihui Song, and David Zhang,

More information

Relative Merits of 4D-Var and Ensemble Kalman Filter

Relative Merits of 4D-Var and Ensemble Kalman Filter Relative Merits of 4D-Var and Ensemble Kalman Filter Andrew Lorenc Met Office, Exeter International summer school on Atmospheric and Oceanic Sciences (ISSAOS) "Atmospheric Data Assimilation". August 29

More information

Neural Network Training

Neural Network Training Neural Network Training Sargur Srihari Topics in Network Training 0. Neural network parameters Probabilistic problem formulation Specifying the activation and error functions for Regression Binary classification

More information

Dual Estimation and the Unscented Transformation

Dual Estimation and the Unscented Transformation Dual Estimation and the Unscented Transformation Eric A. Wan ericwan@ece.ogi.edu Rudolph van der Merwe rudmerwe@ece.ogi.edu Alex T. Nelson atnelson@ece.ogi.edu Oregon Graduate Institute of Science & Technology

More information

CSE 417T: Introduction to Machine Learning. Final Review. Henry Chai 12/4/18

CSE 417T: Introduction to Machine Learning. Final Review. Henry Chai 12/4/18 CSE 417T: Introduction to Machine Learning Final Review Henry Chai 12/4/18 Overfitting Overfitting is fitting the training data more than is warranted Fitting noise rather than signal 2 Estimating! "#$

More information

Gradient Sampling for Improved Action Selection and Path Synthesis

Gradient Sampling for Improved Action Selection and Path Synthesis Gradient Sampling for Improved Action Selection and Path Synthesis Ian M. Mitchell Department of Computer Science The University of British Columbia November 2016 mitchell@cs.ubc.ca http://www.cs.ubc.ca/~mitchell

More information

The particle swarm optimization algorithm: convergence analysis and parameter selection

The particle swarm optimization algorithm: convergence analysis and parameter selection Information Processing Letters 85 (2003) 317 325 www.elsevier.com/locate/ipl The particle swarm optimization algorithm: convergence analysis and parameter selection Ioan Cristian Trelea INA P-G, UMR Génie

More information

Gaussian with mean ( µ ) and standard deviation ( σ)

Gaussian with mean ( µ ) and standard deviation ( σ) Slide from Pieter Abbeel Gaussian with mean ( µ ) and standard deviation ( σ) 10/6/16 CSE-571: Robotics X ~ N( µ, σ ) Y ~ N( aµ + b, a σ ) Y = ax + b + + + + 1 1 1 1 1 1 1 1 1 1, ~ ) ( ) ( ), ( ~ ), (

More information

Regression with Numerical Optimization. Logistic

Regression with Numerical Optimization. Logistic CSG220 Machine Learning Fall 2008 Regression with Numerical Optimization. Logistic regression Regression with Numerical Optimization. Logistic regression based on a document by Andrew Ng October 3, 204

More information

Data Fitting and Uncertainty

Data Fitting and Uncertainty TiloStrutz Data Fitting and Uncertainty A practical introduction to weighted least squares and beyond With 124 figures, 23 tables and 71 test questions and examples VIEWEG+ TEUBNER IX Contents I Framework

More information

E190Q Lecture 10 Autonomous Robot Navigation

E190Q Lecture 10 Autonomous Robot Navigation E190Q Lecture 10 Autonomous Robot Navigation Instructor: Chris Clark Semester: Spring 2015 1 Figures courtesy of Siegwart & Nourbakhsh Kilobots 2 https://www.youtube.com/watch?v=2ialuwgafd0 Control Structures

More information

5.1 2D example 59 Figure 5.1: Parabolic velocity field in a straight two-dimensional pipe. Figure 5.2: Concentration on the input boundary of the pipe. The vertical axis corresponds to r 2 -coordinate,

More information

Neural networks. Chapter 20. Chapter 20 1

Neural networks. Chapter 20. Chapter 20 1 Neural networks Chapter 20 Chapter 20 1 Outline Brains Neural networks Perceptrons Multilayer networks Applications of neural networks Chapter 20 2 Brains 10 11 neurons of > 20 types, 10 14 synapses, 1ms

More information

Adaptive Primal Dual Optimization for Image Processing and Learning

Adaptive Primal Dual Optimization for Image Processing and Learning Adaptive Primal Dual Optimization for Image Processing and Learning Tom Goldstein Rice University tag7@rice.edu Ernie Esser University of British Columbia eesser@eos.ubc.ca Richard Baraniuk Rice University

More information

Shape Outlier Detection Using Pose Preserving Dynamic Shape Models

Shape Outlier Detection Using Pose Preserving Dynamic Shape Models Shape Outlier Detection Using Pose Preserving Dynamic Shape Models Chan-Su Lee and Ahmed Elgammal Rutgers, The State University of New Jersey Department of Computer Science Outline Introduction Shape Outlier

More information

Project 4: Navier-Stokes Solution to Driven Cavity and Channel Flow Conditions

Project 4: Navier-Stokes Solution to Driven Cavity and Channel Flow Conditions Project 4: Navier-Stokes Solution to Driven Cavity and Channel Flow Conditions R. S. Sellers MAE 5440, Computational Fluid Dynamics Utah State University, Department of Mechanical and Aerospace Engineering

More information

FINAL: CS 6375 (Machine Learning) Fall 2014

FINAL: CS 6375 (Machine Learning) Fall 2014 FINAL: CS 6375 (Machine Learning) Fall 2014 The exam is closed book. You are allowed a one-page cheat sheet. Answer the questions in the spaces provided on the question sheets. If you run out of room for

More information

Measurement of electric potential fields

Measurement of electric potential fields Measurement of electric potential fields Matthew Krupcale, Oliver Ernst Department of Physics, Case Western Reserve University, Cleveland Ohio, 44106-7079 18 November 2012 Abstract In electrostatics, Laplace

More information

Notes on Back Propagation in 4 Lines

Notes on Back Propagation in 4 Lines Notes on Back Propagation in 4 Lines Lili Mou moull12@sei.pku.edu.cn March, 2015 Congratulations! You are reading the clearest explanation of forward and backward propagation I have ever seen. In this

More information

Key words. Shape Derivatives, Topological Derivatives, Level Set Methods, Shape Optimization, Structural Design.

Key words. Shape Derivatives, Topological Derivatives, Level Set Methods, Shape Optimization, Structural Design. INCORPORATING TOPOLOGICAL DERIVATIVES INTO SHAPE DERIVATIVES BASED LEVEL SET METHODS LIN HE, CHIU-YEN KAO, AND STANLEY OSHER Abstract. Shape derivatives and topological derivatives have been incorporated

More information

Midterm: CS 6375 Spring 2015 Solutions

Midterm: CS 6375 Spring 2015 Solutions Midterm: CS 6375 Spring 2015 Solutions The exam is closed book. You are allowed a one-page cheat sheet. Answer the questions in the spaces provided on the question sheets. If you run out of room for an

More information

Kalman Filter. Predict: Update: x k k 1 = F k x k 1 k 1 + B k u k P k k 1 = F k P k 1 k 1 F T k + Q

Kalman Filter. Predict: Update: x k k 1 = F k x k 1 k 1 + B k u k P k k 1 = F k P k 1 k 1 F T k + Q Kalman Filter Kalman Filter Predict: x k k 1 = F k x k 1 k 1 + B k u k P k k 1 = F k P k 1 k 1 F T k + Q Update: K = P k k 1 Hk T (H k P k k 1 Hk T + R) 1 x k k = x k k 1 + K(z k H k x k k 1 ) P k k =(I

More information

Combining multiple surrogate models to accelerate failure probability estimation with expensive high-fidelity models

Combining multiple surrogate models to accelerate failure probability estimation with expensive high-fidelity models Combining multiple surrogate models to accelerate failure probability estimation with expensive high-fidelity models Benjamin Peherstorfer a,, Boris Kramer a, Karen Willcox a a Department of Aeronautics

More information

RECURRENT NETWORKS I. Philipp Krähenbühl

RECURRENT NETWORKS I. Philipp Krähenbühl RECURRENT NETWORKS I Philipp Krähenbühl RECAP: CLASSIFICATION conv 1 conv 2 conv 3 conv 4 1 2 tu RECAP: SEGMENTATION conv 1 conv 2 conv 3 conv 4 RECAP: DETECTION conv 1 conv 2 conv 3 conv 4 RECAP: GENERATION

More information

Iterative Timing Recovery

Iterative Timing Recovery Iterative Timing Recovery John R. Barry School of Electrical and Computer Engineering, Georgia Tech Atlanta, Georgia U.S.A. barry@ece.gatech.edu 0 Outline Timing Recovery Tutorial Problem statement TED:

More information