Distributed Consensus Algorithms on Manifolds. Roberto Tron, Bijan Afsari, René Vidal Johns Hopkins University


1 Distributed Consensus Algorithms on Manifolds Roberto Tron, Bijan Afsari, René Vidal Johns Hopkins University

2 Distributed algorithms in controls: Motion coordination, Formation control, Collaborative mapping, Network optimization

3 Consensus algorithms: Global agreement from local interactions

4

5 New Settings

6 Cameras Today: 24 hrs/min; 30 million cameras; 4 bn hrs/week. Tasks: Detection, Segmentation, Motion Estimation, Stabilization, Registration, Categorization. Application Areas / End Users: National Security, Governments, Recreation, Consumer

7 Camera Networks. Classical approach: one camera wired to one computer; multiple cameras wired to a central processing unit; processing done by a human operator or by a central computer. Problems: Flexibility: wiring makes it hard to deploy new cameras. Robustness: central node failure = entire system failure. Scaling: processing and bandwidth requirements do not scale well

8 Camera Sensor Networks. Motes: small, wireless devices; battery powered; limited memory and computing power. Applications: Surveillance, Environmental monitoring, Smart homes. Question: Can we deploy vision algorithms on camera sensor networks?

9 Challenges to Computer Vision Algorithms. Traditional computer vision algorithms: existing algorithms are centralized: all images are sent to one node for processing. Sensor networks have limited resources: limited processing power and memory, slow wireless channel, nodes can have limited communication range. Challenge: Traditional computer vision algorithms require resources not available in a camera sensor network

10 Challenges to Sensor Network Algorithms. Traditional distributed algorithms for sensor networks: existing algorithms have been designed for processing simple scalar measurements. In computer vision applications: measurements (images) are high-dimensional; measurements are corrupted by noise and outliers; estimates are non-Euclidean (e.g. rotations). Challenge: Traditional sensor network algorithms cannot be directly used for computer vision applications

11 Toward Distributed Computer Vision Algorithms. Centralized computer vision algorithms handle problems of considerable complexity; distributed sensor network algorithms handle only simple problems. Our goal: develop distributed computer vision algorithms that are efficient (local processing + short communications), converge to the centralized solution, and can handle outliers, packet losses, and data on manifolds

12 Consensus on manifolds example: Data on non-linear manifolds! [figure: coordinate frames on a manifold]

13 Previous work. Specific Manifolds: Sphere [Olfati-Saber 2006]; N-Torus [Kuramoto model, 1975], [Vicsek model, 1995], [Scardovi, Sepulchre 2007]. Extrinsic approach: Embedding + Projections [Sarlette, Sepulchre 2009], [Hatanaka, Bullo 2010], [Igarashi, Fujita 2010]. Coordination: Lie groups [Sarlette, Sepulchre 2010]; Rigid motions [Sarlette, Sepulchre 2009], [Thunberg, Hu 2011], [Bai, Wen], [Igarashi, Spong 2012], [Smith, Leonard 2001]. Localization: Planar case [Piovan, Bullo 2008]; Centralized [Shirmohammadi, Taylor 2010]

14 Our work: 1. Consensus on any Riemannian manifold with bounded curvature. 2. Distributed image-based localization from images. Distributed optimization

15 Outline: Introduction; Consensus in Euclidean spaces; Riemannian geometry

16 Riemannian consensus: Choice of step size; Local convergence results; Almost-global convergence on SO(3)

17 Image-based camera network localization: Setup, cost function; Link with Riemannian Consensus; Results

18 Review of Euclidean consensus

19 Notation: Graph, Measurements, States

20 Optimization cost function

21 Optimization cost function

22 Algorithm = Gradient Descent

23 Algorithm = Gradient Descent

24 How to choose the step size: ε < 2 / (Δ μ_max(ϕ)), where Δ = max # of neighbors and μ_max(ϕ) = bound on the max eigenvalue of the Hessian of ϕ
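A minimal numeric sketch of this step-size rule follows, on an assumed small graph and assuming μ_max(ϕ) = 2 for the squared-distance potential ϕ = d² (both are illustrative assumptions, not values from the slides).

```python
# Step-size bound sketch for Euclidean consensus on an assumed 4-node graph.
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]      # illustrative communication graph
degrees = np.zeros(4)
for i, j in edges:
    degrees[i] += 1
    degrees[j] += 1

Delta = degrees.max()        # max number of neighbors
mu_max = 2.0                 # assumed bound on the largest Hessian eigenvalue of phi = d^2
eps_bound = 2.0 / (Delta * mu_max)
print(Delta, eps_bound)      # Delta = 3, so any step size below 1/3 is admissible here
```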

25 Gradient Descent

26 Gradient Descent

27 Gradient Descent

28 Gradient Descent

29 Gradient Descent

30 Gradient Descent: mean{x_i(t)} = mean{x_i(t+1)} (the mean of the states is preserved at every iteration)

31 Convergence to the mean [plot: node states x vs. iterations]
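To make the Euclidean algorithm concrete, here is a minimal sketch that runs the update x_i(t+1) = x_i(t) − ε Σ_{j∈N_i} (x_i(t) − x_j(t)); the graph, the random initial states, and the conservative step size are illustrative choices, not taken from the slides.

```python
# Euclidean consensus sketch (illustrative graph and states, not from the slides).
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # assumed undirected communication graph
N, d = 4, 3                                        # four nodes with states in R^3
neighbors = {i: [] for i in range(N)}
for a, b in edges:
    neighbors[a].append(b)
    neighbors[b].append(a)

rng = np.random.default_rng(0)
x = rng.normal(size=(N, d))                        # initial states (the measurements)
mean0 = x.mean(axis=0)

deg_max = max(len(v) for v in neighbors.values())
eps = 1.0 / (2 * deg_max)                          # conservative step, within the 2/(Delta*mu_max) rule

for _ in range(200):
    grad = np.array([sum(x[i] - x[j] for j in neighbors[i]) for i in range(N)])
    x = x - eps * grad                             # each node only talks to its neighbors

print(np.allclose(x, mean0, atol=1e-6))            # True: mean preserved, states converge to it
```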

32 Review of Riemannian geometry

33 Notation

34 Squared geodesic distance d²(·,·)

35 Riemannian Consensus

36 Notation: Graph, Measurements, States on the manifold

37 Optimization cost function

38 Optimization cost function: Not convex! Consensus configurations are global minima

39 Optimization cost function

40 Gradient descent

41 Gradient descent

42 How to choose the step size. Theorem: ε < 2 / (Δ μ_max(ϕ)), where Δ = max # of neighbors and μ_max(ϕ) = bound on the max eigenvalue of the Hessian of ϕ. Tron, Afsari, Vidal - Riemannian Consensus for Manifolds with Bounded Curvature, to appear in Transactions on Automatic Control, 2012

43 For spaces of non-negative constant curvature the step size does not depend on the states; on all other spaces it might depend on the max distance between neighboring states

44 Gradient Descent

45 Gradient Descent

46 Gradient Descent

47 Gradient Descent
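As an illustration of the intrinsic update, the sketch below runs Riemannian consensus on the unit sphere S², replacing the Euclidean difference with the Riemannian logarithm and the straight-line step with the exponential map; the graph, the clustered initial states, and the step size are assumptions for the example, not values from the slides.

```python
# Riemannian consensus sketch on the unit sphere S^2 (illustrative data, not from the slides).
import numpy as np

def log_map(x, y):
    """Riemannian log of y at x on the sphere: tangent vector of length d(x, y)."""
    v = y - np.dot(x, y) * x                            # project y onto the tangent space at x
    nv = np.linalg.norm(v)
    theta = np.arccos(np.clip(np.dot(x, y), -1.0, 1.0))
    return np.zeros_like(x) if nv < 1e-12 else (theta / nv) * v

def exp_map(x, v):
    """Riemannian exp of tangent vector v at x on the sphere."""
    t = np.linalg.norm(v)
    return x if t < 1e-12 else np.cos(t) * x + np.sin(t) * (v / t)

nbrs = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}     # assumed 2-regular ring of 4 nodes
rng = np.random.default_rng(1)
base = np.array([0.0, 0.0, 1.0])
x = base + 0.3 * rng.normal(size=(4, 3))                # measurements not too dispersed
x /= np.linalg.norm(x, axis=1, keepdims=True)           # back onto S^2

eps = 0.1                                               # assumed step size
for _ in range(300):
    # negative Riemannian gradient of (1/2) * sum of squared geodesic distances to neighbors
    steps = [eps * sum(log_map(x[i], x[j]) for j in nbrs[i]) for i in range(4)]
    x = np.array([exp_map(x[i], steps[i]) for i in range(4)])

# Residual geodesic distances between neighboring states: ~0 at consensus.
print(max(np.arccos(np.clip(np.dot(x[i], x[j]), -1, 1)) for i in range(4) for j in nbrs[i]))
```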

48 Convergence Theorem: If the measurements are not too dispersed and the step size is small enough, then convergence to consensus configurations. Tron, Afsari, Vidal - Riemannian Consensus for Manifolds with Bounded Curvature, to appear in Transactions on Automatic Control, 2012

49 1. Given a set containing all global minimizers, S = {(x_1, ..., x_N) ∈ M^N : ∃ x_0 ∈ M s.t. max_i d_M(x_i, x_0) < r} ⊂ M^N, with r = ½ min{inj M, π/√κ} (κ = upper bound on the sectional curvature). 2. Guarantee the iterates do not leave this set. Tron, Afsari, Vidal - Average Consensus on Riemannian Manifolds with Bounded Curvature, IEEE Conference on Decision and Control 2011

50 Convergence Theorem: For spaces of constant, non-negative curvature, the states converge to a point inside the convex hull; the point need not be the global Fréchet mean. Tron, Afsari, Vidal - Riemannian Consensus for Manifolds with Bounded Curvature, to appear in Transactions on Automatic Control, 2012

51 Convergence. General result: if the initial measurements are inside S_conv and the step size is small enough, then Riemannian Consensus converges to a set in S_conv. Constant non-negative curvature: the convergence set is enlarged to S_r; convergence is to a single consensus state instead of to a set; the consensus state is shown to lie in the convex hull of the initial measurements

52 Experiments: Sphere(7), N=15 nodes, 4-regular graph

53 Experiments: SO(7), N=15 nodes, 4-regular graph

54 Experiments: SO(3), closed geodesic, N=15 nodes, 4-regular graph

55 Local minimum [figure: states x1, x2, x3, x4]

56 Squared geodesic distance d²

57 Reshaped distance function. Tron, Afsari, Vidal - Intrinsic consensus on SO(3) with almost-global convergence, to appear in Conference on Decision and Control 2012

58 Reshaped cost function f. Tron, Afsari, Vidal - Intrinsic consensus on SO(3) with almost-global convergence, to appear in Conference on Decision and Control 2012

59 Saddle point [figure: states x1, x2, x3, x4]

60

61 Almost-global convergence Theorem: On SO(3), the only set of stable equilibria is the set of global minimizers. Tron, Afsari, Vidal - Intrinsic consensus on SO(3) with almost-global convergence, to appear in Conference on Decision and Control 2012
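For intuition, here is a minimal sketch of the intrinsic consensus iteration on SO(3), using SciPy's Rotation class for the exponential and logarithm (rotation-vector) maps; the ring graph, the clustered random rotations, and the step size are illustrative assumptions, and only the plain squared-distance update is shown, not the reshaped cost behind the almost-global result.

```python
# Intrinsic consensus sketch on SO(3) (illustrative graph and rotations, not from the slides).
import numpy as np
from scipy.spatial.transform import Rotation as R

nbrs = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}     # assumed 2-regular ring of 4 nodes
rng = np.random.default_rng(2)
# Rotations clustered around the identity, so the local convergence theorem applies.
rots = [R.from_rotvec(0.3 * rng.normal(size=3)) for _ in range(4)]

eps = 0.1                                               # assumed step size
for _ in range(200):
    new = []
    for i in range(4):
        # Tangent (rotation-vector) direction toward each neighbor: log(R_i^T R_j).
        v = sum((rots[i].inv() * rots[j]).as_rotvec() for j in nbrs[i])
        new.append(rots[i] * R.from_rotvec(eps * v))    # geodesic step on SO(3)
    rots = new

# Residual geodesic distances (rotation angles) between neighbors: ~0 at consensus.
print(max((rots[i].inv() * rots[j]).magnitude() for i in range(4) for j in nbrs[i]))
```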

62 Experiments: Squared distance, N=15 nodes, 4-regular graph, closed geodesic

63 Experiments: Reshaped distance

64 Average number of iterations for a K-regular network with N nodes [table: N, K, Avg. iterations]

65 Image-based camera network localization

66 [Figure: camera network setup with local coordinate frames]

67 Vision graph A B C

68 Input: relative transformations g̃_ij

69 Output: camera poses g_i = (R_i, t_i) [figure: Camera i and Camera j with their coordinate frames]

70 min_{g_ij} Σ_{(i,j)∈E} f(d_SE(3)(g_ij, g̃_ij))

71 Challenge 1: Inconsistent transformations: the estimated g_12, g_23, g_31 need not compose to the identity around a cycle

72 Optimization problem: min_{g_ij} Σ_{(i,j)∈E} f(d_SE(3)(g_ij, g̃_ij)) s.t. {g_ij} are consistent

73 Pose reparametrization. Theorem: {g_ij} are consistent if and only if g_ij = g_i^{-1} g_j for some node poses {g_i}. Tron, Vidal - Distributed image-based 3-D localization in camera sensor networks, Conference on Decision and Control 2009
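A quick numeric sketch of Challenge 1 and of this reparametrization: relative poses built as g_ij = g_i^{-1} g_j compose to the identity around a cycle, while perturbed (noisy) measurements generally do not. The 4x4 homogeneous-matrix representation, the poses, and the noise model are illustrative assumptions.

```python
# Consistency of relative poses around the cycle 1 -> 2 -> 3 -> 1 (illustrative example).
import numpy as np
from scipy.spatial.transform import Rotation as R

def make_pose(rotvec, t):
    """4x4 homogeneous matrix for g = (R, t)."""
    g = np.eye(4)
    g[:3, :3] = R.from_rotvec(rotvec).as_matrix()
    g[:3, 3] = t
    return g

rng = np.random.default_rng(3)
# Assumed ground-truth absolute poses g_1, g_2, g_3.
g = [make_pose(rng.normal(size=3), rng.normal(size=3)) for _ in range(3)]

def rel(i, j, noise=0.0):
    """Relative pose g_ij = g_i^{-1} g_j, optionally perturbed (simulated measurement noise)."""
    g_ij = np.linalg.inv(g[i]) @ g[j]
    return g_ij @ make_pose(noise * rng.normal(size=3), noise * rng.normal(size=3))

for noise in (0.0, 0.05):
    g12, g23, g31 = rel(0, 1, noise), rel(1, 2, noise), rel(2, 0, noise)
    cycle = g12 @ g23 @ g31                          # identity if and only if consistent
    print(noise, np.linalg.norm(cycle - np.eye(4)))  # ~0 without noise, nonzero with noise
```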

74 Optimization problem: min_{g_ij} Σ_{(i,j)∈E} f(d_SE(3)(g_ij, g̃_ij))

75 Optimization problem: min_{g_i} Σ_{(i,j)∈E} f(d_SE(3)(g_i^{-1} g_j, g̃_ij))

76 Optimization problem

77 Optimization problem

78 Challenge 2: Scale ambiguity [figure: two camera configurations differing only by a global scale]

79 Fix the shortest edge length to 1 unit [figure]
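A tiny sketch of this normalization on assumed camera centers: divide all translations by the shortest baseline so that that edge has length 1.

```python
# Removing the global scale ambiguity by fixing the shortest edge to 1 unit (illustrative data).
import numpy as np

edges = [(0, 1), (1, 2), (2, 0)]                     # assumed vision graph
t = np.array([[0.0, 0.0, 0.0],                       # assumed camera centers (known up to scale)
              [2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0]])

lengths = [np.linalg.norm(t[i] - t[j]) for i, j in edges]
t = t / min(lengths)                                 # shortest baseline now has length 1
print(min(np.linalg.norm(t[i] - t[j]) for i, j in edges))   # 1.0
```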

80 Optimization problem: How to get the global minimizer? Tron, Vidal - Distributed Image-Based 3-D Localization of Camera Sensor Networks, Conference on Decision and Control 2009

81 Initialization of rotations

82 Initialization of translations

83 Complete optimization: min_{R_i, t_i, λ_ij} ϕ({R_i, t_i, λ_ij}) s.t. λ_ij ≥ 1
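One way to handle the constraint λ_ij ≥ 1 is to project the scales back onto the feasible set after each gradient step. The sketch below shows only that projection mechanism on a placeholder quadratic cost with assumed targets; it is not the localization cost from the slides.

```python
# Projected-gradient sketch for the constraint lambda_ij >= 1 (placeholder cost, not the slides' cost).
import numpy as np

rng = np.random.default_rng(4)
target = rng.uniform(0.5, 2.0, size=5)      # assumed unconstrained minimizers, one scale per edge
lam = np.ones(5)                            # scales lambda_ij, initialized at 1

eps = 0.2
for _ in range(100):
    grad = lam - target                     # gradient of the placeholder cost 0.5 * ||lam - target||^2
    lam = np.maximum(lam - eps * grad, 1.0) # gradient step, then project onto {lambda >= 1}

print(lam)                                  # entries whose target is below 1 stay clamped at 1
```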

84 Gradient descent [figure: camera frames]

85 How to choose the step size. Theorem: ε < 2 / (Δ μ_max(ϕ)), where Δ = max # of neighbors and μ_max(ϕ) = bound on the max eigenvalue of the Hessian of ϕ

86 Noiseless case. Theorem: In the noiseless case, the problem is equivalent to Riemannian consensus

87 Noiseless case. Theorem: In the noiseless case, the cost is non-convex, but has no local minima!

88 Experiments on synthetic data. Experiment: 8 cameras, 30 scene points, 4-regular graph. 1. Camera poses and points 2. Images + Noise 3. Relative poses 4. Localization

89 Experiments on synthetic data [tables: rotation errors (degrees), geometric variance of scale ratios, and translation direction errors (degrees), for noise levels of 0, 1, and 3 pixels, initial vs. final]

90 Experiments on real data [images with features used by Bundler to compute the localization]

91 Experiments on real data (cont.). Snavely, Seitz, Szeliski - Photo Tourism: Exploring image collections in 3D, ACM Transactions on Graphics

92 Experiments on real data

93 Experiments on real data: Rotation errors per edge; Translation direction errors per edge

94 Conclusion. Riemannian Consensus Theory: sufficient conditions for convergence on any Riemannian manifold; almost-global convergence on SO(3). Applications: image-based camera network localization

95 Future work: Convergence rate; Incorporate uncertainties; Dynamic case: time-varying measurements, pose + velocity; Improved localization

96 Thanks!
