Wavelets and Filter Banks on Graphs
Wavelets and Filter Banks on Graphs
Pierre Vandergheynst, Signal Processing Lab, EPFL
Joint work with David Shuman
Duke Workshop on Sensing and Analysis of High-Dimensional Data, Duke University, July 2011
Processing Signals on Graphs
Social networks, electrical networks, neuronal networks, transportation networks
Short outline
- Summary of one wavelet construction on graphs: multiscale, filtering
- Pyramidal algorithms: polyphase components and downsampling; the Laplacian pyramid; 2-channel, critically sampled filter banks?
Spectral Graph Wavelets
G = (V, E) a weighted undirected graph, with Laplacian L = D − A.
Dilation operates through the operator T_t^g = g(tL).
Translation (localization): define ψ_{t,j} = T_t^g δ_j, the response to a delta at vertex j:
ψ_{t,j}(i) = Σ_{ℓ=0}^{N−1} g(tλ_ℓ) φ_ℓ(j) φ_ℓ(i),  where L φ_ℓ = λ_ℓ φ_ℓ
(compare the Euclidean case ψ_{t,a}(u) = ∫_R dω ψ̂(tω) e^{−jωa} e^{jωu}).
And so formally define the graph wavelet coefficients of f:
W_f(t, j) = ⟨ψ_{t,j}, f⟩ = (T_t^g f)(j) = Σ_{ℓ=0}^{N−1} g(tλ_ℓ) f̂(ℓ) φ_ℓ(j)
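On a small graph, the coefficients above can be computed directly from the eigendecomposition. A minimal sketch in Python/NumPy; the path graph and the kernel g(x) = x·exp(−x) are illustrative choices, not the talk's specific ones:

```python
import numpy as np

# Small weighted path graph: Laplacian L = D - A
N = 6
A = np.zeros((N, N))
for i in range(N - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

# Eigendecomposition L = Phi diag(lam) Phi^T (L is real symmetric)
lam, Phi = np.linalg.eigh(L)

def g(x):
    # example admissible kernel: g(0) = 0, band-pass shaped
    return x * np.exp(-x)

def wavelet_coeffs(f, t):
    # W_f(t, j) = sum_l g(t * lam_l) * fhat(l) * phi_l(j)
    fhat = Phi.T @ f                     # graph Fourier transform
    return Phi @ (g(t * lam) * fhat)

f = np.random.default_rng(0).standard_normal(N)
W = wavelet_coeffs(f, t=2.0)

# Same computation in operator form: T_t^g = g(tL) applied to f
Tg = Phi @ np.diag(g(2.0 * lam)) @ Phi.T
print("operator form agrees:", np.allclose(W, Tg @ f))
```

This is the brute-force O(N³) route; the Chebyshev approximation discussed later avoids the eigendecomposition entirely.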
Frames
Given A, B > 0 and a scaling-function kernel h : R₊ → R₊, the scaling functions and wavelets form a frame when
0 < A ≤ h²(u) + Σ_s g²(t_s u) ≤ B < ∞
with scaling functions φ_n = T_h δ_n = h(L) δ_n.
A simple way to get a tight frame, for any admissible kernel g: define
γ(λ) = ( ∫_λ^∞ g²(t) dt/t )^{1/2}
and use the kernel g̃(λ) = (γ²(λ) − γ²(2λ))^{1/2}, so that over dyadic scales the sum Σ_s g̃²(2^s λ) telescopes to a constant.
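The frame bounds can be checked numerically by evaluating G(u) = h²(u) + Σ_s g²(t_s u) over the spectral range. A small sketch; the kernels h, g and the dyadic scale set below are example choices, not the talk's exact design:

```python
import numpy as np

# Frame-bound check: A <= h(u)^2 + sum_s g(t_s u)^2 <= B over the spectrum.
def h(x):              # scaling-function kernel, covers lambda near 0
    return np.exp(-x**2)

def g(x):              # wavelet kernel, band-pass shaped
    return x * np.exp(-x)

scales = 2.0 ** np.arange(-4, 5)      # t_s = 2^s, s = -4..4
u = np.linspace(1e-3, 10, 2000)       # grid over (0, lambda_max]

G = h(u)**2 + sum(g(t * u)**2 for t in scales)
A, B = G.min(), G.max()
print(f"frame bounds: A = {A:.3f}, B = {B:.3f}")
```

The closer B/A is to 1, the closer the design is to a tight frame.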
Scaling & Localization
(figures: ψ_{t,i}(j) at decreasing scale)
Example
(figures)
Sparsity and Smoothness on Graphs
(figure: scaling function coefficients)
Remark on Implementation
It is not necessary to compute the spectral decomposition for filtering. Polynomial approximation:
g(tω) ≈ Σ_{k=0}^{K−1} a_k(t) p_k(ω)  (e.g. Chebyshev, minimax)
This makes it possible to implement any Fourier multiplier. The wavelet operator is then expressed with powers of the Laplacian:
T_t^g ≈ Σ_{k=0}^{K−1} a_k(t) L^k
and the sparsity of the Laplacian is exploited in an iterative way.
Remark on Implementation
Approximating the filtering as W̃_f(t,j) = (p(L)f)(j) with sup-norm control of the kernel error (minimax or Chebyshev) gives
|W_f(t,j) − W̃_f(t,j)| ≤ B ‖f‖
The Chebyshev approximation at scale t_n is
W̃_f(t_n, j) = ( (1/2) c_{n,0} f + Σ_{k=1}^{M_n} c_{n,k} T̄_k(L) f )(j)
with the recurrence
T̄_k(L) f = (2/a_1)(L − a_2 I) T̄_{k−1}(L) f − T̄_{k−2}(L) f
The computational cost is dominated by matrix-vector multiplication with the (sparse) Laplacian, in particular O(Σ_{n=1}^J M_n |E|).
Note: same algorithm for the adjoint!
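The recurrence above can be sketched end to end with only sparse matrix-vector products. A hedged sketch assuming the standard shifted-Chebyshev setup with a_1 = a_2 = λ_max/2; the path graph, the spectral bound λ_max ≤ 4, and the kernel are illustrative:

```python
import numpy as np
from scipy import sparse

# Chebyshev approximation of g(L)f using only sparse mat-vecs with L.
N = 50
rng = np.random.default_rng(1)
A = sparse.diags([np.ones(N - 1), np.ones(N - 1)], [1, -1]).tocsr()  # path graph
L = sparse.diags(np.asarray(A.sum(axis=1)).ravel()) - A

lmax = 4.0                      # upper bound on the path-graph Laplacian spectrum
a1 = a2 = lmax / 2.0            # maps [0, lmax] onto [-1, 1]

def cheb_coeffs(kernel, M, n_pts=1000):
    # c_k for kernel(a1*cos(theta) + a2) expanded in cos(k*theta)
    theta = np.pi * (np.arange(n_pts) + 0.5) / n_pts
    vals = kernel(a1 * np.cos(theta) + a2)
    return np.array([2.0 / n_pts * (vals * np.cos(k * theta)).sum()
                     for k in range(M + 1)])

def cheb_apply(kernel, f, M=30):
    c = cheb_coeffs(kernel, M)
    T_prev, T_curr = f, (L @ f - a2 * f) / a1        # T0 f and T1 f
    out = 0.5 * c[0] * T_prev + c[1] * T_curr
    for k in range(2, M + 1):
        T_next = (2.0 / a1) * (L @ T_curr - a2 * T_curr) - T_prev
        out += c[k] * T_next
        T_prev, T_curr = T_curr, T_next
    return out

g = lambda x: x * np.exp(-x)                         # example kernel, t = 1
f = rng.standard_normal(N)
approx = cheb_apply(g, f)

# Sanity check against exact spectral filtering on this small graph
lam, Phi = np.linalg.eigh(L.toarray())
exact = Phi @ (g(lam) * (Phi.T @ f))
print("max error:", np.abs(approx - exact).max())
```

Each iteration touches only L once, which is where the O(M|E|) cost per scale comes from.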
Graph Wavelets
Redundancy breaks sparsity: can we remove some or all of it?
Faster algorithms:
- traditional wavelets have fast filter-bank implementations
- whatever the scale, you use the same filters
- here: large scales -> more computations
Goal: solve both problems at once
Basic Ingredients
Euclidean multiresolution is based on two main operations:
- Filtering (typically low-pass and high-pass)
- Down- and up-sampling
Filtering is fine, but how do we downsample on graphs?
Basic Ingredients
Subsampling is equivalent to splitting into two cosets (even, odd).
Questions:
- How do we partition a graph into meaningful cosets?
- Are there efficient algorithms for these partitions?
- Are there theoretical guarantees?
- How do we define a new graph from the cosets?
Cosets: A Spectral View
Subsampling is equivalent to splitting into two cosets (even, odd). Classically, selecting a coset can be interpreted easily in Fourier:
f_sub(i) = (1/2) f(i) (1 + cos(πi))
where cos(πi) = (−1)^i is the eigenvector of the largest eigenvalue.
Cosets and Nodal Domains
Nodal domain: maximally connected subgraph s.t. all vertices have the same sign w.r.t. a reference function. We would like to find a very large number of nodal domains, ideally |V|! Nodal domains of Laplacian eigenvectors are special (and well studied).
Theorem: the number of nodal domains associated to the eigenvector of the largest Laplacian eigenvalue of a connected graph is maximal, ν(φ_max) = ν(G) = |V|, IFF G is bipartite.
In general: ν(G) = |V| − χ(G) + 2 (extreme cases: bipartite and complete graphs).
Cosets and Nodal Domains
For any connected graph we will thus naturally define cosets and their associated selection functions:
V₊ = {i ∈ V s.t. φ_{N−1}(i) ≥ 0},  M₊(i) = sgn(φ_{N−1}(i))
V₋ = {i ∈ V s.t. φ_{N−1}(i) < 0},  M₋(i) = −sgn(φ_{N−1}(i))
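The sign split above is easy to see numerically: on a bipartite graph, the eigenvector of the largest Laplacian eigenvalue alternates sign across the bipartition. A small sketch on a path graph (which is bipartite); the graph choice is illustrative:

```python
import numpy as np

# Coset selection from the sign of the largest Laplacian eigenvector.
N = 8
A = np.zeros((N, N))
for i in range(N - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

lam, Phi = np.linalg.eigh(L)
phi_max = Phi[:, -1]             # eigenvector of the largest eigenvalue

V_plus = np.where(phi_max >= 0)[0]
V_minus = np.where(phi_max < 0)[0]
print("V+ :", V_plus, " V- :", V_minus)
```

On this path graph the split recovers exactly the even/odd vertices (up to a global sign flip of the eigenvector), matching the classical coset picture.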
Examples of Cosets
Simple line graph:
φ_k(u) = sin(πku/n + π/2n),  λ_k = 2 − 2cos(πk/n),  1 ≤ k ≤ n
Examples of Cosets
Simple ring graph:
φ¹_k(u) = sin(2πku/n),  φ²_k(u) = cos(2πku/n),  1 ≤ k ≤ n/2,  λ_k = 2 − 2cos(2πk/n)
Examples of Cosets
Lattice: quincunx
The Agonizing Limits of Intuition
Multiplicity of λ_max:
- how do we choose the control vector in that subspace?
- even a prescription can be numerically ill-defined
- graphs with a flat spectrum close to their spectral radius
Laplacian eigenvectors do not always behave like global oscillations:
- seems to be true for random perturbations of simple graphs
- true even for a class of trees [Saito 2011]
The Laplacian Pyramid
Analysis operator (figure: x is filtered by H and masked by M onto the kept coset to give y₀; the prediction Gy₀ is subtracted from x to give y₁):
y₀ = H_m x = M H x
y₁ = x − G y₀ = x − G H_m x
The Laplacian Pyramid
Stacking both channels gives the analysis operator T_a:
y = [y₀; y₁] = [H_m; I − G H_m] x = T_a x
Simple (traditional) left inverse: x̂ = (G  I) [y₀; y₁] = T_s y, with T_s T_a = I, with no conditions on H or G.
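The analysis operator and its traditional left inverse can be exercised on a toy graph. A sketch assuming spectral filters H, G and an even-vertex coset mask, all illustrative choices:

```python
import numpy as np

# Graph Laplacian pyramid: y0 = M H x, y1 = x - G y0, xhat = G y0 + y1.
N = 8
A = np.zeros((N, N))
for i in range(N - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A
lam, Phi = np.linalg.eigh(L)

def spectral_filter(kernel):
    return Phi @ np.diag(kernel(lam)) @ Phi.T

H = spectral_filter(lambda x: np.exp(-x))            # low-pass analysis filter
G = spectral_filter(lambda x: np.exp(-x))            # interpolation filter
M = np.diag((np.arange(N) % 2 == 0).astype(float))   # keep the "even" coset

Hm = M @ H
x = np.random.default_rng(2).standard_normal(N)
y0 = Hm @ x                  # coarse channel, supported on the kept coset
y1 = x - G @ y0              # prediction residual

xhat = G @ y0 + y1           # T_s = (G  I): exact left inverse
print("perfect reconstruction:", np.allclose(xhat, x))
```

The reconstruction is exact by construction, since G y₀ + (x − G y₀) = x whatever H and G are; the pseudo-inverse discussed next trades this simplicity for better noise behavior.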
The Laplacian Pyramid: Pseudo-Inverse?
T_a⁺ = (T_aᵀ T_a)⁻¹ T_aᵀ
Let's try to use only filters. Define x̂ iteratively, through descent on the least-squares problem argmin_x ‖T_a x − y‖₂²:
x̂_{k+1} = x̂_k + τ T_aᵀ (y − T_a x̂_k),  T_aᵀ = (H_mᵀ  I − H_mᵀ Gᵀ)
The Laplacian Pyramid
We can easily implement T_aᵀ T_a with filters and masks. With the real symmetric matrix Q = T_aᵀ T_a and b = T_aᵀ y, the iteration unrolls to
x_N = τ Σ_{j=0}^{N−1} (I − τQ)^j b
Use a Chebyshev approximation of L(ω) = τ Σ_{j=0}^{N−1} (1 − τω)^j.
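The descent iteration converges to the pseudo-inverse solution. A sketch that materializes T_a as a small dense matrix purely to check the result; the filters, mask, and step size are illustrative, and in practice every application of T_a and T_aᵀ would be done with filters and masks:

```python
import numpy as np

# Landweber iteration xhat_{k+1} = xhat_k + tau * Ta^T (y - Ta xhat_k)
# converging to Ta^+ y.
N = 8
A = np.zeros((N, N))
for i in range(N - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A
lam, Phi = np.linalg.eigh(L)

H = Phi @ np.diag(np.exp(-lam)) @ Phi.T
G = H.copy()
M = np.diag((np.arange(N) % 2 == 0).astype(float))
Hm = M @ H

Ta = np.vstack([Hm, np.eye(N) - G @ Hm])   # analysis operator, 2N x N

x_true = np.random.default_rng(3).standard_normal(N)
y = Ta @ x_true

Q = Ta.T @ Ta
tau = 1.0 / np.linalg.eigvalsh(Q).max()    # step size ensuring convergence
xhat = np.zeros(N)
for _ in range(5000):
    xhat = xhat + tau * (Ta.T @ (y - Ta @ xhat))

print("error vs pseudo-inverse:", np.abs(xhat - np.linalg.pinv(Ta) @ y).max())
```

Since y lies in the range of T_a here, the iteration recovers x_true itself.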
Kron Reduction
In order to iterate the construction, we need to construct a graph on the reduced vertex set α:
A_r = A[α, α] − A[α, α^c] A[α^c, α^c]⁻¹ A[α^c, α],  where A = [ A[α, α]  A[α, α^c] ; A[α^c, α]  A[α^c, α^c] ]
(figure: Kron reduction of a star with unit weights yields a triangle with weights 1/3) [Dörfler et al., 2011]
Kron Reduction
Properties:
- maps a weighted undirected Laplacian to a weighted undirected Laplacian
- spectral interlacing (the spectrum does not degenerate): λ_k(A) ≤ λ_k(A_r) ≤ λ_{k+n−|α|}(A)
- disconnected vertices are linked in the reduced graph IFF there is a path between them that runs only through eliminated nodes
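Kron reduction is the Schur complement of the Laplacian onto the kept vertices. A minimal sketch reproducing the star-graph example (three unit-weight leaves, center eliminated):

```python
import numpy as np

# Kron reduction: Schur complement of L onto the kept vertex set.
def kron_reduce(Lfull, keep):
    keep = np.asarray(keep)
    elim = np.setdiff1d(np.arange(Lfull.shape[0]), keep)
    Lkk = Lfull[np.ix_(keep, keep)]
    Lke = Lfull[np.ix_(keep, elim)]
    Lee = Lfull[np.ix_(elim, elim)]
    return Lkk - Lke @ np.linalg.inv(Lee) @ Lke.T

# Star graph: center 0 connected to leaves 1, 2, 3 with unit weights
A = np.zeros((4, 4))
A[0, 1:] = A[1:, 0] = 1.0
L = np.diag(A.sum(axis=1)) - A

Lr = kron_reduce(L, keep=[1, 2, 3])
print(Lr)
# Off-diagonals are -1/3: the leaves now form a triangle with weights 1/3,
# i.e. the eliminated center links them, as the properties above predict.
```

The result is again a valid Laplacian (rows sum to zero, off-diagonals nonpositive), illustrating the first property.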
Example
Note: for a k-regular bipartite graph (k-RBG),
L = [ k I_n  −A ; −Aᵀ  k I_n ]
Kron-reduced Laplacian: L_r = (1/k)(k² I_n − A Aᵀ)
f̂_r(i) = f̂(i) + f̂(N − i),  i = 1, ..., N/2
Filter Banks
Two critically sampled channels: f₀ → filter H → downsample on coset 1; f₀ → filter G → downsample on coset 2.
Theorem: for a k-RBG, the filter bank is perfect-reconstruction IFF
H(i)² + G(i)² = 2
H(i) G(N − i) + H(N − i) G(i) = 0
Conclusions
Structured, data-dependent dictionary of wavelets:
- sparsity and smoothness on graphs are merged in a simple and elegant fashion
- fast algorithm, clean problem formulation
- graph structure can be totally hidden in the wavelets
Filter banks based on nodal domains or coloring:
- universal algorithm based on filtering and Kron reduction
- efficient IFF the graph has some structure
- unfortunately no closed-form theory in general
Wavelet Ingredients
The wavelet transform is based on two operations: dilation (or scaling) and translation (or localization).
ψ_{s,a}(x) = (1/√s) ψ((x − a)/s)
(T^s f)(a) = (1/√s) ∫ ψ*((x − a)/s) f(x) dx = ⟨ψ_{s,a}, f⟩
Equivalently:
(T^s δ_a)(x) = (1/√s) ψ*((x − a)/s)
(T^s f)(x) = (1/2π) ∫ e^{iωx} ψ̂*(sω) f̂(ω) dω
Graph Laplacian and Spectral Theory
G = (V, E, w) weighted, undirected graph. Non-normalized Laplacian: L = D − A, real and symmetric:
(Lf)(i) = Σ_{i∼j} w_{i,j} (f(i) − f(j))
Why the Laplacian? On Z² with the usual stencil:
(Lf)_{i,j} = 4 f_{i,j} − f_{i+1,j} − f_{i−1,j} − f_{i,j+1} − f_{i,j−1}
In general, the graph Laplacian of a nicely sampled manifold converges to the Laplace-Beltrami operator.
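The identity (Lf)(i) = Σ w_{ij}(f(i) − f(j)) follows directly from L = D − A and can be checked in a few lines; the random weighted graph below is just a test fixture:

```python
import numpy as np

# Check that L = D - A acts as (Lf)(i) = sum_{j ~ i} w_ij (f(i) - f(j)).
rng = np.random.default_rng(4)
N = 5
W = np.triu(rng.uniform(0.5, 2.0, size=(N, N)), k=1)
W = W + W.T                       # symmetric weights, zero diagonal
L = np.diag(W.sum(axis=1)) - W

f = rng.standard_normal(N)
Lf_direct = np.array([sum(W[i, j] * (f[i] - f[j]) for j in range(N))
                      for i in range(N)])
print("agreement:", np.allclose(L @ f, Lf_direct))
```

Note that L annihilates constant vectors (row sums are zero), which is why λ₀ = 0 with the constant eigenvector on a connected graph.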
Graph Laplacian and Spectral Theory
On the real line, the complex exponentials are eigenfunctions of the Laplacian, −(d²/dx²) e^{iωx} = ω² e^{iωx}, and f(x) = (1/2π) ∫ f̂(ω) e^{iωx} dω.
Eigendecomposition of the graph Laplacian: L φ_ℓ = λ_ℓ φ_ℓ. For simplicity assume a connected graph, so that
0 = λ₀ < λ₁ ≤ λ₂ ≤ ... ≤ λ_{N−1}
For any function f on the vertex set (a vector) we have the Graph Fourier Transform:
f̂(ℓ) = ⟨φ_ℓ, f⟩ = Σ_{i=1}^N φ_ℓ*(i) f(i),  with inversion f(i) = Σ_{ℓ=0}^{N−1} f̂(ℓ) φ_ℓ(i)
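The transform pair above is just projection onto the orthonormal Laplacian eigenbasis. A quick round-trip check on a ring graph (an illustrative choice):

```python
import numpy as np

# Graph Fourier transform: fhat(l) = <phi_l, f>, f(i) = sum_l fhat(l) phi_l(i).
N = 10
A = np.zeros((N, N))
for i in range(N):
    A[i, (i + 1) % N] = A[(i + 1) % N, i] = 1.0   # ring graph
L = np.diag(A.sum(axis=1)) - A

lam, Phi = np.linalg.eigh(L)      # lam[0] = 0, Phi has orthonormal columns

f = np.random.default_rng(5).standard_normal(N)
fhat = Phi.T @ f                  # forward GFT
f_rec = Phi @ fhat                # inverse GFT

print("reconstruction error:", np.abs(f_rec - f).max())
```

Orthonormality of the eigenbasis gives exact inversion and Parseval's identity, ‖f̂‖₂ = ‖f‖₂.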
Spectral Graph Wavelets
Remember the good old Euclidean case: (T^s f)(x) = (1/2π) ∫ e^{iωx} ψ̂*(sω) f̂(ω) dω. We will adopt this operator view.
Operator-valued function via the continuous Borel functional calculus: for g : R₊ → R₊, T_g = g(L). The action of the operator is induced by its Fourier symbol:
(T_g f)^(ℓ) = g(λ_ℓ) f̂(ℓ),  (T_g f)(i) = Σ_{ℓ=0}^{N−1} g(λ_ℓ) f̂(ℓ) φ_ℓ(i)
Non-Local Wavelet Frame
Non-local wavelets are graph wavelets on a non-local graph.
(figure: ψ_{t,ℓ}(i) at increasing scale)
Interest: a good adaptive sparsity basis.
Distributed Computation
Scenario: a network of N nodes, where each node knows
- its local data f(n)
- its local neighbors
- the M Chebyshev coefficients of the wavelet kernel
- a global upper bound on the largest eigenvalue of the graph Laplacian
To compute:
(Φf)_{(j−1)N+n} = ( (1/2) c_{j,0} f + Σ_{k=1}^M c_{j,k} T̄_k(L) f )_n
(T̄_1(L) f)_n = ( (1/α)(L − αI) f )_n — a sensor only needs f(n) from its neighbors
T̄_k(L) f = (2/α)(L − αI) T̄_{k−1}(L) f − T̄_{k−2}(L) f — computed by exchanging the last computed values
Distributed Computation
Communication cost: 2M|E| messages of length 1.
Example: distributed denoising, or distributed regression, with the Lasso:
argmin_a (1/2) ‖y − Φᵀ a‖₂² + ‖a‖_{1,μ},  where ‖a‖_{1,μ} := Σ_{i=1}^{N(J+1)} μ_i |a_i|
The optimization can be solved for example by iterative soft thresholding. The initial estimate of the wavelet coefficients is arbitrary, and at each iteration the update is
a^(k)_i = S_{μ_i τ}( ( a^(k−1) + τ Φ (y − Φᵀ a^(k−1)) )_i ),  k = 1, 2, ...
where τ is the step size and S_{μ_i τ} is the shrinkage/thresholding operator
S_{μ_i τ}(z) := 0 if |z| ≤ μ_i τ, z − sgn(z) μ_i τ otherwise.
The iterative soft-thresholding algorithm converges to a minimizer if τ < 2/‖Φ‖², and the final estimate of the signal is given by Φᵀ a. Implemented distributedly via Chebyshev approximation, each iteration exchanges O(M|E|) messages of length 1. The alternative distributed Lasso algorithm [Mateos, Bazerque, Giannakis], a special case of the Alternating Direction Method of Multipliers, instead has every node transmit its current estimate of all the wavelet coefficients to its neighbors, which for a transform the size of the spectral graph wavelet transform requires 2|E| messages of length N(J+1) per iteration.
Wavelets on Graphs?
Existing constructions:
- wavelets on meshes (computer graphics, numerical analysis), often via lifting
- diffusion wavelets [Maggioni, Coifman & others]
- recently, several other constructions based on organizing the graph in a multiscale way [Gavish-Coifman]
Goal:
- process signals on graphs
- retain simplicity and a signal-processing flavor
- an algorithm able to handle fairly large graphs
Scaling & Localization
Effect of operator dilation?
Theorem: if d_G(i, j) > K and g has K vanishing derivatives at 0, then
|ψ_{t,j}(i)| / ‖ψ_{t,j}‖ ≤ D t
for any t smaller than a critical scale depending on d_G(i, j).
Reason? At small scales, the wavelet operator behaves like a power of the Laplacian.
More informationStatistical Machine Learning
Statistical Machine Learning Christoph Lampert Spring Semester 2015/2016 // Lecture 12 1 / 36 Unsupervised Learning Dimensionality Reduction 2 / 36 Dimensionality Reduction Given: data X = {x 1,..., x
More informationMultiscale Analysis and Diffusion Semigroups With Applications
Multiscale Analysis and Diffusion Semigroups With Applications Karamatou Yacoubou Djima Advisor: Wojciech Czaja Norbert Wiener Center Department of Mathematics University of Maryland, College Park http://www.norbertwiener.umd.edu
More informationSpectral Graph Theory and its Applications. Daniel A. Spielman Dept. of Computer Science Program in Applied Mathematics Yale Unviersity
Spectral Graph Theory and its Applications Daniel A. Spielman Dept. of Computer Science Program in Applied Mathematics Yale Unviersity Outline Adjacency matrix and Laplacian Intuition, spectral graph drawing
More informationPreprocessing & dimensionality reduction
Introduction to Data Mining Preprocessing & dimensionality reduction CPSC/AMTH 445a/545a Guy Wolf guy.wolf@yale.edu Yale University Fall 2016 CPSC 445 (Guy Wolf) Dimensionality reduction Yale - Fall 2016
More informationSemi-Supervised Learning by Multi-Manifold Separation
Semi-Supervised Learning by Multi-Manifold Separation Xiaojin (Jerry) Zhu Department of Computer Sciences University of Wisconsin Madison Joint work with Andrew Goldberg, Zhiting Xu, Aarti Singh, and Rob
More informationRKHS, Mercer s theorem, Unbounded domains, Frames and Wavelets Class 22, 2004 Tomaso Poggio and Sayan Mukherjee
RKHS, Mercer s theorem, Unbounded domains, Frames and Wavelets 9.520 Class 22, 2004 Tomaso Poggio and Sayan Mukherjee About this class Goal To introduce an alternate perspective of RKHS via integral operators
More informationSpectral Algorithms I. Slides based on Spectral Mesh Processing Siggraph 2010 course
Spectral Algorithms I Slides based on Spectral Mesh Processing Siggraph 2010 course Why Spectral? A different way to look at functions on a domain Why Spectral? Better representations lead to simpler solutions
More informationGraph Partitioning Using Random Walks
Graph Partitioning Using Random Walks A Convex Optimization Perspective Lorenzo Orecchia Computer Science Why Spectral Algorithms for Graph Problems in practice? Simple to implement Can exploit very efficient
More informationAlgorithms for Scientific Computing
Algorithms for Scientific Computing Hierarchical Methods and Sparse Grids d-dimensional Hierarchical Basis Michael Bader Technical University of Munich Summer 208 Intermezzo/ Big Picture : Archimedes Quadrature
More informationIntroduction to Compressed Sensing
Introduction to Compressed Sensing Alejandro Parada, Gonzalo Arce University of Delaware August 25, 2016 Motivation: Classical Sampling 1 Motivation: Classical Sampling Issues Some applications Radar Spectral
More informationEE 367 / CS 448I Computational Imaging and Display Notes: Image Deconvolution (lecture 6)
EE 367 / CS 448I Computational Imaging and Display Notes: Image Deconvolution (lecture 6) Gordon Wetzstein gordon.wetzstein@stanford.edu This document serves as a supplement to the material discussed in
More informationMultiscale Frame-based Kernels for Image Registration
Multiscale Frame-based Kernels for Image Registration Ming Zhen, Tan National University of Singapore 22 July, 16 Ming Zhen, Tan (National University of Singapore) Multiscale Frame-based Kernels for Image
More informationLECTURE NOTE #11 PROF. ALAN YUILLE
LECTURE NOTE #11 PROF. ALAN YUILLE 1. NonLinear Dimension Reduction Spectral Methods. The basic idea is to assume that the data lies on a manifold/surface in D-dimensional space, see figure (1) Perform
More informationStochastic Proximal Gradient Algorithm
Stochastic Institut Mines-Télécom / Telecom ParisTech / Laboratoire Traitement et Communication de l Information Joint work with: Y. Atchade, Ann Arbor, USA, G. Fort LTCI/Télécom Paristech and the kind
More informationA Tutorial on Wavelets and their Applications. Martin J. Mohlenkamp
A Tutorial on Wavelets and their Applications Martin J. Mohlenkamp University of Colorado at Boulder Department of Applied Mathematics mjm@colorado.edu This tutorial is designed for people with little
More informationDATA defined on network-like structures are encountered
Graph Fourier Transform based on Directed Laplacian Rahul Singh, Abhishek Chakraborty, Graduate Student Member, IEEE, and B. S. Manoj, Senior Member, IEEE arxiv:6.v [cs.it] Jan 6 Abstract In this paper,
More informationReproducing formulas associated with symbols
Reproducing formulas associated with symbols Filippo De Mari Ernesto De Vito Università di Genova, Italy Modern Methods of Time-Frequency Analysis II Workshop on Applied Coorbit space theory September
More informationSum-of-Squares Method, Tensor Decomposition, Dictionary Learning
Sum-of-Squares Method, Tensor Decomposition, Dictionary Learning David Steurer Cornell Approximation Algorithms and Hardness, Banff, August 2014 for many problems (e.g., all UG-hard ones): better guarantees
More informationPrivacy and Fault-Tolerance in Distributed Optimization. Nitin Vaidya University of Illinois at Urbana-Champaign
Privacy and Fault-Tolerance in Distributed Optimization Nitin Vaidya University of Illinois at Urbana-Champaign Acknowledgements Shripad Gade Lili Su argmin x2x SX i=1 i f i (x) Applications g f i (x)
More informationDIMENSION REDUCTION. min. j=1
DIMENSION REDUCTION 1 Principal Component Analysis (PCA) Principal components analysis (PCA) finds low dimensional approximations to the data by projecting the data onto linear subspaces. Let X R d and
More informationDiffusion Geometries, Global and Multiscale
Diffusion Geometries, Global and Multiscale R.R. Coifman, S. Lafon, MM, J.C. Bremer Jr., A.D. Szlam, P.W. Jones, R.Schul Papers, talks, other materials available at: www.math.yale.edu/~mmm82 Data and functions
More informationValue Function Approximation with Diffusion Wavelets and Laplacian Eigenfunctions
Value Function Approximation with Diffusion Wavelets and Laplacian Eigenfunctions Sridhar Mahadevan Department of Computer Science University of Massachusetts Amherst, MA 13 mahadeva@cs.umass.edu Mauro
More informationRandom walks and anisotropic interpolation on graphs. Filip Malmberg
Random walks and anisotropic interpolation on graphs. Filip Malmberg Interpolation of missing data Assume that we have a graph where we have defined some (real) values for a subset of the nodes, and that
More informationLearning Parametric Dictionaries for. Signals on Graphs
Learning Parametric Dictionaries for 1 Signals on Graphs Dorina Thanou, David I Shuman, and Pascal Frossard arxiv:141.887v1 [cs.lg] 5 Jan 214 Abstract In sparse signal representation, the choice of a dictionary
More informationLaplacian Eigenmaps for Dimensionality Reduction and Data Representation
Laplacian Eigenmaps for Dimensionality Reduction and Data Representation Neural Computation, June 2003; 15 (6):1373-1396 Presentation for CSE291 sp07 M. Belkin 1 P. Niyogi 2 1 University of Chicago, Department
More informationMultiresolution analysis
Multiresolution analysis Analisi multirisoluzione G. Menegaz gloria.menegaz@univr.it The Fourier kingdom CTFT Continuous time signals + jωt F( ω) = f( t) e dt + f() t = F( ω) e jωt dt The amplitude F(ω),
More informationAn Introduction to Filterbank Frames
An Introduction to Filterbank Frames Brody Dylan Johnson St. Louis University October 19, 2010 Brody Dylan Johnson (St. Louis University) An Introduction to Filterbank Frames October 19, 2010 1 / 34 Overview
More informationClustering using Mixture Models
Clustering using Mixture Models The full posterior of the Gaussian Mixture Model is p(x, Z, µ,, ) =p(x Z, µ, )p(z )p( )p(µ, ) data likelihood (Gaussian) correspondence prob. (Multinomial) mixture prior
More informationSpace-Variant Computer Vision: A Graph Theoretic Approach
p.1/65 Space-Variant Computer Vision: A Graph Theoretic Approach Leo Grady Cognitive and Neural Systems Boston University p.2/65 Outline of talk Space-variant vision - Why and how of graph theory Anisotropic
More informationLecture: Some Practical Considerations (3 of 4)
Stat260/CS294: Spectral Graph Methods Lecture 14-03/10/2015 Lecture: Some Practical Considerations (3 of 4) Lecturer: Michael Mahoney Scribe: Michael Mahoney Warning: these notes are still very rough.
More informationConstruction of Orthonormal Quasi-Shearlets based on quincunx dilation subsampling
Construction of Orthonormal Quasi-Shearlets based on quincunx dilation subsampling Rujie Yin Department of Mathematics Duke University USA Email: rujie.yin@duke.edu arxiv:1602.04882v1 [math.fa] 16 Feb
More informationNeural Network Training
Neural Network Training Sargur Srihari Topics in Network Training 0. Neural network parameters Probabilistic problem formulation Specifying the activation and error functions for Regression Binary classification
More informationDiffusion Geometries, Diffusion Wavelets and Harmonic Analysis of large data sets.
Diffusion Geometries, Diffusion Wavelets and Harmonic Analysis of large data sets. R.R. Coifman, S. Lafon, MM Mathematics Department Program of Applied Mathematics. Yale University Motivations The main
More informationNon-linear Dimensionality Reduction
Non-linear Dimensionality Reduction CE-725: Statistical Pattern Recognition Sharif University of Technology Spring 2013 Soleymani Outline Introduction Laplacian Eigenmaps Locally Linear Embedding (LLE)
More information