Reduced Complexity Models in the Identification of Dynamical Networks: links with sparsification problems
1 Reduced Complexity Models in the Identification of Dynamical Networks: links with sparsification problems. Donatello Materassi, Giacomo Innocenti, Laura Giarré. December 15, 2009.
2 Summary. 1. Reconstruction of a topology: motivation; problem formulation. 2. Links with sparsification problems: sparse representation and compressive sensing; greedy algorithms; Cycling Orthogonal Least Squares. 3. Examples. 4. Conclusions.
3 Networks and their applications. Networks are everywhere: communication systems (Internet, wireless communications), transportation (traffic control), automatic learning (neural networks), parallel computing (cellular automata), formation control (unmanned vehicles), process scheduling (Petri nets).
4 Networks and their applications. Design advantages: redundancy, modularity, distributed algorithms. In technological applications the network topology is either directly observable or it is itself the objective of the design.
5 Topology Identification: Motivations. Reconstruction: genetics (phylogenetic trees), cognitive science (computational theory of mind), biochemistry (gene regulatory networks). Complexity reduction: cognitive science (CTM and brain modularity), biochemistry (gene regulatory networks), economics (stock market, currency exchange), social systems. Few techniques are available to identify a topology.
6 Problem formulation. $N$ scalar time series $\{X_i\}_{i=1,\dots,N}$ are observed, assumed zero-mean and wide-sense stationary. Linear smoothing via Wiener filtering along all the $X_j$'s:
$$\min_{\{W^{(j)}_i\}} \; \sum_{j=1}^{N} E\!\left[\left(Q_j(z)\left(X_j - \sum_{i=1,\, i\neq j}^{N} W^{(j)}_i(z)\, X_i\right)\right)^{\!2}\right] \qquad (1)$$
Draw an arc $(i,j)$ if $W^{(j)}_i \neq 0$. This leads to a complete graph.
7 Problem formulation. Consider choosing only $m_j > 0$ arcs for $X_j$:
$$\min_{\|W^{(j)}(z)\|_0 \,\le\, m_j} \; E\!\left[\left(Q_j(z)\left(X_j - \sum_{k=1}^{m_j} W^{(j)}_{i_k}(z)\, X_{i_k}\right)\right)^{\!2}\right] \qquad (2)$$
where $\|W^{(j)}(z)\|_0$ is the number of non-zero entries of $W^{(j)}(z)$ and the $Q_j$ are frequency weights. Draw an arc $(i,j)$ if $W_{ji} \neq 0$. This leads to reduced complexity.
8 Sparse representation. Consider a full-rank matrix $\Psi \in \mathbb{R}^{p\times q}$ with $q \gg p$ ($\Psi$ is called a dictionary). Define the problem
$$\min_{\|w\|_0 \,\le\, m} \; \|x_0 - \Psi w\| \qquad (3)$$
We are looking for the best representation of $x_0$ with at most $m$ non-zero entries. This is a formulation of the Compressive Sensing problem.
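To make problem (3) concrete, here is a minimal brute-force sketch (illustrative only, not from the slides; all names are invented for the demo): for a tiny dictionary it enumerates every support of size $m$ and keeps the best least-squares fit, which shows why the exact $\ell_0$ problem is combinatorial and why greedy methods are used instead.

```python
# Brute-force illustration of problem (3): best m-term approximation of x0 over a
# redundant dictionary Psi, by exhaustive search over supports (only viable for tiny q, m).
import itertools
import numpy as np

def best_m_term(x0, Psi, m):
    """Return (w, error) with ||w||_0 <= m minimizing ||x0 - Psi w||."""
    p, q = Psi.shape
    best_w, best_err = np.zeros(q), np.linalg.norm(x0)   # w = 0 is a feasible baseline
    for support in itertools.combinations(range(q), m):
        atoms = Psi[:, list(support)]
        coeffs, *_ = np.linalg.lstsq(atoms, x0, rcond=None)
        err = np.linalg.norm(x0 - atoms @ coeffs)
        if err < best_err:
            best_err, best_w = err, np.zeros(q)
            best_w[list(support)] = coeffs
    return best_w, best_err

rng = np.random.default_rng(0)
Psi = rng.standard_normal((8, 20))                     # p = 8, q = 20, q >> p
x0 = Psi[:, [3, 11]] @ np.array([1.5, -2.0])           # a 2-sparse target
w, err = best_m_term(x0, Psi, m=2)
print(np.nonzero(w)[0], round(err, 8))                 # expect support [3, 11], error ~ 0
```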
9 Links with Sparsification problems. The two problems look very similar:
$$\min_{\|W^{(j)}(z)\|_0 \,\le\, m_j} \; E\!\left[\left(Q_j(z)\left(X_j - \sum_{k=1}^{m_j} W^{(j)}_{i_k}(z)\, X_{i_k}\right)\right)^{\!2}\right] \qquad (4)$$
$$\min_{\|w\|_0 \,\le\, m} \; \|x_0 - \Psi w\| \qquad (5)$$
Common features: minimization of a norm induced by a scalar product; a sparsity constraint on the solution; non-convexity; common greedy algorithms.
10 Greedy algorithms. Selection: find the atom with the highest scalar product with the residual, $\psi_i = \arg\max_{\psi\in\Psi} \langle r_{i-1}, \psi\rangle$, and add it to the selected atoms, $\Psi_i = \Psi_{i-1}\cup\{\psi_i\}$. Update: update the coefficients $w_i$ and the residual $r_i = x_0 - \Psi_i w_i$ by minimizing the new error. MP: optimize the coefficient of the last selected atom only; OLS: optimize the error over the whole set of selected atoms.
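As a concrete sketch of the selection/update scheme just described (an illustrative Python example, not the authors' implementation; the dictionary, target, and function names are invented for the demo), the snippet below runs the same greedy loop on a static dictionary and switches between the MP update (re-fit only the coefficient of the last selected atom) and the full re-fit update described for OLS (least squares over all selected atoms).

```python
# Greedy atom selection on a static dictionary with unit-norm columns.
# rule='MP' updates only the coefficient of the newly selected atom;
# rule='OLS' re-fits all selected coefficients by least squares.
import numpy as np

def greedy(x0, Psi, m, rule="OLS"):
    q = Psi.shape[1]
    residual, selected, w = x0.copy(), [], np.zeros(q)
    for _ in range(m):
        scores = np.abs(Psi.T @ residual)          # scalar products with the residual
        scores[selected] = -np.inf                 # never pick the same atom twice
        i = int(np.argmax(scores))
        selected.append(i)
        if rule == "MP":
            w[i] += Psi[:, i] @ residual           # only the last atom's coefficient
        else:
            coeffs, *_ = np.linalg.lstsq(Psi[:, selected], x0, rcond=None)
            w[:] = 0.0
            w[selected] = coeffs                   # all selected coefficients re-fit
        residual = x0 - Psi @ w
    return w, selected

rng = np.random.default_rng(1)
Psi = rng.standard_normal((16, 40))
Psi /= np.linalg.norm(Psi, axis=0)                 # unit-norm atoms
x0 = Psi[:, [5, 20, 33]] @ np.array([2.0, -1.0, 0.5])
for rule in ("MP", "OLS"):
    w, sel = greedy(x0, Psi, m=3, rule=rule)
    print(rule, sorted(sel), round(float(np.linalg.norm(x0 - Psi @ w)), 4))
```

With this particular selection rule (largest scalar product with the residual), the full re-fit branch coincides with what much of the sparse-approximation literature calls Orthogonal Matching Pursuit; OLS in the strict sense selects, at each step, the atom yielding the largest cost reduction after projection.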
11-15 Orthogonal Least Squares. OLS projects a vector onto the elements (atoms) of a redundant basis (dictionary) and chooses the element that provides the largest cost reduction. [Figure: the projection step is illustrated graphically on three atoms $x_1$, $x_2$, $x_3$.]
16 Orthogonal Projection. What do we mean by orthogonal projection in our space of stochastic processes? By the orthogonal projection of a stochastic process $X_j$ onto a set of stochastic processes $\{X_{\alpha_i}\}_{i=1,\dots,m_j}$ we mean the best estimate of $X_j$ that can be given, in the least-squares sense, as a linear dynamic combination of $\{X_{\alpha_i}\}_{i=1,\dots,m_j}$. Such an estimate is given by the well-known Wiener filter.
17 Cycling Orthogonal Least Squares (COLS). Orthogonal Least Squares: a greedy algorithm similar to Matching Pursuit (but slower); terminates in $m$ steps; may not stop in a local minimum. Cycling Orthogonal Least Squares (COLS): first $m$ steps as in OLS; afterwards any chosen atom can be replaced by another one if this yields a cost reduction; slower than OLS (makes more steps); terminates in a local minimum.
18 Cycling Orthogonal Least Squares (COLS). Cycling Orthogonal Least Squares:
0. define $X_0 := 0$ (the null time series), set $c = 1$ and $k = 1$
1. initialize the $m_j$-tuple $S = (X_0, X_0, \dots, X_0)$
2. while $c \le m_j$:
 2a. for every candidate process $X_i$, $i \neq j$, define $S_i$ as the $m_j$-tuple in which $X_i$ replaces the $k$-th element of $S$, and define $r^{(i)}$ as the projection of $X_j$ onto $S_i$
 2b. $\alpha = \arg\max_i \|r^{(i)}\|$
 2c. if $X_\alpha = S[k]$ then $c = c + 1$, else $S[k] = X_\alpha$ and $c = 1$
 2d. $k = (k + 1) \bmod m_j$
3. return $S$
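A simplified code sketch of this cycling replacement follows. It is my reading of the slide applied to static vectors with ordinary least-squares projections instead of stochastic processes with Wiener projections, and all names are illustrative.

```python
# Cycling replacement on a static dictionary: sweep over the m slots, try every
# candidate atom in the current slot, keep the best one, and stop once a full
# cycle produces no change (a local minimum of the residual norm).
import numpy as np

def cols(x0, Psi, m, max_iters=1000):
    q = Psi.shape[1]
    slots = [None] * m                 # selected atom index per slot (None = the null atom)
    c, k, it = 0, 0, 0                 # unchanged-slot counter, cycling position, guard
    while c < m and it < max_iters:
        taken = {idx for s, idx in enumerate(slots) if s != k and idx is not None}
        best_i, best_err = slots[k], np.inf
        for i in range(q):
            if i in taken:
                continue               # do not duplicate an atom already in another slot
            trial = [i if s == k else idx for s, idx in enumerate(slots)]
            cols_idx = [t for t in trial if t is not None]
            atoms = Psi[:, cols_idx]
            coeffs, *_ = np.linalg.lstsq(atoms, x0, rcond=None)
            err = np.linalg.norm(x0 - atoms @ coeffs)
            if err < best_err:
                best_i, best_err = i, err
        if best_i == slots[k]:
            c += 1                     # this slot is already locally optimal
        else:
            slots[k], c = best_i, 1    # replace the atom and restart the count
        k, it = (k + 1) % m, it + 1
    return slots

rng = np.random.default_rng(2)
Psi = rng.standard_normal((16, 40))
x0 = Psi[:, [4, 9, 17]] @ np.array([1.0, -2.0, 0.7])
print(cols(x0, Psi, m=3))   # typically recovers {4, 9, 17}; a local minimum is not guaranteed global
```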
19 Network reconstruction. The transfer functions were randomly generated third-order FIR models, simulated over a number of steps, with white, additive, and mutually uncorrelated noises acting on every single node. $Q_j = 1$ and non-causal Wiener filters are used; we are therefore considering a smoothing scenario for our modeling. The Wiener filters have been computed by estimating the cross-spectral densities of the signals $X_i$, under the assumption of ergodicity.
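As an illustration of the last point (an assumed implementation using SciPy's Welch-type estimators, not the authors' code), the non-causal Wiener filter between two observed series can be estimated by replacing the spectral densities in $\hat W_{ji} = \Phi_{X_i X_j}/\Phi_{X_i}$ (slide 27) with their sample estimates; conjugation conventions for the cross-spectrum may need to be matched to the paper's.

```python
# Estimating the non-causal Wiener filter from data: cross- and auto-spectral densities
# are estimated (Welch method, assuming ergodicity) and the filter frequency response
# is their ratio, evaluated on the estimated frequency grid.
import numpy as np
from scipy.signal import csd, welch

def wiener_filter_response(xj, xi, nperseg=256):
    """Frequency grid and response of the non-causal Wiener filter predicting xj from xi."""
    f, Pxi = welch(xi, nperseg=nperseg)          # Phi_{Xi}(w)
    _, Pxixj = csd(xi, xj, nperseg=nperseg)      # Phi_{Xi Xj}(w) (complex-valued)
    return f, Pxixj / Pxi

# Toy usage: xj is a filtered copy of xi plus independent noise.
rng = np.random.default_rng(3)
xi = rng.standard_normal(8192)
xj = np.convolve(xi, [0.9, 0.4, -0.2], mode="same") + 0.1 * rng.standard_normal(8192)
f, W = wiener_filter_response(xj, xi)
print(np.abs(W[:4]))                             # |W| at low frequencies, near the true filter gain
```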
20 Reconstruction via COLS. Figure: the actual structure (a), and the reconstruction obtained using the Cycling OLS algorithm with $m_j = 1$ (b), $m_j = 2$ (c), and $m_j = 3$ (d).
21 Application to Currency Exchange Rates. [Figure: the 22 selected currencies: SEK, NOK, CAD, GBP, DKK, USD, EU, CNY, KRW, JPY, MXN, INR, THB, HKD, TWD, LKR, MYR, BRL, SGD, AUD, ZAR, NZD.]
22 Application to Currency Exchange Rates. Figure: the reconstructed topologies (panels (a), (b), (c)) obtained by applying the Cycling OLS to the exchange-rate time series of the 22 selected currencies.
23 Conclusions. We derive a topological structure from a set of time series; every time series is represented as a node in a graph; no a priori information on the network is assumed. In order to modulate the complexity of the final graph, a maximum number $m_j$ of arcs pointing at $X_j$ is imposed. The complexity reduction of the network is strictly related to the sparsification problem. Suboptimal greedy algorithms are employed, with a modification of Orthogonal Least Squares. Applications to real data: currencies (daily data from the past 10 years, normalized to the Swiss franc).
24 Bibliography.
E. Candès, M. Wakin, and S. Boyd, "Enhancing sparsity by reweighted ℓ1 minimization," Journal of Fourier Analysis and Applications, 2008.
M. Timme, "Revealing network connectivity from response dynamics," Physical Review Letters, 2007.
J. A. Tropp, "Greed is good: Algorithmic results for sparse approximation," IEEE Transactions on Information Theory, 2004.
E. J. Candès and T. Tao, "Decoding by linear programming," IEEE Transactions on Information Theory, 2005.
S. Mallat and Z. Zhang, "Matching pursuits with time-frequency dictionaries," IEEE Transactions on Signal Processing, 1993.
25 Suboptimal Modeling Approach. The transfer functions $\{W_{ji}(z)\}$ describe the linear dynamical component of the dependencies among the processes, while the corresponding errors $\{e_j\}$ account for the remaining unmodeled components. The two sets provide, respectively, a qualitative and a quantitative description of the dynamics among the time series. Suboptimal strategy: for any given $X_j$, find the process $X_k$ and the corresponding transfer function $W_{jk}$ which provide the best contribution to $X_j$ in terms of the related modeling error $e_j$, i.e. such that
$$E\!\left[\left(Q(z)\left(X_j - W_{jk}(z) X_k\right)\right)^{2}\right] = \min_{i \neq j}\, E\!\left[\left(Q(z)\left(X_j - W_{ji}(z) X_i\right)\right)^{2}\right] \qquad (6)$$
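A minimal sketch of this single-best-parent search (illustrative, assuming $Q(z) = 1$ and SciPy's spectral estimators; not the authors' code): using the Wiener-filter expressions derived on the following slides, the minimum of (6) for $Q = 1$ equals $\frac{1}{2\pi}\int \Phi_{X_j}(\omega)\,(1 - C_{X_j X_i}(\omega))\, d\omega$, so candidate parents can be compared through frequency-averaged residual spectra.

```python
# Selecting, for a target series xj, the single parent that minimizes the modeling
# error (6) with Q(z) = 1: the residual power of the Wiener fit is estimated as the
# frequency average of Phi_Xj(w) * (1 - C_XjXi(w)).
import numpy as np
from scipy.signal import welch, coherence

def residual_power(xj, xi, nperseg=256):
    _, Pxj = welch(xj, nperseg=nperseg)            # Phi_{Xj}(w)
    _, Cji = coherence(xj, xi, nperseg=nperseg)    # coherence between Xj and Xi
    return float(np.mean(Pxj * (1.0 - Cji)))       # averaged residual spectrum

def best_parent(series, j, nperseg=256):
    """Index k != j of the series giving the best single-link model of series[j]."""
    costs = {i: residual_power(series[j], x, nperseg)
             for i, x in enumerate(series) if i != j}
    return min(costs, key=costs.get)
```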
26 Frequency Domain Approach. Given two stochastic processes $X_i$, $X_j$ and a transfer function $W(z)$, consider the quadratic cost
$$E\!\left[\varepsilon_Q^{\,2}\right] \qquad (7)$$
where
$$\varepsilon_Q := Q(z)\left(X_j - W(z)X_i\right), \qquad (8)$$
$Q(z)$ being an arbitrary stable and causally invertible function weighting the modeling error $X_j - W(z)X_i$. The frequency-domain solution: the problem of finding the transfer function $\hat W(z)$ such that the quadratic cost (7) is minimized is solved by the Wiener filter.
27 Wiener Filter. Proposition (Wiener filter): the Wiener filter modeling $X_j$ by $X_i$ is the stable linear filter $\hat W_{ji}$ minimizing the filtered quantity (7). Its expression is given by
$$\hat W_{ji}(z) = \frac{\Phi_{X_i X_j}(z)}{\Phi_{X_i}(z)}, \qquad (9)$$
$\Phi_{X_i}(z)$ and $\Phi_{X_i X_j}(z)$ being the power spectral density and the cross-power spectral density, respectively. The Wiener filter does not depend upon $Q(z)$.
28 Coherence Function. Since the weighting function $Q(z)$ does not affect the Wiener filter, we choose $Q(z)$ equal to $F_j(z)$, the inverse of the spectral factor of $\Phi_{X_j}(z)$, that is,
$$\Phi_{X_j}(z) = F_j^{-1}(z)\left(F_j^{-1}(z)\right)^{*} \qquad (10)$$
with $F_j(z)$ stable and causally invertible. Then the minimum cost assumes the form
$$\min E\!\left[\varepsilon_{F_j}^{2}\right] = \int_{-\pi}^{\pi} \left(1 - \frac{\left|\Phi_{X_j X_i}(\omega)\right|^{2}}{\Phi_{X_i}(\omega)\,\Phi_{X_j}(\omega)}\right) d\omega \qquad (11)$$
and it turns out to depend explicitly on the coherence function
$$C_{X_i X_j}(\omega) := \frac{\left|\Phi_{X_j X_i}(\omega)\right|^{2}}{\Phi_{X_i}(\omega)\,\Phi_{X_j}(\omega)}. \qquad (12)$$
29 Coherence-based Distance. Define, on the set $\Theta$ of discrete zero-mean wide-sense stationary stochastic processes, the binary function
$$d(X_i, X_j) := \left[\frac{1}{2\pi}\int_{-\pi}^{\pi}\left(1 - C_{X_i X_j}(\omega)\right) d\omega\right]^{1/2}, \qquad X_i, X_j \in \Theta. \qquad (13)$$
Proposition: the function $d(\cdot,\cdot)$ defined in (13) is a metric on $\Theta$, that is, for all $X_1, X_2, X_3 \in \Theta$:
$d(X_1, X_2) \ge 0$; $\ d(X_1, X_2) = 0 \iff X_1 = X_2$; $\ d(X_1, X_2) = d(X_2, X_1)$; $\ d(X_1, X_3) \le d(X_1, X_2) + d(X_2, X_3)$.
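A minimal sketch of the distance (13) (an assumed implementation based on SciPy's coherence estimator; the integral over $[-\pi,\pi]$ is approximated by averaging $1 - C$ over the estimated one-sided frequency grid):

```python
# Coherence-based distance (13): d(Xi, Xj) = sqrt( (1/2pi) * int (1 - C_XiXj(w)) dw ),
# approximated here by the mean of (1 - coherence) over the estimated frequencies.
import numpy as np
from scipy.signal import coherence

def coherence_distance(xi, xj, nperseg=256):
    _, Cxy = coherence(xi, xj, nperseg=nperseg)    # magnitude-squared coherence in [0, 1]
    return float(np.sqrt(np.mean(1.0 - Cxy)))

rng = np.random.default_rng(4)
x1 = rng.standard_normal(8192)
x2 = np.convolve(x1, [0.8, -0.3], mode="same") + 0.2 * rng.standard_normal(8192)
x3 = rng.standard_normal(8192)                     # independent of x1
print(coherence_distance(x1, x2), coherence_distance(x1, x3))   # first value should be smaller
```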