Degrees of Freedom: Uncertainty Principle and Sampling


1 Degrees of Freedom: Uncertainty Principle and Sampling. Sergio Barbarossa. Acknowledgments: M. Tsitsvero, P. Di Lorenzo, S. Sardellitti. Sapienza Università di Roma, July 2016

2 Overall summary: 1. The Whittaker-Nyquist-Kotelnikov-Shannon sampling theorem; 2. Degrees of freedom; 3. Sparse representations; 4. Uncertainty principles; 5. Signals on graphs; 6. Conclusion

3 Sampling theorem. The sampling theorem (as proved by Shannon): If a function s(t) contains no frequencies higher than B cps, it is completely determined by giving its ordinates at a series of points spaced 1/(2B) seconds apart. Proof: By definition, $s(t) = \int_{-B}^{B} S(f)\, e^{j2\pi f t}\, df$, where $S(f)$ admits the Fourier series expansion $S(f) = \sum_{n} S_n\, e^{-j2\pi n f/(2B)}$ with coefficients $S_n = \frac{1}{2B}\int_{-B}^{B} S(f)\, e^{j2\pi n f/(2B)}\, df = \frac{1}{2B}\, s\!\left(\frac{n}{2B}\right)$.

4 Sampling theorem. Substituting back, we get $s(t) = \sum_{n=-\infty}^{\infty} s\!\left(\frac{n}{2B}\right) \mathrm{sinc}\!\left(2B\left(t - \frac{n}{2B}\right)\right)$. As Shannon says: "This is a fact which is common knowledge in the communication art. [The] theorem has been given previously in other forms by mathematicians but in spite of its evident importance seems not to have appeared explicitly in the literature of communication theory. Nyquist, however, and more recently Gabor, have pointed out that approximately 2BT numbers are sufficient, basing their arguments on a Fourier series expansion of the function over the time interval T."
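The interpolation formula above can be checked numerically. A minimal sketch, assuming a hypothetical two-tone test signal with band edge B = 4 Hz; truncating the infinite sum is the only error source:

```python
import numpy as np

# Bandlimited test signal (all frequencies strictly below B = 4 Hz).
B = 4.0
s = lambda t: np.sin(2 * np.pi * 1.5 * t) + 0.5 * np.cos(2 * np.pi * 3.0 * t)

# Samples taken every 1/(2B) seconds (the Nyquist rate).
n = np.arange(-400, 401)
samples = s(n / (2 * B))

# Whittaker-Shannon interpolation: s(t) = sum_n s(n/2B) sinc(2B t - n).
t = np.linspace(-1.0, 1.0, 101)
s_hat = samples @ np.sinc(2 * B * t[None, :] - n[:, None])

err = np.max(np.abs(s_hat - s(t)))
print(err)  # small; only the truncation of the sum contributes
```

The slowly decaying sinc tails are why a long sample train (801 terms here) is needed for a small truncation error.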

5 Sampling theorem. A theorem with many inventors: Whittaker (1915), Nyquist (1928), Kotelnikov (1933), Whittaker (1935), Raabe (1938), Gabor (1946), Shannon (1948), Isao Someya (1949). E. T. Whittaker, "On the functions which are represented by the expansions of the interpolation theory", 1915: $C(x) = \sum_{r=-\infty}^{\infty} f(a + rw)\, \mathrm{sinc}\!\left(\frac{x - a - rw}{w}\right)$; "it is possible to interpolate given sampling values at intervals of w such that the Fourier transform of this interpolation function does not contain any terms with periods less than 2w". V. A. Kotelnikov, "On the transmission capacity of ether and wire in electrical communications", 1933: "Any function F(t) which consists of frequencies from 0 to $f_1$ periods may be represented by the following series: $F(t) = \sum_{k=-\infty}^{\infty} F\!\left(\frac{k}{2f_1}\right) \mathrm{sinc}\!\left(2f_1\left(t - \frac{k}{2f_1}\right)\right)$."

6 Sampling theorem. Alternative perspective: a signal s(t) is perfectly band-limited between $-B$ and $B$ if it remains unaltered when passing through an ideal lowpass filter H(f) with cut-off frequency B, i.e. $s(t) = 2B \int_{-\infty}^{\infty} s(\tau)\, \mathrm{sinc}(2B(t - \tau))\, d\tau$. But, since $\mathrm{sinc}(2B(t - \tau)) = \sum_{n=-\infty}^{\infty} \mathrm{sinc}\!\left(2B\left(t - \frac{n}{2B}\right)\right) \mathrm{sinc}\!\left(2B\left(\tau - \frac{n}{2B}\right)\right)$, substituting back we get $s(t) = \sum_{n=-\infty}^{\infty} s\!\left(\frac{n}{2B}\right) \mathrm{sinc}\!\left(2B\left(t - \frac{n}{2B}\right)\right)$. Note: this formulation does not require the existence of the Fourier transform of s(t)!

7 Degrees of freedom. Extensions. Again, from Shannon: "If the function is limited to the time interval T and the samples are spaced 1/(2B) seconds apart, there will be a total of 2BT samples in the interval. All samples outside will be substantially zero. The numbers used to specify the function need not be the equally spaced samples used above. One can further show that the value of the function and its derivative at every other sample point are sufficient. The value and first and second derivatives at every third sample point give a still different set of parameters which uniquely determine the function. Generally speaking, any set of 2BT independent numbers associated with the function can be used to describe it. Essentially, we have replaced a complex entity (say, a television signal) in a simple environment by a simple entity (a point) in a complex environment (a 2BT-dimensional space)."

8 Degrees of freedom. Formal approach (Slepian, Landau-Pollak). A band-limited signal s(t) is approximately time-limited to $(-T/2, T/2)$, at level $\varepsilon_T$, if $\frac{\int_{|t| > T/2} |s(t)|^2\, dt}{\int_{-\infty}^{\infty} |s(t)|^2\, dt} = \varepsilon_T^2$. Let us denote by $E(\varepsilon_T)$ the set of band-limited functions approximately time-limited at level $\varepsilon_T$. The set $E(\varepsilon_T)$ is approximately N-dimensional if there exist N linearly independent functions $\psi_i(t)$, $i = 0, \ldots, N-1$, such that $\min_{\{a_k\}} \int \Big| s(t) - \sum_{k=0}^{N-1} a_k \psi_k(t) \Big|^2 dt \le \varepsilon_N^2$ for every $s \in E(\varepsilon_T)$. A proper estimation of the dimension N requires the set $\{\psi_i(t)\}$ to be chosen so that it achieves $\min_{\{\psi_i\}} \max_{s \in E(\varepsilon_T)} \min_{\{a_k\}} \int \Big| s(t) - \sum_{k=0}^{N-1} a_k \psi_k(t) \Big|^2 dt$.

9 Degrees of freedom. Def.: time-limiting operator: $Ds = s(t)$ if $|t| \le T/2$, $0$ if $|t| > T/2$. Def.: band-limiting operator: $Bs = \int_{-B}^{B} S(f)\, e^{j2\pi f t}\, df$. The optimal set $\{\psi_i(t)\}$ is given by the eigenfunctions of the operator $BDB$, i.e. $BDB\, \psi_i = \lambda_i \psi_i$, and it provides the bound $\int \Big| s(t) - \sum_{k=0}^{[2BT]} a_k \psi_k(t) \Big|^2 dt \le C \varepsilon_T^2$, where C is independent of s(t), $\varepsilon_T$, and 2BT.
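The eigenvalue behavior of the time- and band-limiting operator can be seen in a discrete sketch: the eigenvectors of the sinc-kernel matrix below are the discrete prolate spheroidal (Slepian) sequences, and its eigenvalues stay near 1 for roughly the first 2NW indices and then plunge toward 0. The sizes N and W are illustrative choices:

```python
import numpy as np

# Discrete analogue of B D B: K[m, n] = sin(2*pi*W*(m - n)) / (pi*(m - n)),
# with K[n, n] = 2*W on the diagonal.
N, W = 128, 0.125                       # time length N, half-bandwidth W: 2NW = 32
d = np.subtract.outer(np.arange(N), np.arange(N))
K = np.where(d == 0, 2 * W,
             np.sin(2 * np.pi * W * d) / (np.pi * np.where(d == 0, 1, d)))

lam = np.sort(np.linalg.eigvalsh(K))[::-1]   # eigenvalues in decreasing order

# Near 1 up to index ~ 2NW, then a sharp plunge: the jointly time- and
# band-limited signal class carries about 2NW degrees of freedom.
print(lam[0], lam[24], lam[40])
```

The trace of K equals $2NW$, so the eigenvalue mass itself adds up to the time-bandwidth product, mirroring the $2BT$ rule of the slides.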

10 Degrees of freedom. Landau attributes the following theorem to Shannon (unfortunately with no reference): given any $\delta > 0$, there exist constants $C_3 = C_3(\delta)$ and $C_4 = C_4(\delta)$ so that, for every $s \in E(\varepsilon_T)$, $\inf_{\{a_k\}} \Big\| s(t) - \sum_{k=0}^{N(BT)} a_k \psi_k(t) \Big\|^2 \le (1 + \delta)\, \varepsilon_T^2$, where $N(BT) = [2BT] + C_3 \log^{+}(2BT) + C_4$. In other words, for any given band-limited signal, $\lim_{T \to \infty} N(BT)/T = 2B$. This result gives back the common understanding that the essential number of dimensions of a band-limited signal, approximately concentrated in a time interval of duration T, scales as 2BT as the time-bandwidth product increases. Remark: the theorem assumes the use of the optimal waveforms (prolate spheroidal wave functions).

11 Degrees of freedom. Extensions. The prolate spheroidal wave functions are the eigenfunctions of the linear time-varying system obtained by cascading a time-limiting window $\mathrm{rect}_T(t)$ and an ideal lowpass filter H(f) with band $(-B, B)$: $x(t) \to \mathrm{rect}_T(t) \to H(f) \to s(t)$. More generally, a class of signals can be defined as the set of possible outputs of a linear system $x(t) \to h(t, \tau) \to s(t)$. In general, not much is known about the eigenfunctions of (non-normal) linear operators. However, in the case of underspread systems, good, although approximate, models are available.

12 Degrees of freedom. Extensions. Suppose $h(t, \tau) \in L^2$; then $h(t, \tau) = \sum_{i=0}^{\infty} \sigma_i\, u_i(t)\, v_i^{*}(\tau)$, where $u_i(t)$ and $v_i(\tau)$ are the left and right singular functions associated with the singular value $\sigma_i$. If the system is underspread, the left singular functions can be well approximated as $u_i(t) = \sum_{m} A_{i,m}(t)\, e^{j\varphi_{i,m}(t)}$, where the instantaneous frequencies $f_{i,m}(t) = \frac{1}{2\pi} \frac{d\varphi_{i,m}(t)}{dt}$ follow the level curves of the transfer function, $|H(t, f_{i,m}(t))|^2 = \sigma_i^2$, and the amplitudes $A_{i,m}(t)$ are determined by $H(t, f)$ along those curves.

13 Degrees of freedom. Example of eigenfunctions of underspread LTV systems (figure: eigenfunctions of the form $e^{j\varphi_{i,m}(t)}$, each following its instantaneous frequency $f_{i,m}(t)$ along the level curves $|H(t, f)|^2 = \sigma_i^2$).

14 Degrees of freedom. Are all levels possible? No: the only admissible levels $\sigma_i$ are such that the instantaneous frequencies respect an area rule in the time-frequency plane (figure). A natural discretization is inherent to the system.

15 Degrees of freedom. Example: multipath channel affected by delays and Doppler shifts (figures: contour level plots of $|H(t, f)|^2$, and the PSWVDs of the functions associated with each level, shown as time-frequency plots). PSWVD = pseudo-smoothed Wigner-Ville distribution with reassignment.

16 Degrees of freedom. Example: PSWVD of prolate spheroidal wave functions (figure).

17 Degrees of freedom. Studying the eigenfunctions of LTV systems opens the door to an infinite variety of tilings of the time-frequency domain (figures: Gabor (STFT), wavelet, and eigenfunction tilings of the time-frequency plane).

18 Sparse representations. Given an underdetermined linear observation model $y = Ax$, with observation matrix $A \in \mathbb{R}^{m \times n}$ and $m < n$: under what conditions can $x$ be recovered from $y$? Recovery may be possible only if $x$ is sparse and $A$ satisfies some properties. $x$ might not be directly sparse, but there should exist a sparsifying basis $\Psi$, i.e. a transformation $x = \Psi s$ leading to a sparse $s$.

19 Sparse representations. Conditions for unique recovery. Denote by $k$ the sparsity of $x$ and by $S_k^n$ the set of vectors of size $n$ having at most $k$ nonzero entries. The right place to look for the conditions of unique recovery of a sparse vector is the null space of $A$: $\mathcal{N}(A) = \{z : Az = 0\}$. Given any two vectors $x, x' \in S_k^n$, the need to ensure $A(x - x') \ne 0$ requires that $\mathcal{N}(A)$ not contain any vector in $S_{2k}^n$. Spark (Kruskal rank) of a matrix: the spark of a matrix $A$ is the smallest number of columns of $A$ that are linearly dependent. Uniqueness of sparse recovery: for any vector $y \in \mathbb{R}^m$, there exists at most one k-sparse signal $x \in S_k^n$ such that $y = Ax$ if and only if $\mathrm{spark}(A) > 2k$. To guarantee uniqueness, we also need $m \ge 2k$.
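Since the spark is defined combinatorially, it can be computed by brute force for small matrices. A sketch, using a hypothetical 4 x 6 Gaussian matrix: any 4 of its columns are almost surely independent, so its spark is 5 and 2-sparse recovery is unique:

```python
import numpy as np
from itertools import combinations

def spark(A, tol=1e-10):
    """Smallest number of linearly dependent columns (brute force; small n only)."""
    m, n = A.shape
    for size in range(1, n + 1):
        for cols in combinations(range(n), size):
            if np.linalg.matrix_rank(A[:, cols], tol=tol) < size:
                return size
    return n + 1   # all columns independent: spark is conventionally n + 1

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 6))
print(spark(A))          # 5 > 2k for k = 2, so 2-sparse recovery is unique

B = A.copy()
B[:, 1] = 2 * B[:, 0]    # a duplicated (scaled) column
print(spark(B))          # 2: two dependent columns destroy uniqueness
```

The exponential number of column subsets is exactly why spark, like the RIP constants mentioned next, is impractical to compute at scale.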

20 Sparse representations. Alternative conditions for unique recovery: the restricted isometry property (RIP). A matrix $A$ satisfies the RIP of order $k$ if there exists a $\delta_k \in (0, 1)$ such that $(1 - \delta_k)\|x\|_2^2 \le \|Ax\|_2^2 \le (1 + \delta_k)\|x\|_2^2$ for all $x \in S_k^n$. In words, if a matrix satisfies the RIP of order 2k, that matrix approximately preserves the distance between any pair of k-sparse vectors. Problem: computing RIP constants is typically NP-hard!

21 Sparse representations. Recovery algorithms. Original problem (nonconvex): $\min \|x\|_0$ subject to $Ax = y$. Basis pursuit (convex): $\min \|x\|_1$ subject to $Ax = y$. Variant (QCBP): $\min \|x\|_1$ subject to $\|Ax - y\|_2 \le \varepsilon$. Problem: under what conditions do the relaxed problems yield the same solution as the original problem?
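Basis pursuit can be solved as a linear program via the standard split $x = u - v$ with $u, v \ge 0$. A minimal sketch; the sizes, seed, and 2-sparse test vector are hypothetical, and SciPy's `linprog` serves as the LP solver:

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """min ||x||_1 subject to Ax = y, as an LP in (u, v) with x = u - v."""
    m, n = A.shape
    c = np.ones(2 * n)                 # sum(u) + sum(v) = ||x||_1 at the optimum
    res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=y,
                  bounds=[(0, None)] * (2 * n))
    return res.x[:n] - res.x[n:]

rng = np.random.default_rng(1)
n, m = 30, 15
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[[3, 17]] = [1.0, -2.0]          # 2-sparse ground truth
x_hat = basis_pursuit(A, A @ x_true)   # y = A x_true
print(np.max(np.abs(x_hat - x_true)))
```

With m = 15 random measurements of a 2-sparse vector in dimension 30, the convex relaxation typically returns the l0 solution exactly, anticipating the random-matrix result on the next slide.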

22 Sparse representations. The power of randomness. The alternative algorithms require different conditions on the RIP to ensure unique recovery. However, computing the RIP is NP-hard. Interestingly, if $A$ is drawn at random, unique recovery is possible with high probability provided that $m \ge C\, k \ln(n/k)$.

23 Uncertainty principles. Continuous-time signals. Time spread: $\sigma_T^2 = \frac{\int (t - t_0)^2\, |x(t)|^2\, dt}{\int |x(t)|^2\, dt}$, with $t_0 = \frac{\int t\, |x(t)|^2\, dt}{\int |x(t)|^2\, dt}$. Frequency spread: $\sigma_F^2 = \frac{\int (f - f_0)^2\, |X(f)|^2\, df}{\int |X(f)|^2\, df}$, with $f_0 = \frac{\int f\, |X(f)|^2\, df}{\int |X(f)|^2\, df}$. Heisenberg's principle: $\sigma_T\, \sigma_F \ge \frac{1}{4\pi}$. A signal that is perfectly localized in time cannot be perfectly localized in frequency, and vice versa.
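The bound can be checked numerically on a Gaussian pulse, which achieves it with equality. A discretized sketch (the grid sizes are illustrative):

```python
import numpy as np

t = np.linspace(-20, 20, 2**14)
dt = t[1] - t[0]
x = np.exp(-np.pi * t**2)        # Gaussian: the equality case of the principle

X = np.fft.fftshift(np.fft.fft(x)) * dt
f = np.fft.fftshift(np.fft.fftfreq(t.size, dt))

def spread(u, w):
    """Root-mean-square width of |w|^2 along the axis u."""
    p = np.abs(w)**2 / np.sum(np.abs(w)**2)
    u0 = np.sum(u * p)
    return np.sqrt(np.sum((u - u0)**2 * p))

prod = spread(t, x) * spread(f, X)
print(prod, 1 / (4 * np.pi))     # equal up to discretization error
```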

24 Uncertainty principles. Alternative approach (Slepian-Landau-Pollak). Define the time and frequency spreads as the dimensions T and W of the intervals in time and frequency such that a given percentage of energy falls within them: $\frac{\int_{t_0 - T/2}^{t_0 + T/2} |x(t)|^2\, dt}{\int |x(t)|^2\, dt} = \alpha^2$, $\frac{\int_{f_0 - W/2}^{f_0 + W/2} |X(f)|^2\, df}{\int |X(f)|^2\, df} = \beta^2$. Not all pairs $(\alpha, \beta)$ are admissible. The aim of the uncertainty principle is to find the region of all admissible pairs.

25 Uncertainty principles. Uncertainty principle for sparse vectors. Given two orthonormal bases $\{\varphi_\ell, 1 \le \ell \le n\}$ and $\{\psi_\ell, 1 \le \ell \le n\}$, any vector $x$ can be expanded uniquely in terms of each one of these bases as $x = \sum_{\ell=1}^{n} a_\ell \varphi_\ell = \sum_{\ell=1}^{n} b_\ell \psi_\ell$. Setting $A = \|a\|_0$ and $B = \|b\|_0$, uncertainty relations show that A and B cannot be small simultaneously. Mutual coherence: $\mu(\Phi, \Psi) = \max_{\ell, r} |\langle \varphi_\ell, \psi_r \rangle|$. Thm: $\frac{A + B}{2} \ge \sqrt{AB} \ge \frac{1}{\mu(\Phi, \Psi)}$.

26 Uncertainty principles. Uncertainty principle for sparse vectors. If $\Phi$ and $\Psi$ are chosen as the cardinal (identity) basis and the Fourier basis, then $\mu(\Phi, \Psi) = 1/\sqrt{n}$. In such a case, $\frac{A + B}{2} \ge \sqrt{AB} \ge \sqrt{n}$. Thm: let $A = [\Phi, \Psi]$ be the concatenation of two orthonormal bases, where both matrices are unitary and of size m x m. If $k < 1/\mu(\Phi, \Psi) = 1/\mu(A)$, then, for each measurement vector $y \in \mathbb{R}^m$, there exists at most one k-sparse vector $x$ such that $y = Ax$.
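The spike/Fourier coherence value is easy to verify numerically. A sketch with the arbitrary size n = 64:

```python
import numpy as np

n = 64
Phi = np.eye(n)                              # cardinal (spike) basis
Psi = np.fft.fft(np.eye(n)) / np.sqrt(n)     # unitary DFT basis

# Mutual coherence: largest inner-product magnitude between the two bases.
mu = np.max(np.abs(Phi.conj().T @ Psi))
print(mu, 1 / np.sqrt(n))                    # both equal 0.125 for n = 64
```

Every entry of the unitary DFT matrix has modulus exactly $1/\sqrt{n}$, the smallest coherence two orthonormal bases can achieve, which is why spikes and sinusoids are the canonical "maximally incoherent" pair.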

27 Uncertainty principles. Is there a good common basis for spikes and sinusoids? Numerical example (figure).

28 Signals on graphs. Given a graph G(V, E), a graph signal is defined as a mapping $f : V \to \mathbb{R}$ from the set of vertices to the reals, i.e. a vector of dimension equal to the cardinality of V. Motivating examples: wireless sensor networks (temperature, stress, ...); biological networks (molecule concentration, gene activation, ...); social networks (political orientation, ranking, ...). Basic notation: adjacency matrix $A$; Laplacian matrix $L = \mathrm{diag}\{A\mathbf{1}\} - A$.

29 Signals on graphs. Graph Fourier Transform (GFT). Laplacian decomposition [Pes08, ZhuRab12, ShuNarFroOrtVan13, ...]: given $L = U \Lambda U^T = \sum_{i=1}^{N} \lambda_i u_i u_i^T$, GFT: $\hat{x} = U^T x$; inverse GFT: $x = U \hat{x}$. Adjacency decomposition [SanMou13, CheVarSanKov15, ...]: given $A = V J V^{-1}$, GFT: $\hat{x} = V^{-1} x$; inverse GFT: $x = V \hat{x}$. The key point is to find a basis that admits a compact representation: $x = Us$, with $s$ sparse.
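The Laplacian-based GFT can be sketched on a small hypothetical graph, here a 6-node path:

```python
import numpy as np

N = 6
A = np.diag(np.ones(N - 1), 1) + np.diag(np.ones(N - 1), -1)   # path graph
L = np.diag(A.sum(axis=1)) - A                                  # Laplacian

lam, U = np.linalg.eigh(L)        # graph frequencies and GFT basis
x = np.arange(N, dtype=float)     # a signal on the vertices

x_hat = U.T @ x                   # GFT
x_rec = U @ x_hat                 # inverse GFT
print(np.max(np.abs(x_rec - x)))  # 0 up to roundoff: U is orthonormal
```

Because L is symmetric, eigh returns an orthonormal U, so analysis and synthesis are exact inverses, just as U and U^T are in the formulas above.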

30 Signals on graphs. Graph Fourier Transform (GFT). Note: if the graph is circulant, the eigenvectors of the Laplacian coincide with the Fourier basis and the GFT boils down to the conventional DFT. Discrete-time signal processing can thus be seen as a particular example of graph signal processing.
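A numerical sketch of this remark on a hypothetical 8-node cycle (ring) graph, whose Laplacian is circulant and therefore diagonalized by the DFT matrix:

```python
import numpy as np

N = 8
A = np.roll(np.eye(N), 1, axis=1) + np.roll(np.eye(N), -1, axis=1)  # cycle graph
L = np.diag(A.sum(axis=1)) - A                                       # circulant

F = np.fft.fft(np.eye(N)) / np.sqrt(N)   # unitary DFT matrix
D = F @ L @ F.conj().T                   # diagonal iff DFT vectors are eigenvectors

off_diag = np.max(np.abs(D - np.diag(np.diag(D))))
print(off_diag)                          # ~0: the GFT coincides with the DFT
```

The diagonal entries are the graph frequencies $2 - 2\cos(2\pi k / N)$, which is the classical DFT-domain response of the second-difference operator.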

31 Signals on graphs. Examples of GFT (figures: a temperature field plotted over latitude and longitude, and its GFT).

32 Signals on graphs. Examples of GFT (figures: the temperature field and the Fiedler vector, both plotted over latitude and longitude).

33 Signals on graphs. Examples of GFT (figures: a noisy temperature field and its reconstruction with 2 eigenvectors, over latitude and longitude).

34 Signals on graphs. Examples of GFT (figures: the noisy temperature field and its reconstruction with 24 eigenvectors, over latitude and longitude).

35 Signals on graphs. Given a graph G(V, E) and a subset of vertices $S \subseteq V$, we define: the vertex-limiting operator (projector onto the set S): $D = \mathrm{diag}(d_{11}, \ldots, d_{NN})$, where $d_{ii} = 1$ if $i \in S$ and $d_{ii} = 0$ if $i \notin S$; the band-limiting operator (projector onto the frequency set F): $B = U \Sigma U^T$, where $\Sigma$ is a diagonal matrix selecting the frequency indices, i.e. $\Sigma_{ii} = 1$ if $i \in F$ and $\Sigma_{ii} = 0$ if $i \notin F$, and U is the matrix whose columns are the basis of the signal expansion (e.g., eigenvectors of L, or eigenvectors of A, ...).

36 Signals on graphs. Perfect localization conditions. Def.: a vector $x$ is perfectly localized over the vertex subset S if $Dx = x$, and it is perfectly band-limited over F if $Bx = x$. Theorem: a vector $x$ is perfectly localized over both the vertex set S and the frequency set F if and only if $\lambda_{\max}(BDB) = 1$. In such a case, $x$ is the eigenvector associated with the unit eigenvalue of $BDB$. Equivalently, $\|BD\|_2 = 1$; $\|DB\|_2 = 1$.

37 Signals on graphs. Uncertainty principle. Define the concentration measures $\frac{\|Dx\|_2}{\|x\|_2} = \alpha$ and $\frac{\|Bx\|_2}{\|x\|_2} = \beta$. Theorem: the only admissible concentration pairs $(\alpha, \beta)$ belong to the region specified by the following inequalities: $\cos^{-1}\alpha + \cos^{-1}\beta \ge \cos^{-1}\sigma_{\max}(BD)$; $\cos^{-1}\sqrt{1 - \alpha^2} + \cos^{-1}\beta \ge \cos^{-1}\sigma_{\max}(B\bar{D})$; $\cos^{-1}\alpha + \cos^{-1}\sqrt{1 - \beta^2} \ge \cos^{-1}\sigma_{\max}(\bar{B}D)$; $\cos^{-1}\sqrt{1 - \alpha^2} + \cos^{-1}\sqrt{1 - \beta^2} \ge \cos^{-1}\sigma_{\max}(\bar{B}\bar{D})$, where $\bar{D}$ and $\bar{B}$ represent the projectors onto the complement sets of S and F.

38 Signals on graphs. Uncertainty principle. Example of the admissible region (figure: the $(\alpha^2, \beta^2)$ region bounded by the curves associated with $\sigma_{\max}(BD)$, $\sigma_{\max}(B\bar{D})$, $\sigma_{\max}(\bar{B}D)$ and $\sigma_{\max}(\bar{B}\bar{D})$). The maximum sum concentration is achieved with $\alpha^2 = \beta^2 = \frac{1}{2}\left(1 + \sigma_{\max}(BD)\right)$. Note: a corner curve shrinks to the corner point in case of perfect localization over the corresponding domain.

39 Signals on graphs. Sampling. Let us denote by $r = Ds$ the sampled signal. Theorem: given any band-limited signal $s$, i.e. satisfying $Bs = s$, it is possible to recover $s$ from its sampled version $r$ if and only if $\|\bar{D}B\|_2 < 1$. In words, a band-limited signal can be perfectly reconstructed iff there exists no band-limited signal, living on the same frequency set F, that is perfectly localized on the complement vertex set $\bar{S}$.

40 Signals on graphs. Sampling and uncertainty principle. The upper left corner of the uncertainty region specifies the conditions for signal recovery from its samples: given $\beta^2 = 1$, we need to stay away from the point $\alpha^2 = 0$, i.e. no band-limited signal may be perfectly concentrated on the complement of the sampling set; this is exactly the condition $\|\bar{D}B\|_2 < 1$. More generally, if $\beta^2 < 1$, the concentration pair must stay strictly inside the admissible region determined by $\sigma_{\max}(B\bar{D})$.

41 Signals on graphs. Reconstruction algorithms. A.1: $\hat{s} = (I - \bar{D}B)^{-1} r$. A.2 (alternating projection): $r^{(0)} = r$, $r^{(k)} = r + \bar{D}B\, r^{(k-1)}$. A.3: $\hat{s} = \sum_{i=1}^{|F|} \frac{1}{\lambda_i} \langle Dr, \psi_i \rangle\, \psi_i$, where $\lambda_i, \psi_i$ are the solutions of $BDB\, \psi_i = \lambda_i \psi_i$.
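Algorithm A.1 and the recovery condition can be sketched numerically on a hypothetical path graph, with N = 12 vertices, band F equal to the first 4 Laplacian frequencies, and samples taken on the first 8 vertices:

```python
import numpy as np

N = 12
A = np.diag(np.ones(N - 1), 1) + np.diag(np.ones(N - 1), -1)   # path graph
L = np.diag(A.sum(axis=1)) - A

lam, U = np.linalg.eigh(L)
UF = U[:, :4]                       # band-limited subspace (|F| = 4)
B = UF @ UF.T                       # band-limiting projector

d = np.zeros(N); d[:8] = 1.0        # sampling set S = {0, ..., 7}, |S| >= |F|
D = np.diag(d)
Dbar = np.eye(N) - D                # projector onto the complement vertex set

s = UF @ np.array([1.0, -0.5, 0.3, 0.2])           # band-limited signal: Bs = s
r = D @ s                                           # sampled signal

cond = np.linalg.norm(Dbar @ B, 2)                  # recovery condition: < 1
s_hat = np.linalg.solve(np.eye(N) - Dbar @ B, r)    # algorithm A.1
print(cond, np.max(np.abs(s_hat - s)))
```

The identity $(I - \bar{D}B)s = s - \bar{D}s = Ds = r$ for band-limited $s$ is what makes A.1 exact, and the geometric series behind it is precisely the alternating projection A.2.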

42 Signals on graphs. Not all samples are equal.

43 Signals on graphs. Not all samples are equal (figures: NMSE in dB vs. number of samples, for different sampling-set selection strategies: Random, MaxFro, MinUniSet, MaxSigMin, MaxVol, MinPinv, Exhaustive), and going random can be extremely bad!

44 Signals on graphs. Question: is perfect recovery possible in the presence of (very large) spiky noise? Given the observation model $r = s + Dn$, apply $\ell_1$-norm minimization: $\hat{s} = \arg\min_{s \in \mathcal{B}} \|r - s\|_1$.

45 Signals on graphs. Question: is perfect recovery possible in the presence of (very large) spiky noise? Numerical results (figure: MSE in dB vs. number of noisy samples $|S|$, for several bandwidths $|F|$, including $|F| = 5$). Answer: yes, provided that the number of samples is sufficiently larger than the number of noise samples.

46 Conclusion. The real number of degrees of freedom is the number of independent disciplines that Shannon mastered and brought to synthesis: communication theory, switching circuits, biology, ... Having had to invent Information Theory almost from scratch looks like a necessary occurrence of an incredibly brilliant mind. Sampling theory is still well alive; its little kids, Compressive Sensing, Finite Rate of Innovation Sampling, and Union of Subspaces Sampling, are growing well.

47 Conclusion. Developments. Information Theory and Biology: what is it that makes living systems strive for decreasing (local) entropy? Information Theory and Topology: algebraic topology should provide the tools to classify qualitatively different information sources. Topological Signal Processing and Machine Learning: graph-based methods are just the simplest way to capture two-way relations among variables; multiway relations are the general setting. Information Theory and Physics: quantum gravity may receive a significant contribution from IT approaches. Semantic Information Theory.


Lecture 22: More On Compressed Sensing Lecture 22: More On Compressed Sensing Scribed by Eric Lee, Chengrun Yang, and Sebastian Ament Nov. 2, 207 Recap and Introduction Basis pursuit was the method of recovering the sparsest solution to an

More information

Optimization for Compressed Sensing

Optimization for Compressed Sensing Optimization for Compressed Sensing Robert J. Vanderbei 2014 March 21 Dept. of Industrial & Systems Engineering University of Florida http://www.princeton.edu/ rvdb Lasso Regression The problem is to solve

More information

Design of Spectrally Shaped Binary Sequences via Randomized Convex Relaxations

Design of Spectrally Shaped Binary Sequences via Randomized Convex Relaxations Design of Spectrally Shaped Binary Sequences via Randomized Convex Relaxations Dian Mo Department of Electrical and Computer Engineering University of Massachusetts Amherst, MA 3 mo@umass.edu Marco F.

More information

Sparse Legendre expansions via l 1 minimization

Sparse Legendre expansions via l 1 minimization Sparse Legendre expansions via l 1 minimization Rachel Ward, Courant Institute, NYU Joint work with Holger Rauhut, Hausdorff Center for Mathematics, Bonn, Germany. June 8, 2010 Outline Sparse recovery

More information

Blind Compressed Sensing

Blind Compressed Sensing 1 Blind Compressed Sensing Sivan Gleichman and Yonina C. Eldar, Senior Member, IEEE arxiv:1002.2586v2 [cs.it] 28 Apr 2010 Abstract The fundamental principle underlying compressed sensing is that a signal,

More information

Sparse Optimization Lecture: Sparse Recovery Guarantees

Sparse Optimization Lecture: Sparse Recovery Guarantees Those who complete this lecture will know Sparse Optimization Lecture: Sparse Recovery Guarantees Sparse Optimization Lecture: Sparse Recovery Guarantees Instructor: Wotao Yin Department of Mathematics,

More information

Compressed Sensing. 1 Introduction. 2 Design of Measurement Matrices

Compressed Sensing. 1 Introduction. 2 Design of Measurement Matrices Compressed Sensing Yonina C. Eldar Electrical Engineering Department, Technion-Israel Institute of Technology, Haifa, Israel, 32000 1 Introduction Compressed sensing (CS) is an exciting, rapidly growing

More information

A Survey of Compressive Sensing and Applications

A Survey of Compressive Sensing and Applications A Survey of Compressive Sensing and Applications Justin Romberg Georgia Tech, School of ECE ENS Winter School January 10, 2012 Lyon, France Signal processing trends DSP: sample first, ask questions later

More information

COMPSCI 514: Algorithms for Data Science

COMPSCI 514: Algorithms for Data Science COMPSCI 514: Algorithms for Data Science Arya Mazumdar University of Massachusetts at Amherst Fall 2018 Lecture 8 Spectral Clustering Spectral clustering Curse of dimensionality Dimensionality Reduction

More information

Recent Developments in Compressed Sensing

Recent Developments in Compressed Sensing Recent Developments in Compressed Sensing M. Vidyasagar Distinguished Professor, IIT Hyderabad m.vidyasagar@iith.ac.in, www.iith.ac.in/ m vidyasagar/ ISL Seminar, Stanford University, 19 April 2018 Outline

More information

Sensing systems limited by constraints: physical size, time, cost, energy

Sensing systems limited by constraints: physical size, time, cost, energy Rebecca Willett Sensing systems limited by constraints: physical size, time, cost, energy Reduce the number of measurements needed for reconstruction Higher accuracy data subject to constraints Original

More information

In English, this means that if we travel on a straight line between any two points in C, then we never leave C.

In English, this means that if we travel on a straight line between any two points in C, then we never leave C. Convex sets In this section, we will be introduced to some of the mathematical fundamentals of convex sets. In order to motivate some of the definitions, we will look at the closest point problem from

More information

Graph Partitioning Using Random Walks

Graph Partitioning Using Random Walks Graph Partitioning Using Random Walks A Convex Optimization Perspective Lorenzo Orecchia Computer Science Why Spectral Algorithms for Graph Problems in practice? Simple to implement Can exploit very efficient

More information

Discrete Signal Processing on Graphs: Sampling Theory

Discrete Signal Processing on Graphs: Sampling Theory IEEE TRANS. SIGNAL PROCESS. TO APPEAR. 1 Discrete Signal Processing on Graphs: Sampling Theory Siheng Chen, Rohan Varma, Aliaksei Sandryhaila, Jelena Kovačević arxiv:153.543v [cs.it] 8 Aug 15 Abstract

More information

LINEAR ALGEBRA QUESTION BANK

LINEAR ALGEBRA QUESTION BANK LINEAR ALGEBRA QUESTION BANK () ( points total) Circle True or False: TRUE / FALSE: If A is any n n matrix, and I n is the n n identity matrix, then I n A = AI n = A. TRUE / FALSE: If A, B are n n matrices,

More information

Cambridge University Press The Mathematics of Signal Processing Steven B. Damelin and Willard Miller Excerpt More information

Cambridge University Press The Mathematics of Signal Processing Steven B. Damelin and Willard Miller Excerpt More information Introduction Consider a linear system y = Φx where Φ can be taken as an m n matrix acting on Euclidean space or more generally, a linear operator on a Hilbert space. We call the vector x a signal or input,

More information

Inverse Power Method for Non-linear Eigenproblems

Inverse Power Method for Non-linear Eigenproblems Inverse Power Method for Non-linear Eigenproblems Matthias Hein and Thomas Bühler Anubhav Dwivedi Department of Aerospace Engineering & Mechanics 7th March, 2017 1 / 30 OUTLINE Motivation Non-Linear Eigenproblems

More information

INDUSTRIAL MATHEMATICS INSTITUTE. B.S. Kashin and V.N. Temlyakov. IMI Preprint Series. Department of Mathematics University of South Carolina

INDUSTRIAL MATHEMATICS INSTITUTE. B.S. Kashin and V.N. Temlyakov. IMI Preprint Series. Department of Mathematics University of South Carolina INDUSTRIAL MATHEMATICS INSTITUTE 2007:08 A remark on compressed sensing B.S. Kashin and V.N. Temlyakov IMI Preprint Series Department of Mathematics University of South Carolina A remark on compressed

More information

Going off the grid. Benjamin Recht Department of Computer Sciences University of Wisconsin-Madison

Going off the grid. Benjamin Recht Department of Computer Sciences University of Wisconsin-Madison Going off the grid Benjamin Recht Department of Computer Sciences University of Wisconsin-Madison Joint work with Badri Bhaskar Parikshit Shah Gonnguo Tang We live in a continuous world... But we work

More information

TIME-FREQUENCY ANALYSIS: TUTORIAL. Werner Kozek & Götz Pfander

TIME-FREQUENCY ANALYSIS: TUTORIAL. Werner Kozek & Götz Pfander TIME-FREQUENCY ANALYSIS: TUTORIAL Werner Kozek & Götz Pfander Overview TF-Analysis: Spectral Visualization of nonstationary signals (speech, audio,...) Spectrogram (time-varying spectrum estimation) TF-methods

More information

Compressed Sensing and Neural Networks

Compressed Sensing and Neural Networks and Jan Vybíral (Charles University & Czech Technical University Prague, Czech Republic) NOMAD Summer Berlin, September 25-29, 2017 1 / 31 Outline Lasso & Introduction Notation Training the network Applications

More information

Coding the Matrix Index - Version 0

Coding the Matrix Index - Version 0 0 vector, [definition]; (2.4.1): 68 2D geometry, transformations in, [lab]; (4.15.0): 196-200 A T (matrix A transpose); (4.5.4): 157 absolute value, complex number; (1.4.1): 43 abstract/abstracting, over

More information

A Friendly Guide to the Frame Theory. and Its Application to Signal Processing

A Friendly Guide to the Frame Theory. and Its Application to Signal Processing A Friendly uide to the Frame Theory and Its Application to Signal Processing inh N. Do Department of Electrical and Computer Engineering University of Illinois at Urbana-Champaign www.ifp.uiuc.edu/ minhdo

More information

An Introduction to Sparse Approximation

An Introduction to Sparse Approximation An Introduction to Sparse Approximation Anna C. Gilbert Department of Mathematics University of Michigan Basic image/signal/data compression: transform coding Approximate signals sparsely Compress images,

More information

The Pros and Cons of Compressive Sensing

The Pros and Cons of Compressive Sensing The Pros and Cons of Compressive Sensing Mark A. Davenport Stanford University Department of Statistics Compressive Sensing Replace samples with general linear measurements measurements sampled signal

More information

MA 265 FINAL EXAM Fall 2012

MA 265 FINAL EXAM Fall 2012 MA 265 FINAL EXAM Fall 22 NAME: INSTRUCTOR S NAME:. There are a total of 25 problems. You should show work on the exam sheet, and pencil in the correct answer on the scantron. 2. No books, notes, or calculators

More information

Sparse Sensing in Colocated MIMO Radar: A Matrix Completion Approach

Sparse Sensing in Colocated MIMO Radar: A Matrix Completion Approach Sparse Sensing in Colocated MIMO Radar: A Matrix Completion Approach Athina P. Petropulu Department of Electrical and Computer Engineering Rutgers, the State University of New Jersey Acknowledgments Shunqiao

More information

Data representation and approximation

Data representation and approximation Representation and approximation of data February 3, 2015 Outline 1 Outline 1 Approximation The interpretation of polynomials as functions, rather than abstract algebraic objects, forces us to reinterpret

More information

Mathematical foundations - linear algebra

Mathematical foundations - linear algebra Mathematical foundations - linear algebra Andrea Passerini passerini@disi.unitn.it Machine Learning Vector space Definition (over reals) A set X is called a vector space over IR if addition and scalar

More information

Sparse Solutions of an Undetermined Linear System

Sparse Solutions of an Undetermined Linear System 1 Sparse Solutions of an Undetermined Linear System Maddullah Almerdasy New York University Tandon School of Engineering arxiv:1702.07096v1 [math.oc] 23 Feb 2017 Abstract This work proposes a research

More information

Linear algebra and applications to graphs Part 1

Linear algebra and applications to graphs Part 1 Linear algebra and applications to graphs Part 1 Written up by Mikhail Belkin and Moon Duchin Instructor: Laszlo Babai June 17, 2001 1 Basic Linear Algebra Exercise 1.1 Let V and W be linear subspaces

More information

COMPRESSED SENSING IN PYTHON

COMPRESSED SENSING IN PYTHON COMPRESSED SENSING IN PYTHON Sercan Yıldız syildiz@samsi.info February 27, 2017 OUTLINE A BRIEF INTRODUCTION TO COMPRESSED SENSING A BRIEF INTRODUCTION TO CVXOPT EXAMPLES A Brief Introduction to Compressed

More information

Spin Glass Approach to Restricted Isometry Constant

Spin Glass Approach to Restricted Isometry Constant Spin Glass Approach to Restricted Isometry Constant Ayaka Sakata 1,Yoshiyuki Kabashima 2 1 Institute of Statistical Mathematics 2 Tokyo Institute of Technology 1/29 Outline Background: Compressed sensing

More information

Real Analysis, 2nd Edition, G.B.Folland Elements of Functional Analysis

Real Analysis, 2nd Edition, G.B.Folland Elements of Functional Analysis Real Analysis, 2nd Edition, G.B.Folland Chapter 5 Elements of Functional Analysis Yung-Hsiang Huang 5.1 Normed Vector Spaces 1. Note for any x, y X and a, b K, x+y x + y and by ax b y x + b a x. 2. It

More information

Z Algorithmic Superpower Randomization October 15th, Lecture 12

Z Algorithmic Superpower Randomization October 15th, Lecture 12 15.859-Z Algorithmic Superpower Randomization October 15th, 014 Lecture 1 Lecturer: Bernhard Haeupler Scribe: Goran Žužić Today s lecture is about finding sparse solutions to linear systems. The problem

More information

Computational Methods. Eigenvalues and Singular Values

Computational Methods. Eigenvalues and Singular Values Computational Methods Eigenvalues and Singular Values Manfred Huber 2010 1 Eigenvalues and Singular Values Eigenvalues and singular values describe important aspects of transformations and of data relations

More information

1 Introduction to Compressed Sensing

1 Introduction to Compressed Sensing 1 Introduction to Compressed Sensing Mark A. Davenport Stanford University, Department of Statistics Marco F. Duarte Duke University, Department of Computer Science Yonina C. Eldar Technion, Israel Institute

More information

Conditions for Robust Principal Component Analysis

Conditions for Robust Principal Component Analysis Rose-Hulman Undergraduate Mathematics Journal Volume 12 Issue 2 Article 9 Conditions for Robust Principal Component Analysis Michael Hornstein Stanford University, mdhornstein@gmail.com Follow this and

More information

Compressed Sensing: Lecture I. Ronald DeVore

Compressed Sensing: Lecture I. Ronald DeVore Compressed Sensing: Lecture I Ronald DeVore Motivation Compressed Sensing is a new paradigm for signal/image/function acquisition Motivation Compressed Sensing is a new paradigm for signal/image/function

More information

Finite Frames and Graph Theoretical Uncertainty Principles

Finite Frames and Graph Theoretical Uncertainty Principles Finite Frames and Graph Theoretical Uncertainty Principles (pkoprows@math.umd.edu) University of Maryland - College Park April 13, 2015 Outline 1 Motivation 2 Definitions 3 Results Outline 1 Motivation

More information

Analysis Preliminary Exam Workshop: Hilbert Spaces

Analysis Preliminary Exam Workshop: Hilbert Spaces Analysis Preliminary Exam Workshop: Hilbert Spaces 1. Hilbert spaces A Hilbert space H is a complete real or complex inner product space. Consider complex Hilbert spaces for definiteness. If (, ) : H H

More information

Modulation & Coding for the Gaussian Channel

Modulation & Coding for the Gaussian Channel Modulation & Coding for the Gaussian Channel Trivandrum School on Communication, Coding & Networking January 27 30, 2017 Lakshmi Prasad Natarajan Dept. of Electrical Engineering Indian Institute of Technology

More information

CS 229r: Algorithms for Big Data Fall Lecture 19 Nov 5

CS 229r: Algorithms for Big Data Fall Lecture 19 Nov 5 CS 229r: Algorithms for Big Data Fall 215 Prof. Jelani Nelson Lecture 19 Nov 5 Scribe: Abdul Wasay 1 Overview In the last lecture, we started discussing the problem of compressed sensing where we are given

More information

Recovering overcomplete sparse representations from structured sensing

Recovering overcomplete sparse representations from structured sensing Recovering overcomplete sparse representations from structured sensing Deanna Needell Claremont McKenna College Feb. 2015 Support: Alfred P. Sloan Foundation and NSF CAREER #1348721. Joint work with Felix

More information

I. Multiple Choice Questions (Answer any eight)

I. Multiple Choice Questions (Answer any eight) Name of the student : Roll No : CS65: Linear Algebra and Random Processes Exam - Course Instructor : Prashanth L.A. Date : Sep-24, 27 Duration : 5 minutes INSTRUCTIONS: The test will be evaluated ONLY

More information

Optimization-based sparse recovery: Compressed sensing vs. super-resolution

Optimization-based sparse recovery: Compressed sensing vs. super-resolution Optimization-based sparse recovery: Compressed sensing vs. super-resolution Carlos Fernandez-Granda, Google Computational Photography and Intelligent Cameras, IPAM 2/5/2014 This work was supported by a

More information

ECE 8201: Low-dimensional Signal Models for High-dimensional Data Analysis

ECE 8201: Low-dimensional Signal Models for High-dimensional Data Analysis ECE 8201: Low-dimensional Signal Models for High-dimensional Data Analysis Lecture 7: Matrix completion Yuejie Chi The Ohio State University Page 1 Reference Guaranteed Minimum-Rank Solutions of Linear

More information

Spectral Processing. Misha Kazhdan

Spectral Processing. Misha Kazhdan Spectral Processing Misha Kazhdan [Taubin, 1995] A Signal Processing Approach to Fair Surface Design [Desbrun, et al., 1999] Implicit Fairing of Arbitrary Meshes [Vallet and Levy, 2008] Spectral Geometry

More information

Sparsity and Compressed Sensing

Sparsity and Compressed Sensing Sparsity and Compressed Sensing Jalal Fadili Normandie Université-ENSICAEN, GREYC Mathematical coffees 2017 Recap: linear inverse problems Dictionary Sensing Sensing Sensing = y m 1 H m n y y 2 R m H A

More information

Least squares regularized or constrained by L0: relationship between their global minimizers. Mila Nikolova

Least squares regularized or constrained by L0: relationship between their global minimizers. Mila Nikolova Least squares regularized or constrained by L0: relationship between their global minimizers Mila Nikolova CMLA, CNRS, ENS Cachan, Université Paris-Saclay, France nikolova@cmla.ens-cachan.fr SIAM Minisymposium

More information

Graph theoretic uncertainty principles

Graph theoretic uncertainty principles Graph theoretic uncertainty principles John J. Benedetto and Paul J. Koprowski Norbert Wiener Center Department of Mathematics University of Maryland, College Park http://www.norbertwiener.umd.edu Acknowledgements

More information

LINEAR ALGEBRA SUMMARY SHEET.

LINEAR ALGEBRA SUMMARY SHEET. LINEAR ALGEBRA SUMMARY SHEET RADON ROSBOROUGH https://intuitiveexplanationscom/linear-algebra-summary-sheet/ This document is a concise collection of many of the important theorems of linear algebra, organized

More information

Least Sparsity of p-norm based Optimization Problems with p > 1

Least Sparsity of p-norm based Optimization Problems with p > 1 Least Sparsity of p-norm based Optimization Problems with p > Jinglai Shen and Seyedahmad Mousavi Original version: July, 07; Revision: February, 08 Abstract Motivated by l p -optimization arising from

More information

Lecture Notes 5: Multiresolution Analysis

Lecture Notes 5: Multiresolution Analysis Optimization-based data analysis Fall 2017 Lecture Notes 5: Multiresolution Analysis 1 Frames A frame is a generalization of an orthonormal basis. The inner products between the vectors in a frame and

More information

10 Transfer Matrix Models

10 Transfer Matrix Models MIT EECS 6.241 (FALL 26) LECTURE NOTES BY A. MEGRETSKI 1 Transfer Matrix Models So far, transfer matrices were introduced for finite order state space LTI models, in which case they serve as an important

More information

Multiresolution schemes

Multiresolution schemes Multiresolution schemes Fondamenti di elaborazione del segnale multi-dimensionale Multi-dimensional signal processing Stefano Ferrari Università degli Studi di Milano stefano.ferrari@unimi.it Elaborazione

More information