Learning-based compression


1 Learning-based compression. Volkan Cevher, Laboratory for Information and Inference Systems (LIONS). [Figure: Value / Knowledge, from Nature]

2 A paradigm shift in data generation [chart: data generation over hours]

3 A paradigm shift in data generation [chart, continued]

4 Key tool: Compression

5 A familiar example: 12 Mpix at 24 bits/pixel = 36 MB per image. Power: OK. Storage: NO.

6 A familiar example: 12 Mpix at 24 bits/pixel = 36 MB per image. Power: OK. Storage: NO (only 1000 images + no apps).

7 A familiar example: 12 Mpix at 24 bits/pixel = 36 MB. Power: OK. Storage: OK. Compression, actual: 1.4 MB, so ~26,000 images (36/1.4 x 1000) + no apps (vs. 1000 images).

8 A familiar example: Bandwidth: OK. 12 Mpix at 24 bits/pixel = 36 MB. Power: OK. Storage: OK. Compression, actual: 1.4 MB (~26,000 images vs. 1000 images).

9 Compression helps! Bandwidth: OK. 12 Mpix at 24 bits/pixel = 36 MB. Power: OK. Storage: OK. Compression, actual: 1.4 MB (~26,000 images vs. 1000 images).

10 Compression: The basics. Signal $x^\natural \in \mathbb{R}^p$ and its transform coefficients $y^\natural = \Psi x^\natural$.

11 Compression: The basics. $y^\natural = \Psi x^\natural$ with $\Psi \in \mathbb{R}^{p \times p}$. JPEG2000: wavelets, sparsity.

12 Compression: The basics. $y^\natural = \Psi x^\natural$. Add decades of math + engineering. JPEG2000: wavelets, sparsity.

13 Compression: The basics. $y^\natural = \Psi x^\natural$. Strategy: encode $b = P_\Omega \Psi x^\natural \in \mathbb{R}^n$, where $P_\Omega \in \{0,1\}^{n \times p}$ is a subset selector; decode $\hat{x} = \Psi^{-1} P_\Omega^T b$. JPEG2000: wavelets, sparsity.

14 Compression: The basics. Same strategy: encode $b = P_\Omega \Psi x^\natural$; decode $\hat{x} = \Psi^{-1} P_\Omega^T b$. JPEG: DCT, sparsity.
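As a toy illustration of this encode/decode strategy (not the JPEG pipeline itself), here is a minimal numpy/scipy sketch with an orthonormal DCT playing the role of $\Psi$ and a keep-the-largest-coefficients subset selector; the test signal and keep ratio are illustrative assumptions:

```python
import numpy as np
from scipy.fft import dct, idct  # orthonormal DCT plays the role of Psi

p = 256
x = np.cumsum(np.random.randn(p))      # smooth-ish test signal x_natural
y = dct(x, norm="ortho")               # y_natural = Psi x_natural

n = p // 8                             # keep n of p coefficients
omega = np.argsort(-np.abs(y))[:n]     # subset selector P_Omega: largest coefficients
b = y[omega]                           # encode: b = P_Omega Psi x_natural

y_hat = np.zeros(p)                    # decode: x_hat = Psi^{-1} P_Omega^T b
y_hat[omega] = b
x_hat = idct(y_hat, norm="ortho")

rel_err = np.linalg.norm(x_hat - x) / np.linalg.norm(x)
print(f"kept {n}/{p} coefficients, relative error {rel_err:.3f}")
```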

15 The core challenge: Can we automatically teach any sensor how to compress its own data well?

16 Compression helps! Bandwidth: OK. 12 Mpix at 24 bits/pixel = 36 MB. Power: OK. Storage: OK. Compression, actual: 1.4 MB. Caveats for generalization: we collected the full data & performed a full transformation!

17 The core challenge: Can we automatically teach any sensor how to compress its own data well? Our twist: Compress without transforming or sampling the whole data!

19 (Old) Compressive sensing (CS). Goal: directly obtain the compressed version; off-load the difficulty to computation.
- Encoding model: $b = P_\Omega F x^\natural$, where $x^\natural$ is $s$-sparse in $\Psi$.
- Decoding algorithm: convex optimization, $\hat{x} = \arg\min_x \{\|\Psi x\|_1 : b = P_\Omega F x\}$.
- Theorem: if $n \gtrsim s \log(p)$ and $\Omega$ is sufficiently random, then $\hat{x} = x^\natural$ with high probability [Candès et al. 2006; …].
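The slide only specifies convex optimization; as one illustrative solver, here is a minimal ISTA sketch for the Lagrangian form $\min_x \tfrac12\|b - Ax\|_2^2 + \lambda\|x\|_1$, with a random Gaussian $A$ standing in for $P_\Omega F$ and identity $\Psi$. All sizes, the penalty $\lambda$, and the step size are assumptions for the toy:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, s = 400, 100, 8
A = rng.standard_normal((n, p)) / np.sqrt(n)    # stands in for P_Omega F
x_true = np.zeros(p)
x_true[rng.choice(p, s, replace=False)] = rng.standard_normal(s)
b = A @ x_true                                  # compressed measurements

lam = 0.01
step = 1.0 / np.linalg.norm(A, 2) ** 2          # 1/L for the smooth part
x = np.zeros(p)
for _ in range(500):                            # ISTA: gradient step + soft threshold
    z = x - step * (A.T @ (A @ x - b))
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```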

20 Challenges to the old CS. High computational cost & latency: $O(n^2 p^{1.5})$. Oversampling: $p$ vs. $s$ vs. $s \log(p)$. Dictionary $\Psi$: hidden need for training data.

21 "When solving a given problem, try to avoid a more general problem as an intermediate step." Vladimir Vapnik [main developer of statistical learning theory (along with Alexey Chervonenkis)]. Given training data, we will bypass dictionary learning & design the whole compressive sampling system directly.

22 Statistical Learning Theory meets Compressive Sensing Learning data triage (simplified)

23 A statistical learning framework for CS with sample signals.
- Probabilistic model: $y = P_\Omega F x$.
- $x$ follows some unknown probability distribution $\mathbb{P}$.
- Sample signals: $\{x_i\}_{i \le m}$, i.i.d. random vectors following $\mathbb{P}$.
- Fix an estimator: $\hat{x} = F^H P_\Omega^T y = (P_\Omega F)^\dagger y$.
- Loss function: $L(x; \Omega) = \|\hat{x} - x\|_2^2 / \|x\|_2^2$.
- Goal: fix $|\Omega| = n$. Find a sub-sampling pattern $\Omega$, given $\{x_i\}_{i \le m}$, such that the risk $\mathbb{E}\, L(x; \Omega)$ is minimized.

24 (Same framework; the "simplification is here" annotations mark fixing the linear estimator $\hat{x} = F^H P_\Omega^T y$ and fixing the normalized $\ell_2$ loss.)
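A minimal numpy sketch of this model, assuming a unitary FFT for $F$ and a random index set for $\Omega$ (both illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
p, n = 128, 32
omega = rng.choice(p, n, replace=False)     # sub-sampling pattern Omega, |Omega| = n

def encode(x):                              # y = P_Omega F x
    return np.fft.fft(x, norm="ortho")[omega]

def decode(y):                              # x_hat = F^H P_Omega^T y (linear estimator)
    z = np.zeros(p, dtype=complex)
    z[omega] = y
    return np.fft.ifft(z, norm="ortho")

x = rng.standard_normal(p)
x_hat = decode(encode(x))
loss = np.linalg.norm(x_hat - x) ** 2 / np.linalg.norm(x) ** 2   # L(x; Omega)
print(f"normalized loss: {loss:.3f}")
```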

25 Empirical risk minimization-i If P were known, the optimal optimization problem: is given by solving the discrete opt œ arg min : Æn E L(x ; ) Proposition We have L(x ; )=1 ÎP Fx Î 2 2 Îx Î 2 2 =: 1 (x ; ). Therefore, we can write opt œ arg max : Æn E (x ; ), and we have E L(x ; opt) = min : Æn E L(x ; )=1 E ÎP opt Fx Î 2 2 Îx Î 2 2 =: 1 Á P.

26 Empirical risk minimization-ii While P is unknown, we have i.i.d. samples {x i } iæn from P. Hence we may consider the empirical risk minimizer given by: ˆ œ arg max : Æn 1 m ÿ iæm (x i; ). Since in general ˆ, opt, we can only expect that E L(x ; ˆ) Æ E L(x ; opt)+á m =1 Á P + Á m.

27 Statistical analysis. Recall that $\mathbb{E}\, L(x; \hat{\Omega}) \le \mathbb{E}\, L(x; \Omega_{\mathrm{opt}}) + \varepsilon_m = 1 - \varepsilon_{\mathbb{P}} + \varepsilon_m$.
Theorem. For any $\delta \in (0, 1)$, we have
$\varepsilon_m \le \sqrt{\frac{2}{m} \left[ \log \binom{p}{n} + \log \frac{1}{\delta} \right]}$
with probability at least $1 - \delta$.
Corollary. The number of sample signals required is $O(n \log p)$.
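The corollary follows from the standard binomial-coefficient bound; as a sketch:

```latex
\log\binom{p}{n} \le n \log\frac{ep}{n} = O(n \log p),
\quad\text{so}\quad
\varepsilon_m \le \epsilon \ \text{ requires } \
m = O\!\left(\frac{n \log p}{\epsilon^2}\right) \ \text{sample signals.}
```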

28 Solving the discrete optimization problem. Define $\bar{x}_i = x_i / \|x_i\|_2$. Recall that
$\hat{\Omega} \in \arg\max_{\Omega : |\Omega| \le n} \sum_{i \le m} \|P_\Omega F x_i\|_2^2 / \|x_i\|_2^2 = \arg\max_{\Omega : |\Omega| \le n} \sum_{i \le m} \|P_\Omega F \bar{x}_i\|_2^2$.
Proposition (Existence of a simple greedy algorithm). Let $f_i$ be the $i$-th row of $F$. We can compute $\hat{\Omega}$ exactly by the following greedy algorithm:
1. For all $i \le p$, compute $v_i = \sum_{j \le m} |\langle f_i, \bar{x}_j \rangle|^2$.
2. Let $\Omega$ be the set of indices of the $n$ largest $v_i$'s.
If $F$ is the Fourier transform, then the computational complexity is $O(mp \log p)$, nearly linear time.
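A minimal numpy sketch of this greedy selection, with the FFT as $F$; the synthetic training signals are an illustrative assumption:

```python
import numpy as np

def learn_pattern(X, n):
    """Greedy LBCS index selection.

    X: (m, p) array of training signals x_i (one per row).
    Returns the indices Omega of the n rows of F that capture
    the most average energy over the training set.
    """
    Xbar = X / np.linalg.norm(X, axis=1, keepdims=True)       # x_i / ||x_i||_2
    V = np.abs(np.fft.fft(Xbar, axis=1, norm="ortho")) ** 2   # |<f_i, xbar_j>|^2
    v = V.sum(axis=0)                                         # v_i, summed over j <= m
    return np.argsort(-v)[:n]                                 # n largest v_i's

rng = np.random.default_rng(2)
m, p, n = 200, 256, 32
X = np.cumsum(rng.standard_normal((m, p)), axis=1)            # smooth-ish synthetic signals
omega = learn_pattern(X, n)
print("learned sub-sampling pattern:", np.sort(omega)[:10], "...")
```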

29 Applications

30 Images - I: Old CS vs. learning-based CS vs. JPEG

31 Images - II: 1 Gpix sensed at a 1 Mpix rate! Opens up the possibility of streaming video at 30 FPS.

32 Wireless neural implants - I. Pipeline: µ-electrode input → AFE → ADC → DSP → TX. $f_s$ = 5 kHz; ~0.3 µW/accumulator; 1 nJ/bit; >30 dB quality.
Power per block:      Stream out   Full comp.   LBCS
  AFE + ADC           10 µW        10 µW        10 µW
  DSP                 0            80 µW        ~2.5 µW
  TX                  50 µW        ~2.5 µW      ~3 µW
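Summing the columns of the table above: streaming out totals 10 + 0 + 50 = 60 µW, full on-node compression totals 10 + 80 + 2.5 = 92.5 µW, and LBCS totals 10 + 2.5 + 3 = 15.5 µW, roughly a 4x power saving over streaming and 6x over full compression.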


34 Wireless neural implants - II. Dataset: billion-sample recordings from ieeg.org. LBCS: indices are learned for the Hadamard basis on the training data. [Table: SNR comparison in dB, for N = 256 and B_i = 10, at several compression rates; LBCS vs. old CS (SHS, BERN, MCS), with MCS n.a. at some rates.]
SHS: Structured Hadamard Sampling [Baldassarre, 15]. BERN: Random Bernoulli [Chen, JSSC 12]. MCS: Multi-Channel Sampling [Shoaran, TBioCAS 15].

35 Wireless neural implants - II. [Figure: pipeline µ-electrode input → AFE → ADC → DSP → TX]

36 Wireless neural implants - II: other trade-offs are possible! Same pipeline and power table as slide 32 (AFE + ADC: 10 µW in all cases; DSP: 0 / 80 µW / ~2.5 µW; TX: 50 µW / ~2.5 µW / ~3 µW for stream out / full comp. / LBCS).

37 Wireless neural implants - III. [Figure: actual circuit implementing the pipeline µ-electrode input → AFE → ADC → DSP → TX]

38 Magnetic Resonance Imaging. Typical scan times (in minutes) per protocol: Brain, Orbits, TMJ, Soft Tissue Neck, Cervical Spine, Upper Extremity, Thoracic Spine, Chest, Abdomen.

39 MRI - multi-coil (4x acceleration)

40 MRI - multi-coil (4x acceleration): VD (variable-density sampling)

41 MRI - multi-coil (4x acceleration): VD vs. LBCS. Decoder: BP with shearlets.

42 MRI - multi-coil (4x acceleration): 6 dB improvement on average.

43 MRI - multi-coil (4x acceleration), Patient #22: LBCS 40.80 dB vs. VD 35.08 dB.

44 MRI - multi-coil (4x acceleration), Patient #29: LBCS 41.39 dB vs. VD 34.80 dB.

45 Learning-based CS: middle-out compression unleashed with machine learning.

LEARNING DATA TRIAGE: LINEAR DECODING WORKS FOR COMPRESSIVE MRI. Yen-Huan Li and Volkan Cevher, Laboratory for Information and Inference Systems, École Polytechnique Fédérale de Lausanne.
