MEASUREMENT MATRIX DESIGN FOR COMPRESSIVE SENSING WITH SIDE INFORMATION AT THE ENCODER

Pingfan Song, João F. C. Mota, Nikos Deligiannis, Miguel Raul Dias Rodrigues

Electronic and Electrical Engineering Department, University College London, UK
Department of Electronics and Informatics, Vrije Universiteit Brussel, Belgium
iMinds vzw, Ghent, Belgium

ABSTRACT

We study the problem of measurement matrix design for Compressive Sensing (CS) when the encoder has access to side information, a signal analogous to the signal of interest. In particular, we propose to incorporate this extra information into the signal acquisition stage via a new design for the measurement matrix. The goal is to reduce the number of encoding measurements while still allowing perfect signal reconstruction at the decoder. The reconstruction performance of the resulting CS system is then analysed in detail, assuming the decoder reconstructs the original signal via Basis Pursuit. Finally, Gaussian width tools are employed to establish a tight theoretical bound on the number of required measurements. Extensive numerical experiments not only validate our approach, but also demonstrate that our design requires fewer measurements for successful signal reconstruction than alternative designs, such as an i.i.d. Gaussian matrix.

Index Terms: measurement matrix design, side information, Compressive Sensing, Basis Pursuit

1. INTRODUCTION

Compressive Sensing (CS) [1, 2] is a signal acquisition paradigm that leverages signal sparsity to acquire signals with far fewer measurements than classical sampling schemes. It has been applied, for example, in medical imaging [3], radar detection [4], sensor networks [5], and compressive video [6]. In many of these applications, one has access to side information, a signal analogous to the signal we want to reconstruct.
For example, in compressive video, past reconstructed frames can be used as side information to reconstruct the current frame [7, 8]; in medical imaging, prior patient scans can also be used as side information in image reconstruction [9]. One of the main benefits of using side information in these applications, where measurements are expensive, is that it allows reducing the number of measurements required for reconstruction. Much research has used side or prior information to obtain even lower acquisition rates in CS. This typically involves modifying the reconstruction procedure, e.g., Basis Pursuit (BP) [10], by using estimates of the support of the signal, either deterministically, as in modified-CS [11-15], probabilistically, as in [16, 17], or in the form of a signal analogous to the signal to be reconstructed, as in l1-l1 and l1-l2 minimization [18-21]. Other work, such as [22], has used side information to perform classification and reconstruction under probabilistic models.

(This work is supported by the China Scholarship Council (CSC), the UCL Overseas Research Scholarship (UCL-ORS), the VUB-UGent-UCL-Duke International Joint Research Group grant, and by EPSRC grant EP/K033166/1.)

Fig. 1. CS with side information at the encoder: the encoder acquires y = A x* and sends it to the decoder, which reconstructs x̂ via optimization.

In all these works, however, side information is used to aid only the sparse reconstruction process, not the acquisition one. In practice, side information may be available both at the encoder and the decoder. In this paper, the goal is to integrate side information into the measurement matrix to aid the signal acquisition process. Note that our approach differs from those where the encoder uses richer models to improve performance, e.g., [23, 24]. We show that, conceptually, our measurement matrix design is equivalent to solving a weighted l1 minimization problem at the decoder. In that sense, our work is closely related to [25], which establishes bounds on the number of measurements required to reconstruct sparse signals using weighted l1 minimization.
Although we use tools similar to the ones in [25] (namely, the concept of Gaussian width [26-29]) to establish related bounds, our bounds are much simpler and, as our experiments show, also much tighter.

Problem statement. Fig. 1 illustrates the situation where side information is available at the encoder. Let x* ∈ R^n be the signal of interest, from which we take m linear measurements y = A x*, where A ∈ R^(m×n) is the measurement matrix. The vector of measurements y is sent to the decoder. In turn, the decoder reconstructs a vector x̂ ∈ R^n from y by, e.g., solving an optimization problem like BP. If there are enough measurements, x̂ should be a good approximation of x*. In the figure, w ∈ R^n comprises the side information, a vector that we assume to be similar to x*. In this context, we study the problem of designing the measurement matrix A at the encoder using the side information w, and then analyze the reconstruction performance of the resulting system.

Contributions. In the presence of side information at the encoder, we propose a new design scheme for the measurement matrix A in which each row vector is independently drawn from the Gaussian distribution N(0, Σ), where the covariance matrix Σ ∈ R^(n×n), assumed diagonal, is designed according to the side information w. Then, based on this design, we establish a bound on the number of measurements that guarantees perfect reconstruction of the original signal via solving BP. Experimental results illustrate the gains of the proposed measurement matrix design with respect to scenarios where no side information is used.

2. BACKGROUND

Let x* ∈ R^n be the sparse signal of interest. When no side information is available, reconstructing x* from a given vector of measurements y = A x* is a standard CS problem. One common approach is to solve Basis Pursuit [10]:

    minimize f(x) = ||x||_1   subject to y = A x,    (1)

where ||x||_1 := Σ_{i=1}^n |x_i| denotes the l1-norm of x. Assuming A ∈ R^(m×n) is composed of i.i.d. Gaussian entries with zero mean and variance 1/m, there are several different tools to analyse (1), for example, the Restricted Isometry Property (RIP) [30] and the Gaussian width/distance [26-29]. We will use the latter, since it provides sharper performance characterizations, although with the shortcoming of being applicable only to Gaussian matrices. Concretely, [28] establishes the following theorem.

Theorem 1 (Corollary 3.3 and Proposition 3.10 in [28]). Let x* ∈ R^n be the signal of interest with sparsity s := |{i : x*_i ≠ 0}|. Given a vector of measurements y = A x*, where the matrix A ∈ R^(m×n) is composed of i.i.d. zero-mean Gaussian random variables with variance 1/m, x* is the unique optimal solution of (1) with probability at least 1 − exp(−(λ_m − ω(Ω))²/2), provided

    m ≥ 2 s ln(n/s) + (7/5) s + 1.    (2)

In the above theorem, λ_m := E[||g||_2] denotes the expected length of a zero-mean, unit-variance Gaussian vector g ~ N(0, I_m) in R^m. Moreover, ω(Ω) := E[sup_{z ∈ Ω} g^T z] denotes the Gaussian width of the set Ω, where Ω = T_f(x*) ∩ S^(n−1) is the spherical part of the tangent cone T_f(x*) and g ~ N(0, I_n) ∈ R^n is a zero-mean, unit-variance Gaussian vector. Thus, when no prior information is available, the number of measurements necessary to recover x* is O(s ln(n/s)). Next, we will see that providing side information to the encoder yields a bound smaller than (2).

3. MEASUREMENT DESIGN WITH SIDE INFORMATION

In this section, we consider the case where side information is available at the encoder.
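As a concrete illustration of the Basis Pursuit problem (1), BP can be recast as a linear program by splitting x into its positive and negative parts. The sketch below is ours, not from the paper; the solver choice and problem sizes are illustrative only:

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """Solve min ||x||_1 s.t. A x = y by splitting x = x_pos - x_neg,
    which turns (1) into a linear program."""
    m, n = A.shape
    c = np.ones(2 * n)                    # objective: sum(x_pos) + sum(x_neg)
    A_eq = np.hstack([A, -A])             # A x_pos - A x_neg = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    return res.x[:n] - res.x[n:]

rng = np.random.default_rng(0)
n, m, s = 60, 30, 3                       # m comfortably above the bound (2)
x_true = np.zeros(n)
x_true[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # i.i.d. N(0, 1/m) entries
y = A @ x_true
x_hat = basis_pursuit(A, y)
print(np.max(np.abs(x_hat - x_true)))     # near zero in the exact-recovery regime
```

A generic LP solver suffices at this scale; the paper's experiments instead use the specialized SPGL1 solver, which scales to much larger problems.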
We start by presenting our design scheme for the measurement matrix; we then analyse the resulting scheme and establish a theoretical bound on the number of measurements required for perfect reconstruction in Subsection 3.2, and outline the proof of the bound in Subsection 3.3.

3.1. Design Scheme

Assuming the encoder has access to the side information w, we design the measurement matrix A ∈ R^(m×n) as follows: each row of A is generated independently as a realization of a random Gaussian vector with zero mean and covariance matrix Σ = diag(σ_1², ..., σ_i², ..., σ_n²) ∈ R^(n×n), where each variance σ_i² is given by

    σ_i² = 1/m,    if w_i ≠ 0,
    σ_i² = ε²/m,   if w_i = 0,  with ε ∈ (0, 1].    (3)

In (3), ε is a predefined parameter that determines the gains with respect to the no-side-information case. Driven by the side information w, the intuition for setting ε comes from energy considerations: less energy should be spent on acquiring the components of x* that the side information indicates to be zero.

Fig. 2. Visualization of the mismatch parameters r and v, with I := {i : x*_i ≠ 0}, I^c := {i : x*_i = 0}, and P := {i : w_i ≠ 0}.

Fig. 3. Comparison of our theoretical bound (5) with the classic CS bound (2) and with Mansour and Saab's bound [25], for several values of ε and fixed mismatch ratios r/s and v/s.

3.2. Main result: CS with side information at the encoder

In this section, we analyse the performance of the resulting CS system and present a new bound on the number of measurements required for successful reconstruction when the measurement matrix is designed as above. To present our result, we define two parameters that capture the amount of mismatch between x* and w:

    r := |{i : x*_i ≠ 0, w_i = 0}|,    (4a)
    v := |{i : x*_i = 0, w_i ≠ 0}|.    (4b)

In other words, r counts the number of components of x* missed by w, and v counts the number of components that w overestimates, as shown in Fig. 2.
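The design rule (3) and the mismatch counts (4a)-(4b) translate directly into code. A minimal sketch, ours rather than the authors'; the function names and the tiny example sizes are illustrative:

```python
import numpy as np

def design_matrix(w, m, eps, rng):
    """Draw each row of A i.i.d. from N(0, Sigma) with Sigma = diag(sigma_i^2),
    where sigma_i^2 = 1/m if w_i != 0 and eps^2/m if w_i == 0, as in (3)."""
    sigma2 = np.where(w != 0, 1.0 / m, eps**2 / m)
    return rng.standard_normal((m, w.size)) * np.sqrt(sigma2)

def mismatch(x, w):
    """Mismatch parameters (4a)-(4b): r = components of x missed by w,
    v = components that w overestimates."""
    r = int(np.sum((x != 0) & (w == 0)))
    v = int(np.sum((x == 0) & (w != 0)))
    return r, v

x = np.array([1.0, -2.0, 0.0, 0.0, 3.0, 0.0, 0.0, 0.0])   # signal of interest
w = np.array([1.2,  0.0, 0.0, 0.0, 2.9, 0.0, 0.1, 0.0])   # side information
A = design_matrix(w, m=4, eps=0.3, rng=np.random.default_rng(1))
print(A.shape, mismatch(x, w))   # (4, 8) (1, 1)
```

Here w misses the nonzero component x_2 (so r = 1) and overestimates the zero component x_7 (so v = 1).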
Denoting the sparsity of x* by s and the sparsity of w by s_w, there holds r ≤ s and v ≤ min{s_w, n − s}.

Proposition 1. Let x* ∈ R^n be the signal of interest with sparsity s := |{i : x*_i ≠ 0}|, and let w ∈ R^n be the side information. Let r ≥ 0 and v ≥ 0, defined as in (4a) and (4b), denote the two types of mismatch between x* and w. Given a vector of measurements y = A x*, where A ∈ R^(m×n) is designed as in Section 3.1, x* is the unique optimal solution of (1) with probability at least 1 − exp(−(λ_m − ω(Ω))²/2), provided

    m ≥ 2 (ε² s + (1 − ε²) r) ln(n/s) + (7/5) s + (1 − ε²) v + 1.    (5)

Remark 1. Proposition 1 establishes that our measurement matrix design scheme reduces the number of required measurements from O(s ln(n/s)) [cf. (2)] to O((ε² s + (1 − ε²) r) ln(n/s)). In particular, if the number of components missed by w, namely r, is small and the dimension n is large, the reduction in the number of measurements can be significant, as the dominant logarithmic term in (5) is reduced remarkably compared to the counterpart term in the CS bound (2), as shown in (6a). On the other hand, (6b) ensures that the unfavourable

increase from the non-dominant linear term in (5) is limited to no more than the mismatch v:

    ε → 0, r « s  ⇒  ε² s + (1 − ε²) r « s;    (6a)
    v ≠ 0  ⇒  v ≥ (1 − ε²) v.    (6b)

Remark 2. Proposition 1 also shows that ε determines the sensitivity of the bound to the quality of the side information. For high-quality side information with small mismatches r and v, a smaller ε should be selected. Otherwise, a larger ε is more favourable.

Remark 3. It can also be seen that our bound (5) generalizes the classical CS bound (2). Concretely, as shown in Fig. 3, bound (5) asymptotically approaches the classical CS bound (2) as ε increases. Note that, taking ε = 1, (5) simplifies to (2), since ε² s + (1 − ε²) r = s and (1 − ε²) v = 0.

Remark 4. Section 3.3 will show that solving BP with our measurement matrix design as in Proposition 1 is equivalent to solving a weighted l1 minimization problem at the decoder. In that context, Mansour and Saab's work [25] established a result similar to Proposition 1; see Theorem 5 in [25]. Their result is, however, considerably more complicated and looser than ours, as shown in Fig. 3. To compare the two roughly, assume n is large enough and set ε → 0. The dominant term of the bound in (5) becomes 2 r ln(n/s), whereas the dominant term of the bound in [25, Th. 5] grows with both mismatches, of order (r + v) ln n, where r and v are as in (4).

3.3. Outline of the proof of Proposition 1

This section outlines the proof of Proposition 1, which involves two main stages: converting the BP optimization with our measurement matrix design into a weighted l1 minimization with an i.i.d. Gaussian matrix, and computing an upper bound for the Gaussian width/distance.

Equivalence to weighted l1 minimization. We note the equivalence between the BP optimization with our measurement matrix design A and a weighted l1 minimization problem with an i.i.d. Gaussian matrix Ã. Concretely, problem (1) with the measurement matrix A as proposed in Section 3.1 can be formulated as

    min_x ||x||_1  s.t.  y = A x
    ⇔  min_x ||x||_1  s.t.  y = (A D)(D^(−1) x)
    ⇔  min_z ||D z||_1  s.t.  y = Ã z,    (7)

where à := A D and z = D^(−1) x for D := diag(d_1, ..., d_n).
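The chain of equivalences in (7) can be checked numerically: BP with the designed matrix A and weighted l1 minimization with à = AD attain the same optimal value (and the same minimizer whenever (1) has a unique solution). A sketch of ours, with an LP-based l1 solver and illustrative sizes; the weights follow the diagonal matrix D defined in (8) below:

```python
import numpy as np
from scipy.optimize import linprog

def weighted_l1(M, y, d):
    """Solve min sum_i d_i |z_i| s.t. M z = y as a linear program."""
    m, n = M.shape
    res = linprog(np.concatenate([d, d]), A_eq=np.hstack([M, -M]), b_eq=y,
                  bounds=(0, None), method="highs")
    return res.x[:n] - res.x[n:]

rng = np.random.default_rng(2)
n, m, eps = 40, 20, 0.5
w = np.zeros(n); w[:5] = 1.0                       # side-information support
sigma = np.where(w != 0, 1.0, eps) / np.sqrt(m)    # per-column std dev, as in (3)
A = rng.standard_normal((m, n)) * sigma            # designed measurement matrix
x = np.zeros(n); x[[0, 1, 2]] = [1.0, -1.5, 2.0]
y = A @ x

d = np.where(w != 0, 1.0, 1.0 / eps)               # diagonal of D, as in (8)
x_bp = weighted_l1(A, y, np.ones(n))               # BP (1) with the designed A
z_hat = weighted_l1(A * d, y, d)                   # weighted l1 with A~ = A D
print(np.linalg.norm(x_bp, 1), d @ np.abs(z_hat))  # equal optimal values
```

Scaling the columns of A by d (i.e., right-multiplying by D) while weighting the objective by the same d leaves the optimization problem unchanged up to the substitution z = D^(−1) x.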
Note that from (1) to (7), the optimization variable changes from x to z = D^(−1) x, where the weight matrix D is invertible because ε > 0. The role of D is to transform the non-i.i.d. (anisotropic) Gaussian measurement matrix A into an i.i.d. Gaussian matrix Ã, which is isotropic in this case. In addition, the diagonal form of D ensures that the optimal solution z* of (7) has the same support as x*. Recall that a random column vector X is called isotropic if E[X X^T] = I_n. In order to make à = A D isotropic (up to the common scaling 1/m), each element of a row ã_i = [ã_i1, ..., ã_ij, ..., ã_in] of à needs to satisfy E[ã_ij²] = E[(a_ij d_j)²] = d_j² [Var(a_ij) + (E a_ij)²] = d_j² σ_j² = 1/m, since E a_ij = 0. The variance σ_j² = Var(a_ij) is the j-th element of the diagonal of the covariance matrix Σ, defined in (3). Finally, we take D = diag(d_1, ..., d_n) with

    d_i = 1/(√m σ_i) = 1,      if w_i ≠ 0,
    d_i = 1/(√m σ_i) = 1/ε,    if w_i = 0.    (8)

Computation of the bounds. The number of measurements required to guarantee successful reconstruction is upper bounded via the Gaussian width [28, 29]. As it is difficult to compute the Gaussian width in closed form, an upper bound based on the Gaussian distance is computed instead. Concretely, f(z) = ||D z||_1 is a convex function. Suppose 0 ∉ ∂f(z*) for the optimal z* ∈ R^n; then

    m ≥ E[dist(g, cone ∂f(z*))²] + 1 ≥ ω(Ω)² + 1    (9)

guarantees perfect recovery with high probability, where cone ∂f(z*) is the cone generated by the subdifferential of the objective function f(z) at the optimal point z*, and dist(g, S) := min{||z − g||_2 : z ∈ S} denotes the Euclidean distance between a point g and a set S. To compute an upper bound for the Gaussian distance in (9), the objective is decomposed as f(z) = ||D z||_1 = Σ_{i=1}^n f^(i)(z_i) with f^(i)(z_i) = |d_i z_i|, and the corresponding cone is computed as

    cone ∂f(z*) = ∪_{t ≥ 0} t ∂f^(1)(z*_1) × t ∂f^(2)(z*_2) × ... × t ∂f^(n)(z*_n),

with

    t ∂f^(i)(z*_i) = { t d_i sign(d_i z*_i) },         if i ∈ I,
    t ∂f^(i)(z*_i) = I(0, t d_i) := [−t d_i, t d_i],   if i ∉ I,

where i = 1, ..., n, I := {i : z*_i ≠ 0} is the support of z*, and I(a, b) denotes an interval with centre a and radius b.
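The rescaling in (8) can be sanity-checked empirically: after multiplying each column of A by d_j, all entries of à have variance 1/m, so Ã^T à concentrates around the identity. A sketch with illustrative sizes (a large m is used only to make the empirical covariance accurate):

```python
import numpy as np

rng = np.random.default_rng(3)
n, m, eps = 5, 200_000, 0.4
w = np.array([1.0, 0.0, 2.0, 0.0, 0.0])
sigma = np.where(w != 0, 1.0, eps) / np.sqrt(m)  # per-column std dev, as in (3)
A = rng.standard_normal((m, n)) * sigma          # rows drawn i.i.d. from N(0, Sigma)
d = np.where(w != 0, 1.0, 1.0 / eps)             # diagonal of D, as in (8)
A_tilde = A * d                                  # A_tilde = A D: entries N(0, 1/m)

cov = A_tilde.T @ A_tilde                        # approximates E[A~^T A~] = I_n
print(np.round(cov, 2))                          # close to the identity matrix
```

The same computation on A itself would produce diag(1, ε², 1, ε², ε²), exhibiting the anisotropy that D removes.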
Then, Jensen's inequality is applied to derive

    E[dist(g, cone ∂f(z*))²] ≤ Σ_{i ∈ I} E[dist(g_i, t d_i sign(d_i z*_i))²]    (10a)
                              + Σ_{i ∉ I} E[dist(g_i, I(0, t d_i))²],    (10b)

which holds for arbitrary t > 0. In our case, t = ε √(2 ln(n/s)) is selected to compute the upper bound. For simplicity, an auxiliary function A(x), for x ∈ R, is defined as

    A(x) := −x ϕ(x) + (1 + x²) Q(x),    (11)

where ϕ(x) = exp(−x²/2)/√(2π) is the probability density function of a scalar Gaussian random variable with zero mean and unit variance, and Q(x) = ∫_x^{+∞} ϕ(t) dt is the Q-function. Then, it is proved that for a scalar zero-mean Gaussian random variable with unit variance, i.e., g ~ N(0, 1), the Gaussian distance to an interval can be expressed as

    E[dist(g, I(a, b))²] = A(b − a) + A(b + a).    (12)

Specifically, when b = 0, the interval I(a, 0) reduces to the point a, and

    E[dist(g, a)²] = A(−a) + A(a) = a² + 1.    (13)

When the interval is centred at 0, i.e., a = 0, there holds

    E[dist(g, I(0, b))²] = 2 A(b).    (14)
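The auxiliary function (11) and the identities (12)-(13) are easy to verify numerically. A small sketch of ours (the sample size is illustrative); note the identity A(−a) + A(a) = a² + 1 behind (13) follows from Q(x) + Q(−x) = 1:

```python
import numpy as np
from scipy.stats import norm

def A_fun(x):
    """A(x) = -x*phi(x) + (1 + x^2)*Q(x), as in (11), with phi the standard
    normal pdf and Q(x) = norm.sf(x) the Gaussian tail function."""
    return -x * norm.pdf(x) + (1 + x**2) * norm.sf(x)

# (13): the expected squared distance to the point a is A(-a) + A(a) = a^2 + 1.
a = 0.7
print(A_fun(-a) + A_fun(a))                  # a^2 + 1 = 1.49

# Monte Carlo check of (12): E[dist(g, I(a,b))^2] = A(b-a) + A(b+a), where
# dist(g, I(a,b)) = max(|g - a| - b, 0) for the interval [a-b, a+b].
rng = np.random.default_rng(4)
g = rng.standard_normal(2_000_000)
a_c, b = 0.5, 1.0
mc = np.mean(np.maximum(np.abs(g - a_c) - b, 0.0) ** 2)
print(mc, A_fun(b - a_c) + A_fun(b + a_c))   # nearly equal
```

Setting a_c = 0 in the same check reproduces (14), since A(b − 0) + A(b + 0) = 2A(b).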

Then, (13) and (14) are plugged into (10a) and (10b) to obtain

    (10a) = Σ_{i ∈ I} [1 + t² d_i²] = s + 2 (ε² s + (1 − ε²) r) ln(n/s),

    (10b) = Σ_{i ∉ I} 2 A(t d_i) = 2 (n − s − v) A(t/ε) + 2 v A(t)
          = 2 (n − s) A(t/ε) + 2 v [A(t) − A(t/ε)]
          ≤ (2/5) s + (1 − ε²) v,

where the term (2/5) s is derived from Lemma 3 in [20], i.e., 2 (n − s) A(√(2 ln(n/s))) ≤ (2/5) s for all n > s, and the last term follows from the property of the function A(x) that A(αx) − A(x) ≤ (1 − α²)/2 for 0 < α ≤ 1. Finally, (10a) and (10b) are combined to obtain the upper bound (5).

4. EXPERIMENTAL RESULTS

A set of experiments has been conducted to evaluate the efficacy of the proposed measurement matrix design, with an i.i.d. Gaussian matrix used as the benchmark. In the signal reconstruction stage, we use the SPGL1 toolbox [31] to solve the same BP problem with both the i.i.d. Gaussian matrix and our measurement matrix design. The parameter settings are shown in Table 1. The number of measurements m is varied over a range of values, and the relative sparsity s/n varies from 0.05 to 0.6 with a step of 0.05, so that the number of measurements for which the success rate exceeds 85% can be found as the empirical threshold. To indicate the phase transitions, s/m varies from 0.1 to 1 with a step of 0.05. In these experiments, several instances of A are generated for each pair of s and m.

The experimental results are shown in Fig. 4. Fig. 4(a) shows the success rate as a function of the number of measurements for different sparsity levels. Fig. 4(b) compares the theoretical and empirical bounds. Figs. 4(c) and 4(d) compare the phase transitions for the two types of measurement matrix. The results indicate that our measurement matrix design outperforms the i.i.d. Gaussian matrix for small mismatch, and that our bound is tight and practically coincides with the empirical phase transition.

Table 1. Parameter settings for the experiments: n, m/n, s/n (0.05 : 0.05 : 0.6), s/m (0.1 : 0.05 : 1), r/s, v/s, and ε.

5. CONCLUSIONS

In this paper, a new measurement matrix design scheme is proposed to integrate side information at the encoder into a CS system. Based on the design scheme, the performance of the resulting system is analysed in terms of the number of measurements required for perfect reconstruction.
Extensive experiments indicate that our measurement matrix design allows reducing the number of measurements, provided the side information has reasonable quality. In addition, it is demonstrated that our theoretical bound is simple and tight. We believe this work can contribute to a better design of CS systems in scenarios where side information is available, such as medical imaging, sensor networks, and multi-view camera systems. In the future, we will consider not only the case where the measurements are noisy, but also the scenario where the side information is available at both the encoder and the decoder.

Fig. 4. Experimental results. (a) Success ratio as a function of the number of measurements for different sparsity levels; in each sub-figure, the black line with markers represents the recovery performance of the i.i.d. Gaussian matrix with BP, and the red line with markers represents our measurement matrix design with BP. (b) Theoretical (curves) and empirical (markers) bounds. (c) Phase transition for the i.i.d. Gaussian matrix. (d) Phase transition for the proposed measurement matrix. In (c) and (d), the colorbar represents the success ratio: for each pair of m/n and s/m, the higher the success ratio, the whiter the point.
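The theoretical curves behind Fig. 3 and Fig. 4(b) follow by evaluating (2) and (5) directly. A sketch of ours with illustrative dimensions and mismatch ratios:

```python
import numpy as np

def bound_classic(n, s):
    """Classic CS bound (2): m >= 2 s ln(n/s) + (7/5) s + 1."""
    return 2 * s * np.log(n / s) + 1.4 * s + 1

def bound_side_info(n, s, r, v, eps):
    """Proposed bound (5): m >= 2 (eps^2 s + (1 - eps^2) r) ln(n/s)
    + (7/5) s + (1 - eps^2) v + 1."""
    return (2 * (eps**2 * s + (1 - eps**2) * r) * np.log(n / s)
            + 1.4 * s + (1 - eps**2) * v + 1)

n = 1000
for s in (20, 50, 100):
    r = v = round(0.1 * s)               # small mismatch between x* and w
    m_cs = bound_classic(n, s)
    m_si = bound_side_info(n, s, r, v, eps=0.3)
    print(s, round(m_cs), round(m_si))   # (5) requires far fewer measurements
```

With eps set to 1, bound_side_info coincides with bound_classic, mirroring Remark 3.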

6. REFERENCES

[1] D. Donoho, "Compressed sensing," IEEE Trans. Inform. Theory, vol. 52, no. 4, pp. 1289-1306, 2006.
[2] E. Candès, J. Romberg, and T. Tao, "Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information," IEEE Trans. Inform. Theory, vol. 52, no. 2, pp. 489-509, 2006.
[3] M. Lustig, D. Donoho, and J. Pauly, "Sparse MRI: The application of compressed sensing to rapid MR imaging," Magnetic Resonance in Medicine, vol. 58, pp. 1182-1195, 2007.
[4] M. Herman and T. Strohmer, "High-resolution radar via compressed sensing," IEEE Trans. Signal Process., vol. 57, no. 6, 2009.
[5] W. Bajwa, J. Haupt, A. Sayeed, and R. Nowak, "Compressive wireless sensing," in Intern. Conf. Inform. Process. in Sensor Networks (IPSN), 2006.
[6] V. Cevher, A. Sankaranarayanan, M. Duarte, D. Reddy, R. Baraniuk, and R. Chellappa, "Compressive sensing for background subtraction," in European Conf. on Computer Vision (ECCV), 2008.
[7] J. F. C. Mota, N. Deligiannis, A. C. Sankaranarayanan, V. Cevher, and M. R. D. Rodrigues, "Dynamic sparse state estimation using l1-l1 minimization: Adaptive-rate measurement bounds, algorithms, and applications," in IEEE Intern. Conf. Acoustics, Speech, and Sig. Processing (ICASSP), 2015.
[8] J. F. C. Mota, N. Deligiannis, A. C. Sankaranarayanan, V. Cevher, and M. R. D. Rodrigues, "Adaptive-rate sparse signal reconstruction with application in compressive foreground extraction," IEEE Trans. Signal Process., 2015 (in press).
[9] L. Weizman, Y. C. Eldar, and D. Ben-Bashat, "Compressed sensing for longitudinal MRI: An adaptive-weighted approach," Medical Physics, vol. 42, no. 9, 2015.
[10] S. Chen, D. Donoho, and M. Saunders, "Atomic decomposition by basis pursuit," SIAM J. Sci. Comp., vol. 20, no. 1, pp. 33-61, 1998.
[11] N. Vaswani and W. Lu, "Modified-CS: Modifying compressive sensing for problems with partially known support," IEEE Trans. Signal Process., vol. 58, no. 9, pp. 4595-4607, 2010.
[12] M. A. Khajehnejad, W. Xu, B. Hassibi, et al., "Weighted l1 minimization for sparse recovery with prior information," in IEEE Intern. Symp. Inform. Theory (ISIT), 2009.
[13] S. Oymak, M. A. Khajehnejad, and B. Hassibi, "Recovery threshold for optimal weight l1 minimization," in IEEE Intern. Symp. Inform. Theory (ISIT), 2012.
[14] W. Lu and N. Vaswani, "Exact reconstruction conditions and error bounds for regularized modified basis pursuit (reg-modified-BP)," in Asilomar Conf. Signals, Systems and Computers (ASILOMAR).
[15] M. P. Friedlander, H. Mansour, R. Saab, and Ö. Yılmaz, "Recovering compressively sampled signals using partial support information," IEEE Trans. Inform. Theory, vol. 58, no. 2, pp. 1122-1134, 2012.
[16] J. Scarlett, J. Evans, and S. Dey, "Compressed sensing with prior information: Information-theoretic limits and practical decoders," IEEE Trans. Signal Process., vol. 61, 2013.
[17] E. Zimos, J. Mota, M. Rodrigues, and N. Deligiannis, "Bayesian compressed sensing with heterogeneous side information," in IEEE Data Compression Conference (DCC), Snowbird, Utah, USA, Apr. 2016.
[18] A. Charles, M. Asif, J. Romberg, and C. Rozell, "Sparsity penalties in dynamical system estimation," in IEEE Conf. Information Sciences and Systems, 2011, pp. 1-6.
[19] X. Wang and J. Liang, "Side information-aided compressed sensing reconstruction via approximate message passing," preprint, 2013.
[20] J. F. C. Mota, N. Deligiannis, and M. R. D. Rodrigues, "Compressed sensing with prior information: Optimal strategies, geometry, and bounds," arXiv preprint arXiv:1408.5250, 2014.
[21] J. Mota, N. Deligiannis, and M. Rodrigues, "Compressed sensing with side information: Geometrical interpretation and performance bounds," in IEEE Global Conf. on Signal and Information Processing (GlobalSIP), 2014.
[22] F. Renna, L. Wang, X. Yuan, J. Yang, G. Reeves, R. Calderbank, L. Carin, and M. R. D. Rodrigues, "Classification and reconstruction of high-dimensional signals from low-dimensional noisy features in the presence of side information," 2014.
[23] B. Adcock, A. C. Hansen, C. Poon, and B. Roman, "Breaking the coherence barrier: A new theory for compressed sensing," preprint, 2014.
[24] W. Chen, M. R. D. Rodrigues, and I. J. Wassell, "Projections design for statistical compressive sensing: A tight frame based approach," IEEE Trans. Signal Process., vol. 61, 2013.
[25] H. Mansour and R. Saab, "Recovery analysis for weighted l1-minimization using a null space property," arXiv preprint arXiv:1412.1565, 2014.
[26] Y. Gordon, "On Milman's inequality and random subspaces which escape through a mesh in R^n," Springer, 1988.
[27] M. Rudelson and R. Vershynin, "On sparse reconstruction from Fourier and Gaussian measurements," Communications on Pure and Applied Mathematics, vol. 61, no. 8, pp. 1025-1045, 2008.
[28] V. Chandrasekaran, B. Recht, P. Parrilo, and A. Willsky, "The convex geometry of linear inverse problems," Found. Computational Mathematics, vol. 12, pp. 805-849, 2012.
[29] D. Amelunxen, M. Lotz, M. McCoy, and J. Tropp, "Living on the edge: Phase transitions in convex programs with random data," Information and Inference: A Journal of the IMA, 2014.
[30] E. Candès and T. Tao, "Decoding by linear programming," IEEE Trans. Inform. Theory, vol. 51, no. 12, pp. 4203-4215, 2005.
[31] E. van den Berg and M. P. Friedlander, "Probing the Pareto frontier for basis pursuit solutions," SIAM Journal on Scientific Computing, vol. 31, no. 2, pp. 890-912, 2008.


More information

Using EM To Estimate A Probablity Density With A Mixture Of Gaussians

Using EM To Estimate A Probablity Density With A Mixture Of Gaussians Using EM To Estiate A Probablity Density With A Mixture Of Gaussians Aaron A. D Souza adsouza@usc.edu Introduction The proble we are trying to address in this note is siple. Given a set of data points

More information

Bayesian Compressed Sensing with Heterogeneous Side Information

Bayesian Compressed Sensing with Heterogeneous Side Information Bayesian Compressed Sensing with Heterogeneous Side Information Evangelos Zimos, João F. C. Mota, Miguel R. D. Rodrigues, and Nikos Deligiannis Vrije Universiteit Brussel - iminds Dept. Electronics and

More information

On the Use of A Priori Information for Sparse Signal Approximations

On the Use of A Priori Information for Sparse Signal Approximations ITS TECHNICAL REPORT NO. 3/4 On the Use of A Priori Inforation for Sparse Signal Approxiations Oscar Divorra Escoda, Lorenzo Granai and Pierre Vandergheynst Signal Processing Institute ITS) Ecole Polytechnique

More information

e-companion ONLY AVAILABLE IN ELECTRONIC FORM

e-companion ONLY AVAILABLE IN ELECTRONIC FORM OPERATIONS RESEARCH doi 10.1287/opre.1070.0427ec pp. ec1 ec5 e-copanion ONLY AVAILABLE IN ELECTRONIC FORM infors 07 INFORMS Electronic Copanion A Learning Approach for Interactive Marketing to a Custoer

More information

Interactive Markov Models of Evolutionary Algorithms

Interactive Markov Models of Evolutionary Algorithms Cleveland State University EngagedScholarship@CSU Electrical Engineering & Coputer Science Faculty Publications Electrical Engineering & Coputer Science Departent 2015 Interactive Markov Models of Evolutionary

More information

Page 1 Lab 1 Elementary Matrix and Linear Algebra Spring 2011

Page 1 Lab 1 Elementary Matrix and Linear Algebra Spring 2011 Page Lab Eleentary Matri and Linear Algebra Spring 0 Nae Due /03/0 Score /5 Probles through 4 are each worth 4 points.. Go to the Linear Algebra oolkit site ransforing a atri to reduced row echelon for

More information

Pattern Recognition and Machine Learning. Learning and Evaluation for Pattern Recognition

Pattern Recognition and Machine Learning. Learning and Evaluation for Pattern Recognition Pattern Recognition and Machine Learning Jaes L. Crowley ENSIMAG 3 - MMIS Fall Seester 2017 Lesson 1 4 October 2017 Outline Learning and Evaluation for Pattern Recognition Notation...2 1. The Pattern Recognition

More information

Non-Parametric Non-Line-of-Sight Identification 1

Non-Parametric Non-Line-of-Sight Identification 1 Non-Paraetric Non-Line-of-Sight Identification Sinan Gezici, Hisashi Kobayashi and H. Vincent Poor Departent of Electrical Engineering School of Engineering and Applied Science Princeton University, Princeton,

More information

Bayes Decision Rule and Naïve Bayes Classifier

Bayes Decision Rule and Naïve Bayes Classifier Bayes Decision Rule and Naïve Bayes Classifier Le Song Machine Learning I CSE 6740, Fall 2013 Gaussian Mixture odel A density odel p(x) ay be ulti-odal: odel it as a ixture of uni-odal distributions (e.g.

More information

On Constant Power Water-filling

On Constant Power Water-filling On Constant Power Water-filling Wei Yu and John M. Cioffi Electrical Engineering Departent Stanford University, Stanford, CA94305, U.S.A. eails: {weiyu,cioffi}@stanford.edu Abstract This paper derives

More information

THE KALMAN FILTER: A LOOK BEHIND THE SCENE

THE KALMAN FILTER: A LOOK BEHIND THE SCENE HE KALMA FILER: A LOOK BEHID HE SCEE R.E. Deain School of Matheatical and Geospatial Sciences, RMI University eail: rod.deain@rit.edu.au Presented at the Victorian Regional Survey Conference, Mildura,

More information

Hybrid System Identification: An SDP Approach

Hybrid System Identification: An SDP Approach 49th IEEE Conference on Decision and Control Deceber 15-17, 2010 Hilton Atlanta Hotel, Atlanta, GA, USA Hybrid Syste Identification: An SDP Approach C Feng, C M Lagoa, N Ozay and M Sznaier Abstract The

More information

Recovery of Sparsely Corrupted Signals

Recovery of Sparsely Corrupted Signals TO APPEAR IN IEEE TRANSACTIONS ON INFORMATION TEORY 1 Recovery of Sparsely Corrupted Signals Christoph Studer, Meber, IEEE, Patrick Kuppinger, Student Meber, IEEE, Graee Pope, Student Meber, IEEE, and

More information

Warning System of Dangerous Chemical Gas in Factory Based on Wireless Sensor Network

Warning System of Dangerous Chemical Gas in Factory Based on Wireless Sensor Network 565 A publication of CHEMICAL ENGINEERING TRANSACTIONS VOL. 59, 07 Guest Editors: Zhuo Yang, Junie Ba, Jing Pan Copyright 07, AIDIC Servizi S.r.l. ISBN 978-88-95608-49-5; ISSN 83-96 The Italian Association

More information

ASSUME a source over an alphabet size m, from which a sequence of n independent samples are drawn. The classical

ASSUME a source over an alphabet size m, from which a sequence of n independent samples are drawn. The classical IEEE TRANSACTIONS ON INFORMATION THEORY Large Alphabet Source Coding using Independent Coponent Analysis Aichai Painsky, Meber, IEEE, Saharon Rosset and Meir Feder, Fellow, IEEE arxiv:67.7v [cs.it] Jul

More information

Data-Driven Imaging in Anisotropic Media

Data-Driven Imaging in Anisotropic Media 18 th World Conference on Non destructive Testing, 16- April 1, Durban, South Africa Data-Driven Iaging in Anisotropic Media Arno VOLKER 1 and Alan HUNTER 1 TNO Stieltjesweg 1, 6 AD, Delft, The Netherlands

More information

Low-complexity, Low-memory EMS algorithm for non-binary LDPC codes

Low-complexity, Low-memory EMS algorithm for non-binary LDPC codes Low-coplexity, Low-eory EMS algorith for non-binary LDPC codes Adrian Voicila,David Declercq, François Verdier ETIS ENSEA/CP/CNRS MR-85 954 Cergy-Pontoise, (France) Marc Fossorier Dept. Electrical Engineering

More information

A Nonlinear Sparsity Promoting Formulation and Algorithm for Full Waveform Inversion

A Nonlinear Sparsity Promoting Formulation and Algorithm for Full Waveform Inversion A Nonlinear Sparsity Prooting Forulation and Algorith for Full Wavefor Inversion Aleksandr Aravkin, Tristan van Leeuwen, Jaes V. Burke 2 and Felix Herrann Dept. of Earth and Ocean sciences University of

More information

REDUCTION OF FINITE ELEMENT MODELS BY PARAMETER IDENTIFICATION

REDUCTION OF FINITE ELEMENT MODELS BY PARAMETER IDENTIFICATION ISSN 139 14X INFORMATION TECHNOLOGY AND CONTROL, 008, Vol.37, No.3 REDUCTION OF FINITE ELEMENT MODELS BY PARAMETER IDENTIFICATION Riantas Barauskas, Vidantas Riavičius Departent of Syste Analysis, Kaunas

More information

Lecture 20 November 7, 2013

Lecture 20 November 7, 2013 CS 229r: Algoriths for Big Data Fall 2013 Prof. Jelani Nelson Lecture 20 Noveber 7, 2013 Scribe: Yun Willia Yu 1 Introduction Today we re going to go through the analysis of atrix copletion. First though,

More information

An l 1 Regularized Method for Numerical Differentiation Using Empirical Eigenfunctions

An l 1 Regularized Method for Numerical Differentiation Using Empirical Eigenfunctions Journal of Matheatical Research with Applications Jul., 207, Vol. 37, No. 4, pp. 496 504 DOI:0.3770/j.issn:2095-265.207.04.0 Http://jre.dlut.edu.cn An l Regularized Method for Nuerical Differentiation

More information

Highly Robust Error Correction by Convex Programming

Highly Robust Error Correction by Convex Programming Highly Robust Error Correction by Convex Prograing Eanuel J. Candès and Paige A. Randall Applied and Coputational Matheatics, Caltech, Pasadena, CA 9115 Noveber 6; Revised Noveber 7 Abstract This paper

More information

Recovering Block-structured Activations Using Compressive Measurements

Recovering Block-structured Activations Using Compressive Measurements Recovering Block-structured Activations Using Copressive Measureents Sivaraan Balakrishnan, Mladen Kolar, Alessandro Rinaldo, and Aarti Singh Abstract We consider the probles of detection and support recovery

More information

OPTIMIZATION in multi-agent networks has attracted

OPTIMIZATION in multi-agent networks has attracted Distributed constrained optiization and consensus in uncertain networks via proxial iniization Kostas Margellos, Alessandro Falsone, Sione Garatti and Maria Prandini arxiv:603.039v3 [ath.oc] 3 May 07 Abstract

More information

COS 424: Interacting with Data. Written Exercises

COS 424: Interacting with Data. Written Exercises COS 424: Interacting with Data Hoework #4 Spring 2007 Regression Due: Wednesday, April 18 Written Exercises See the course website for iportant inforation about collaboration and late policies, as well

More information

Fixed-to-Variable Length Distribution Matching

Fixed-to-Variable Length Distribution Matching Fixed-to-Variable Length Distribution Matching Rana Ali Ajad and Georg Böcherer Institute for Counications Engineering Technische Universität München, Gerany Eail: raa2463@gail.co,georg.boecherer@tu.de

More information

The linear sampling method and the MUSIC algorithm

The linear sampling method and the MUSIC algorithm INSTITUTE OF PHYSICS PUBLISHING INVERSE PROBLEMS Inverse Probles 17 (2001) 591 595 www.iop.org/journals/ip PII: S0266-5611(01)16989-3 The linear sapling ethod and the MUSIC algorith Margaret Cheney Departent

More information

A NEW ROBUST AND EFFICIENT ESTIMATOR FOR ILL-CONDITIONED LINEAR INVERSE PROBLEMS WITH OUTLIERS

A NEW ROBUST AND EFFICIENT ESTIMATOR FOR ILL-CONDITIONED LINEAR INVERSE PROBLEMS WITH OUTLIERS A NEW ROBUST AND EFFICIENT ESTIMATOR FOR ILL-CONDITIONED LINEAR INVERSE PROBLEMS WITH OUTLIERS Marta Martinez-Caara 1, Michael Mua 2, Abdelhak M. Zoubir 2, Martin Vetterli 1 1 School of Coputer and Counication

More information

Extension of CSRSM for the Parametric Study of the Face Stability of Pressurized Tunnels

Extension of CSRSM for the Parametric Study of the Face Stability of Pressurized Tunnels Extension of CSRSM for the Paraetric Study of the Face Stability of Pressurized Tunnels Guilhe Mollon 1, Daniel Dias 2, and Abdul-Haid Soubra 3, M.ASCE 1 LGCIE, INSA Lyon, Université de Lyon, Doaine scientifique

More information

Multi-Scale/Multi-Resolution: Wavelet Transform

Multi-Scale/Multi-Resolution: Wavelet Transform Multi-Scale/Multi-Resolution: Wavelet Transfor Proble with Fourier Fourier analysis -- breaks down a signal into constituent sinusoids of different frequencies. A serious drawback in transforing to the

More information

Exact Reconstruction Conditions and Error Bounds for Regularized Modified Basis Pursuit (Reg-Modified-BP)

Exact Reconstruction Conditions and Error Bounds for Regularized Modified Basis Pursuit (Reg-Modified-BP) 1 Exact Reconstruction Conditions and Error Bounds for Regularized Modified Basis Pursuit (Reg-Modified-BP) Wei Lu and Namrata Vaswani Department of Electrical and Computer Engineering, Iowa State University,

More information

E0 370 Statistical Learning Theory Lecture 6 (Aug 30, 2011) Margin Analysis

E0 370 Statistical Learning Theory Lecture 6 (Aug 30, 2011) Margin Analysis E0 370 tatistical Learning Theory Lecture 6 (Aug 30, 20) Margin Analysis Lecturer: hivani Agarwal cribe: Narasihan R Introduction In the last few lectures we have seen how to obtain high confidence bounds

More information

Pattern Recognition and Machine Learning. Artificial Neural networks

Pattern Recognition and Machine Learning. Artificial Neural networks Pattern Recognition and Machine Learning Jaes L. Crowley ENSIMAG 3 - MMIS Fall Seester 2017 Lessons 7 20 Dec 2017 Outline Artificial Neural networks Notation...2 Introduction...3 Key Equations... 3 Artificial

More information

Pattern Recognition and Machine Learning. Artificial Neural networks

Pattern Recognition and Machine Learning. Artificial Neural networks Pattern Recognition and Machine Learning Jaes L. Crowley ENSIMAG 3 - MMIS Fall Seester 2016 Lessons 7 14 Dec 2016 Outline Artificial Neural networks Notation...2 1. Introduction...3... 3 The Artificial

More information

Intelligent Systems: Reasoning and Recognition. Artificial Neural Networks

Intelligent Systems: Reasoning and Recognition. Artificial Neural Networks Intelligent Systes: Reasoning and Recognition Jaes L. Crowley MOSIG M1 Winter Seester 2018 Lesson 7 1 March 2018 Outline Artificial Neural Networks Notation...2 Introduction...3 Key Equations... 3 Artificial

More information

Inspection; structural health monitoring; reliability; Bayesian analysis; updating; decision analysis; value of information

Inspection; structural health monitoring; reliability; Bayesian analysis; updating; decision analysis; value of information Cite as: Straub D. (2014). Value of inforation analysis with structural reliability ethods. Structural Safety, 49: 75-86. Value of Inforation Analysis with Structural Reliability Methods Daniel Straub

More information

Error Exponents in Asynchronous Communication

Error Exponents in Asynchronous Communication IEEE International Syposiu on Inforation Theory Proceedings Error Exponents in Asynchronous Counication Da Wang EECS Dept., MIT Cabridge, MA, USA Eail: dawang@it.edu Venkat Chandar Lincoln Laboratory,

More information

Exact tensor completion with sum-of-squares

Exact tensor completion with sum-of-squares Proceedings of Machine Learning Research vol 65:1 54, 2017 30th Annual Conference on Learning Theory Exact tensor copletion with su-of-squares Aaron Potechin Institute for Advanced Study, Princeton David

More information

SPECTRUM sensing is a core concept of cognitive radio

SPECTRUM sensing is a core concept of cognitive radio World Acadey of Science, Engineering and Technology International Journal of Electronics and Counication Engineering Vol:6, o:2, 202 Efficient Detection Using Sequential Probability Ratio Test in Mobile

More information

Weighted Superimposed Codes and Constrained Integer Compressed Sensing

Weighted Superimposed Codes and Constrained Integer Compressed Sensing Weighted Superiposed Codes and Constrained Integer Copressed Sensing Wei Dai and Olgica Milenovic Dept. of Electrical and Coputer Engineering University of Illinois, Urbana-Chapaign Abstract We introduce

More information

CHAPTER 8 CONSTRAINED OPTIMIZATION 2: SEQUENTIAL QUADRATIC PROGRAMMING, INTERIOR POINT AND GENERALIZED REDUCED GRADIENT METHODS

CHAPTER 8 CONSTRAINED OPTIMIZATION 2: SEQUENTIAL QUADRATIC PROGRAMMING, INTERIOR POINT AND GENERALIZED REDUCED GRADIENT METHODS CHAPER 8 CONSRAINED OPIMIZAION : SEQUENIAL QUADRAIC PROGRAMMING, INERIOR POIN AND GENERALIZED REDUCED GRADIEN MEHODS 8. Introduction In the previous chapter we eained the necessary and sufficient conditions

More information

Constrained Consensus and Optimization in Multi-Agent Networks arxiv: v2 [math.oc] 17 Dec 2008

Constrained Consensus and Optimization in Multi-Agent Networks arxiv: v2 [math.oc] 17 Dec 2008 LIDS Report 2779 1 Constrained Consensus and Optiization in Multi-Agent Networks arxiv:0802.3922v2 [ath.oc] 17 Dec 2008 Angelia Nedić, Asuan Ozdaglar, and Pablo A. Parrilo February 15, 2013 Abstract We

More information

The generalized Lasso with non-linear observations

The generalized Lasso with non-linear observations The generalized Lasso with non-linear observations Yaniv Plan Roan Vershynin Abstract We study the proble of signal estiation fro nonlinear observations when the signal belongs to a low-diensional set

More information

Compression and Predictive Distributions for Large Alphabet i.i.d and Markov models

Compression and Predictive Distributions for Large Alphabet i.i.d and Markov models 2014 IEEE International Syposiu on Inforation Theory Copression and Predictive Distributions for Large Alphabet i.i.d and Markov odels Xiao Yang Departent of Statistics Yale University New Haven, CT, 06511

More information

Detection and Estimation Theory

Detection and Estimation Theory ESE 54 Detection and Estiation Theory Joseph A. O Sullivan Sauel C. Sachs Professor Electronic Systes and Signals Research Laboratory Electrical and Systes Engineering Washington University 11 Urbauer

More information

ORIE 6340: Mathematics of Data Science

ORIE 6340: Mathematics of Data Science ORIE 6340: Matheatics of Data Science Daek Davis Contents 1 Estiation in High Diensions 1 1.1 Tools for understanding high-diensional sets................. 3 1.1.1 Concentration of volue in high-diensions...............

More information

On Conditions for Linearity of Optimal Estimation

On Conditions for Linearity of Optimal Estimation On Conditions for Linearity of Optial Estiation Erah Akyol, Kuar Viswanatha and Kenneth Rose {eakyol, kuar, rose}@ece.ucsb.edu Departent of Electrical and Coputer Engineering University of California at

More information

Transformation-invariant Collaborative Sub-representation

Transformation-invariant Collaborative Sub-representation Transforation-invariant Collaborative Sub-representation Yeqing Li, Chen Chen, Jungzhou Huang Departent of Coputer Science and Engineering University of Texas at Arlington, Texas 769, USA. Eail: yeqing.li@avs.uta.edu,

More information

arxiv: v3 [quant-ph] 18 Oct 2017

arxiv: v3 [quant-ph] 18 Oct 2017 Self-guaranteed easureent-based quantu coputation Masahito Hayashi 1,, and Michal Hajdušek, 1 Graduate School of Matheatics, Nagoya University, Furocho, Chikusa-ku, Nagoya 464-860, Japan Centre for Quantu

More information

Introduction to Machine Learning. Recitation 11

Introduction to Machine Learning. Recitation 11 Introduction to Machine Learning Lecturer: Regev Schweiger Recitation Fall Seester Scribe: Regev Schweiger. Kernel Ridge Regression We now take on the task of kernel-izing ridge regression. Let x,...,

More information

CS Lecture 13. More Maximum Likelihood

CS Lecture 13. More Maximum Likelihood CS 6347 Lecture 13 More Maxiu Likelihood Recap Last tie: Introduction to axiu likelihood estiation MLE for Bayesian networks Optial CPTs correspond to epirical counts Today: MLE for CRFs 2 Maxiu Likelihood

More information

Course Notes for EE227C (Spring 2018): Convex Optimization and Approximation

Course Notes for EE227C (Spring 2018): Convex Optimization and Approximation Course Notes for EE227C (Spring 2018): Convex Optiization and Approxiation Instructor: Moritz Hardt Eail: hardt+ee227c@berkeley.edu Graduate Instructor: Max Sichowitz Eail: sichow+ee227c@berkeley.edu October

More information

ANALYSIS OF A NUMERICAL SOLVER FOR RADIATIVE TRANSPORT EQUATION

ANALYSIS OF A NUMERICAL SOLVER FOR RADIATIVE TRANSPORT EQUATION ANALYSIS OF A NUMERICAL SOLVER FOR RADIATIVE TRANSPORT EQUATION HAO GAO AND HONGKAI ZHAO Abstract. We analyze a nuerical algorith for solving radiative transport equation with vacuu or reflection boundary

More information

arxiv: v1 [math.na] 10 Oct 2016

arxiv: v1 [math.na] 10 Oct 2016 GREEDY GAUSS-NEWTON ALGORITHM FOR FINDING SPARSE SOLUTIONS TO NONLINEAR UNDERDETERMINED SYSTEMS OF EQUATIONS MÅRTEN GULLIKSSON AND ANNA OLEYNIK arxiv:6.395v [ath.na] Oct 26 Abstract. We consider the proble

More information

arxiv: v2 [cs.lg] 30 Mar 2017

arxiv: v2 [cs.lg] 30 Mar 2017 Batch Renoralization: Towards Reducing Minibatch Dependence in Batch-Noralized Models Sergey Ioffe Google Inc., sioffe@google.co arxiv:1702.03275v2 [cs.lg] 30 Mar 2017 Abstract Batch Noralization is quite

More information

Estimating Parameters for a Gaussian pdf

Estimating Parameters for a Gaussian pdf Pattern Recognition and achine Learning Jaes L. Crowley ENSIAG 3 IS First Seester 00/0 Lesson 5 7 Noveber 00 Contents Estiating Paraeters for a Gaussian pdf Notation... The Pattern Recognition Proble...3

More information

AN OPTIMAL SHRINKAGE FACTOR IN PREDICTION OF ORDERED RANDOM EFFECTS

AN OPTIMAL SHRINKAGE FACTOR IN PREDICTION OF ORDERED RANDOM EFFECTS Statistica Sinica 6 016, 1709-178 doi:http://dx.doi.org/10.5705/ss.0014.0034 AN OPTIMAL SHRINKAGE FACTOR IN PREDICTION OF ORDERED RANDOM EFFECTS Nilabja Guha 1, Anindya Roy, Yaakov Malinovsky and Gauri

More information

Chapter 6 1-D Continuous Groups

Chapter 6 1-D Continuous Groups Chapter 6 1-D Continuous Groups Continuous groups consist of group eleents labelled by one or ore continuous variables, say a 1, a 2,, a r, where each variable has a well- defined range. This chapter explores:

More information

Supplementary to Learning Discriminative Bayesian Networks from High-dimensional Continuous Neuroimaging Data

Supplementary to Learning Discriminative Bayesian Networks from High-dimensional Continuous Neuroimaging Data Suppleentary to Learning Discriinative Bayesian Networks fro High-diensional Continuous Neuroiaging Data Luping Zhou, Lei Wang, Lingqiao Liu, Philip Ogunbona, and Dinggang Shen Proposition. Given a sparse

More information

Polygonal Designs: Existence and Construction

Polygonal Designs: Existence and Construction Polygonal Designs: Existence and Construction John Hegean Departent of Matheatics, Stanford University, Stanford, CA 9405 Jeff Langford Departent of Matheatics, Drake University, Des Moines, IA 5011 G

More information

TABLE FOR UPPER PERCENTAGE POINTS OF THE LARGEST ROOT OF A DETERMINANTAL EQUATION WITH FIVE ROOTS. By William W. Chen

TABLE FOR UPPER PERCENTAGE POINTS OF THE LARGEST ROOT OF A DETERMINANTAL EQUATION WITH FIVE ROOTS. By William W. Chen TABLE FOR UPPER PERCENTAGE POINTS OF THE LARGEST ROOT OF A DETERMINANTAL EQUATION WITH FIVE ROOTS By Willia W. Chen The distribution of the non-null characteristic roots of a atri derived fro saple observations

More information

Tail estimates for norms of sums of log-concave random vectors

Tail estimates for norms of sums of log-concave random vectors Tail estiates for nors of sus of log-concave rando vectors Rados law Adaczak Rafa l Lata la Alexander E. Litvak Alain Pajor Nicole Toczak-Jaegerann Abstract We establish new tail estiates for order statistics

More information

A Unified Approach to Universal Prediction: Generalized Upper and Lower Bounds

A Unified Approach to Universal Prediction: Generalized Upper and Lower Bounds 646 IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, VOL 6, NO 3, MARCH 05 A Unified Approach to Universal Prediction: Generalized Upper and Lower Bounds Nuri Denizcan Vanli and Suleyan S Kozat,

More information

Kernel Methods and Support Vector Machines

Kernel Methods and Support Vector Machines Intelligent Systes: Reasoning and Recognition Jaes L. Crowley ENSIAG 2 / osig 1 Second Seester 2012/2013 Lesson 20 2 ay 2013 Kernel ethods and Support Vector achines Contents Kernel Functions...2 Quadratic

More information

AN EFFICIENT CLASS OF CHAIN ESTIMATORS OF POPULATION VARIANCE UNDER SUB-SAMPLING SCHEME

AN EFFICIENT CLASS OF CHAIN ESTIMATORS OF POPULATION VARIANCE UNDER SUB-SAMPLING SCHEME J. Japan Statist. Soc. Vol. 35 No. 005 73 86 AN EFFICIENT CLASS OF CHAIN ESTIMATORS OF POPULATION VARIANCE UNDER SUB-SAMPLING SCHEME H. S. Jhajj*, M. K. Shara* and Lovleen Kuar Grover** For estiating the

More information

Robust Spectral Compressed Sensing via Structured Matrix Completion Yuxin Chen, Student Member, IEEE, and Yuejie Chi, Member, IEEE

Robust Spectral Compressed Sensing via Structured Matrix Completion Yuxin Chen, Student Member, IEEE, and Yuejie Chi, Member, IEEE 6576 IEEE TRANSACTIONS ON INORMATION THEORY, VOL 60, NO 0, OCTOBER 04 Robust Spectral Copressed Sensing via Structured Matrix Copletion Yuxin Chen, Student Meber, IEEE, and Yuejie Chi, Meber, IEEE Abstract

More information

An Algorithm for Quantization of Discrete Probability Distributions

An Algorithm for Quantization of Discrete Probability Distributions An Algorith for Quantization of Discrete Probability Distributions Yuriy A. Reznik Qualco Inc., San Diego, CA Eail: yreznik@ieee.org Abstract We study the proble of quantization of discrete probability

More information

An Improved Particle Filter with Applications in Ballistic Target Tracking

An Improved Particle Filter with Applications in Ballistic Target Tracking Sensors & ransducers Vol. 72 Issue 6 June 204 pp. 96-20 Sensors & ransducers 204 by IFSA Publishing S. L. http://www.sensorsportal.co An Iproved Particle Filter with Applications in Ballistic arget racing

More information

Stochastic Subgradient Methods

Stochastic Subgradient Methods Stochastic Subgradient Methods Lingjie Weng Yutian Chen Bren School of Inforation and Coputer Science University of California, Irvine {wengl, yutianc}@ics.uci.edu Abstract Stochastic subgradient ethods

More information

Fairness via priority scheduling

Fairness via priority scheduling Fairness via priority scheduling Veeraruna Kavitha, N Heachandra and Debayan Das IEOR, IIT Bobay, Mubai, 400076, India vavitha,nh,debayan}@iitbacin Abstract In the context of ulti-agent resource allocation

More information