Nonnegative Tensor Factorization with Smoothness Constraints
Rafal Zdunek 1 and Tomasz M. Rutkowski 2

1 Institute of Telecommunications, Teleinformatics and Acoustics, Wroclaw University of Technology, Wybrzeze Wyspianskiego 27, Wroclaw, Poland, rafal.zdunek@pwr.wroc.pl
2 RIKEN Brain Science Institute, Wako-shi, Japan

Abstract. Nonnegative Tensor Factorization (NTF) is an emerging technique in multidimensional signal analysis that can be used to find parts-based representations of high-dimensional data. In many applications, such as multichannel spectrogram processing or multiarray spectra analysis, the unknown features have a locally smooth temporal or spatial structure. In this paper, we incorporate into an NTF objective function additional smoothness constraints that considerably improve the estimated features. In our approach, we propose to use the Markov Random Field (MRF) model, which is commonly used in tomographic image reconstruction to model the local smoothness of 2D reconstructed images. We extend this model to the multidimensional case, so that smoothness can be enforced in all dimensions of a multi-dimensional array. We analyze different clique energy functions used in the MRF model. Numerical results obtained on a multidimensional image dataset are presented.

Keywords: Nonnegative Tensor Factorization (NTF), multiarray spectra analysis, Markov Random Field (MRF).

1 Introduction

Nonnegative Tensor Factorization (NTF) [1,2] is an extension of Nonnegative Matrix Factorization (NMF) [3] to nonnegative multi-dimensional arrays. This method has already found a variety of applications, e.g. in multidimensional signal and image processing. Similarly to NMF, NTF provides lower-rank sparse representations of nonnegative data. Moreover, spatial and temporal correlations between variables can be modeled more accurately with NTF than with 2D matrix factorizations.
Assuming that the variables of interest all have smooth profiles (along each dimension), we can improve NTF with additional smoothness constraints. This approach is justified by the observation that data representations usually have a locally smooth spatial and temporal structure, where the locality may be restricted to a few samples. Smoothness can be modeled in many ways, e.g. by an entropy measure or by the l2 norm of the estimated components. In our approach, we use the Markov Random Field (MRF) model, which is widely applied in image reconstruction and restoration. Such models, often expressed by a Gibbs prior, determine the local roughness (smoothness) of the analyzed image by considering pair-wise interactions among adjacent pixels in a given neighborhood of a single pixel. The total smoothness of an image can thus be expressed by a joint Gibbs distribution with a nonlinear energy function. In our approach, we use Green's function to measure the strength of the pair-wise pixel interactions. Using a Bayesian framework, we obtain a Gibbs-regularized cost function that is minimized with a gradient-descent alternating minimization technique subject to nonnegativity constraints, which can be imposed in many ways; one of them uses the standard multiplicative updates introduced by Lee and Seung [3].

[D.-S. Huang et al. (Eds.): ICIC 2008, LNCS 5226, (c) Springer-Verlag Berlin Heidelberg 2008]

Let A be a tensor of order N and dimension I_1 x I_2 x ... x I_{n-1} x I_n x I_{n+1} x ... x I_N. The n-mode product of the tensor A and the matrix B in R^{J_n x I_n} is the tensor A x_n B of dimension I_1 x I_2 x ... x I_{n-1} x J_n x I_{n+1} x ... x I_N [4]. For N = 3, the k-th frontal slice of the tensor A is the matrix A_k = A(:, :, k) in R^{I_1 x I_2}. Applying the forward-cycle matricizing by row-wise unfolding, the tensor A can be matricized as A_bar = [A_1, A_2, ..., A_k, ..., A_{I_3}] in R^{I_1 x I_2 I_3}.

The paper is organized as follows. The next section introduces the basic factorizations of nonnegative multi-dimensional arrays. Section 3 presents the main contribution, i.e. the incorporation of smoothness constraints into the selected models. The simulation experiments are given in Section 4. Finally, some conclusions are drawn in the last section.

2 NTF Models

We assume the observed data are represented by the 3D array Y in R_+^{I x T x K}, where R_+ denotes the nonnegative orthant (subspace of nonnegative real numbers) of R. There are many models for factorizing the array Y, which follow from the underlying physical phenomena.
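The n-mode product and the row-wise unfolding defined above can be sketched in NumPy as follows. The function names `mode_n_product` and `row_wise_unfold` are ours, and `n` is 0-based here, so the paper's 1-mode corresponds to `n = 0`:

```python
import numpy as np

def mode_n_product(A, B, n):
    """n-mode product A x_n B of a tensor A (shape I_1 x ... x I_N)
    and a matrix B (shape J_n x I_n): dimension I_n is replaced by J_n."""
    # Move mode n to the front, multiply the mode-n unfolding, move it back.
    A_unf = np.moveaxis(A, n, 0).reshape(A.shape[n], -1)
    C_unf = B @ A_unf
    new_shape = (B.shape[0],) + tuple(np.delete(A.shape, n))
    return np.moveaxis(C_unf.reshape(new_shape), 0, n)

def row_wise_unfold(A):
    """Row-wise unfolding: concatenate the frontal slices A(:, :, k)
    side by side, so shape (I1, I2, I3) becomes (I1, I2 * I3)."""
    I1, I2, I3 = A.shape
    return np.concatenate([A[:, :, k] for k in range(I3)], axis=1)
```

The `einsum` identity in the test below is a convenient way to check that the unfold-multiply-refold route really computes the n-mode product.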
Typical models [5,6] are as follows:

PARAFAC with nonnegativity constraints:

  Y = D x_1 A x_2 X^T,   (1)

where A = [a_ij] in R_+^{I x J}, X = [x_jt] in R_+^{J x T}, and D = [d_jjk] in R_+^{J x J x K} is a nonnegative tensor with diagonal frontal slices, i.e. for all k: D_k = diag{d_jj} in R^{J x J}.

NTF1:

  Y = X x_1 A,   (2)

where X = [x_jtk] in R_+^{J x T x K} is a nonnegative tensor of lateral sources, and A = [a_ij] in R_+^{I x J} is a nonnegative mixing matrix. In practice, NTF1 is an inexact factorization due to noisy perturbations, i.e.

  Y = X x_1 A + N,   (3)

where N in R^{I x T x K} is a tensor of noisy perturbations.

NTF2:

  Y = A x_2 X^T + N,   (4)
where A = [a_ijk] in R_+^{I x J x K} is a nonnegative tensor of weighting coefficients, X = [x_jt] in R_+^{J x T} is a nonnegative matrix of sources, and N in R^{I x T x K} is a tensor of noisy perturbations.

The objective is to estimate the multi-dimensional arrays D, X, A, and the matrices A and X, subject to nonnegativity constraints on all the entries, given the data tensor Y and possibly prior knowledge on the nature of the true components to be estimated or on the statistical distribution of the noisy disturbances in N. In this paper, we assume that the smoothness constraints apply only to the unknown sources, i.e. either to the matrix X or to the tensor X. The sources may be time- and space-varying, with smooth profiles in both respects. The prior knowledge on the sources is particularly strong when all the profiles of the sources are locally smooth, i.e. the tensor X tends to be smooth in all 3 dimensions. This happens very often when X models a series of smooth images with a smooth temporal profile. Thus, we restrict our considerations on smoothness constraints to the NTF1 model, since the application of these constraints to the matrix X in PARAFAC and NTF2 is straightforward. Furthermore, MRF-based smoothing has been successfully applied to NMF for the blind separation of nonnegative signals in [7] and for image processing in [8].

3 Smoothness in NTF

To estimate the factors A and X, we use a similar approach as in NMF [3]. A specific cost function D(Y || X x_1 A), which measures the distance between Y and X x_1 A, is minimized with an alternating minimization technique. Lee and Seung [3] were the first to propose two types of NMF algorithms: one minimizes the Euclidean distance, which is optimal for Gaussian additive noise, and the other minimizes the Kullback-Leibler (KL) divergence, which is suitable for Poisson noise.
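For reference, the two Lee-Seung multiplicative updates mentioned above can be sketched for the plain matrix case (the KL-divergence variant); `nmf_kl` and its parameters are our naming, not the paper's:

```python
import numpy as np

def nmf_kl(V, J, n_iter=200, seed=0, eps=1e-12):
    """Lee-Seung multiplicative updates minimizing the KL divergence
    D(V || W H) for a nonnegative matrix V (I x T), with inner rank J."""
    rng = np.random.default_rng(seed)
    I, T = V.shape
    W = rng.uniform(0.1, 1, (I, J))
    H = rng.uniform(0.1, 1, (J, T))
    for _ in range(n_iter):
        # W update: W <- W * ((V / WH) H^T) / (row sums of H)
        W *= ((V / (W @ H + eps)) @ H.T) / (H.sum(axis=1) + eps)
        # H update: H <- H * (W^T (V / WH)) / (column sums of W)
        H *= (W.T @ (V / (W @ H + eps))) / (W.sum(axis=0)[:, None] + eps)
    return W, H
```

These are exactly the baseline updates that the smoothness-penalized NTF1 algorithm of Section 3.2 generalizes.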
NMF algorithms that are optimal for many other distributions of additive noise can be found, e.g., in [9,10,11]. Assuming Poisson noise, which often occurs in image processing, D(Y || X x_1 A) is given by the KL divergence:

  D(Y || X x_1 A) = Sum_{itk} ( y_itk log(y_itk / z_itk) + z_itk - y_itk ),   (5)

where Y = [y_itk] and Z = [z_itk] = X x_1 A. Using the Gibbs prior [12] to model the total smoothness of X, and working within a Bayesian framework as in [7,8], we obtain the penalized KL divergence

  D(Y || X x_1 A) = Sum_{itk} ( y_itk log(y_itk / z_itk) + z_itk - y_itk ) + beta U(X),   (6)

where beta is a penalty parameter and U(X) is a total energy function that measures the total roughness (smoothness) of X.
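Equations (5)-(6) translate directly into code. In the sketch below (our naming, not the paper's), the energy function `U` is passed in as a callable so any penalty can be plugged in, and a small `eps` guards the logarithm:

```python
import numpy as np

def kl_divergence(Y, Z, eps=1e-12):
    """Generalized KL divergence sum( y log(y/z) + z - y ) over all entries."""
    return float(np.sum(Y * np.log((Y + eps) / (Z + eps)) + Z - Y))

def penalized_objective(Y, A, X, beta, U):
    """Penalized objective of Eq. (6): D(Y || X x_1 A) + beta * U(X),
    with Z = X x_1 A computed via the 1-mode product."""
    Z = np.einsum('ij,jtk->itk', A, X)
    return kl_divergence(Y, Z) + beta * U(X)
```

With an exact factorization and `beta = 0` (or a zero energy) the objective vanishes, which is a quick sanity check on the implementation.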
3.1 Markov Random Field Model

MRF models have been widely applied in many image reconstruction applications, especially in tomographic imaging. In our application, MRF models motivate the definition of the total energy function in (6). Thus

  U(X) = Sum_{jtk} Sum_{p in S^(j)} Sum_{q in S^(t)} Sum_{r in S^(k)} w_pqr^(jtk) psi(x_jtk - x_pqr, delta),   (7)

where S^(j), S^(t), and S^(k) are sets of indices of the entries in the neighborhood of x_jtk, w_pqr^(jtk) is a weighting factor between the entries x_jtk and x_pqr, delta is a scaling factor, and psi(xi, delta) is some potential function of xi and delta, which can take various forms. Exemplary potential functions are listed in Table 1.

Table 1. Potential functions

Author(s) (Name)            | Reference | Function psi(xi, delta)
(Gaussian)                  | [12,13]   | (xi/delta)^2
Besag (Laplacian)           | [13]      | |xi/delta|
Hebert and Leahy            | [14]      | delta log[1 + (xi/delta)^2]
Geman and McClure           | [15]      | (xi/delta)^2 / (1 + (xi/delta)^2)
Geman and Reynolds          | [16]      | |xi/delta| / (1 + |xi/delta|)
Stevenson and Delp (Huber)  | [17]      | min{(xi/delta)^2, 2|xi/delta| - 1}
Green                       | [18]      | delta log[cosh(xi/delta)]

The sets S^(j), S^(t), and S^(k), and the associated weighting factors w_pqr^(jtk), are usually defined by the MRF model. Taking into account the nearest neighborhood, we have S^(j) = {j-1, j, j+1}, S^(t) = {t-1, t, t+1}, and S^(k) = {k-1, k, k+1}. In consequence, w_pqr^(jtk) = 1 for entries adjacent along a horizontal or vertical line, w_pqr^(jtk) = 1/sqrt(2) for entries adjacent along a diagonal line, and w_pqr^(jtk) = 0 for p = j, q = t, r = k, and otherwise.
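Under the weighting just described, Eq. (7) with Green's potential from Table 1 can be sketched as follows. The naming is ours; following the two weights given in the text, offsets where all three indices differ get weight 0, and for very large |xi/delta| a numerically stable log-cosh would be needed to avoid overflow:

```python
import numpy as np
from itertools import product

def green_psi(xi, delta):
    """Green's potential psi(xi, delta) = delta * log(cosh(xi/delta))."""
    return delta * np.log(np.cosh(xi / delta))

def total_energy(X, delta):
    """Total MRF energy U(X) of Eq. (7) over a 3x3x3 neighbourhood:
    weight 1 for one-axis offsets, 1/sqrt(2) for two-axis (diagonal)
    offsets, 0 otherwise (centre and corners)."""
    U = 0.0
    n0, n1, n2 = X.shape
    for dj, dt, dk in product((-1, 0, 1), repeat=3):
        m = sum(o != 0 for o in (dj, dt, dk))
        w = 1.0 if m == 1 else (1 / np.sqrt(2) if m == 2 else 0.0)
        if w == 0.0:
            continue
        # Overlapping slabs: entries x_jtk paired with their offset neighbours.
        core = X[max(dj, 0):n0 + min(dj, 0),
                 max(dt, 0):n1 + min(dt, 0),
                 max(dk, 0):n2 + min(dk, 0)]
        nbr = X[max(-dj, 0):n0 + min(-dj, 0),
                max(-dt, 0):n1 + min(-dt, 0),
                max(-dk, 0):n2 + min(-dk, 0)]
        U += w * np.sum(green_psi(core - nbr, delta))
    return U
```

A constant tensor has zero energy (all pairwise differences vanish), while any variation makes the energy strictly positive, as a roughness measure should.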
3.2 Algorithm

Assuming the local stationarity of (6) with respect to A and X, and using the alternating minimization approach, NTF1 can be computed with the following algorithm:

Randomly initialize A^(0) and X^(0) with nonnegative real numbers, and set
  Y_bar = [Y_1, Y_2, ..., Y_K] in R^{I x TK}   % row-wise unfolding

For s = 1, 2, ..., until convergence, do:

  theta_jtk = Sum_{p in S^(j)} Sum_{q in S^(t)} Sum_{r in S^(k)} w_pqr^(jtk) d/dx_jtk psi(x_jtk^(s) - x_pqr^(s), delta),

  x_jtk^(s+1) = x_jtk^(s) * [ Sum_{i=1}^I a_ij^(s) y_itk / (Sum_{c=1}^J a_ic^(s) x_ctk^(s)) ] / [ Sum_{i=1}^I a_ij^(s) + beta theta_jtk ],

  X_bar^(s+1) = [X_1^(s+1), X_2^(s+1), ..., X_K^(s+1)] in R^{J x TK},   % unfolding

  a_ij^(s+1) = a_ij^(s) * [ Sum_{z=1}^{TK} x_jz^(s+1) y_bar_iz / (Sum_{c=1}^J a_ic^(s) x_cz^(s+1)) ] / [ Sum_{z=1}^{TK} x_jz^(s+1) ],

  a_ij^(s+1) <- a_ij^(s+1) / Sum_{i=1}^I a_ij^(s+1)   % normalization

End

Since Green's function [18] in Table 1 satisfies all the properties mentioned in [19], i.e. for a constant delta > 0 it is nonnegative, even, equal to 0 at xi = 0, strictly increasing for xi > 0, unbounded, convex, and has a bounded first derivative, we selected this function for our tests. Thus

  d/dx_jtk psi(x_jtk - x_pqr, delta) = tanh( (x_jtk - x_pqr) / delta ).

4 Numerical Tests

The proposed algorithm has been extensively tested on various benchmarks of nonnegative smooth and sparse signals and images. Four exemplary original 3D arrays of nonnegative smooth and focussed objects are illustrated in Fig. 1. The arrays are stored in a 4D tensor X. The mixtures Y are obtained by multiplying the tensor X in its 4-mode by a uniformly distributed random mixing matrix A.
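As a rough, simplified illustration of the updates in Sec. 3.2 applied to synthetic NTF1 mixtures of the kind described above, the following sketch uses only the six axis-aligned neighbours with unit weights for theta (rather than the full 3x3x3 clique), clips the denominator to stay positive, and replaces the slice-concatenation unfolding by a `reshape` (equivalent up to a column permutation). All names and default values are ours, not the paper's:

```python
import numpy as np

def ntf1_gibbs(Y, J, beta=0.1, delta=0.1, n_iter=100, seed=0, eps=1e-12):
    """Multiplicative NTF1 updates with a Gibbs (Green's function) smoothing
    term on the source tensor X; a simplified sketch, not the exact algorithm."""
    rng = np.random.default_rng(seed)
    I, T, K = Y.shape
    A = rng.uniform(0.1, 1, (I, J))
    X = rng.uniform(0.1, 1, (J, T, K))
    for _ in range(n_iter):
        # theta_jtk = sum over axis-aligned neighbours of tanh((x_jtk - x_pqr)/delta)
        theta = np.zeros_like(X)
        for axis in range(3):
            td = np.tanh(np.diff(X, axis=axis) / delta)
            hi = [(0, 0)] * 3; hi[axis] = (0, 1)   # term from the next neighbour
            lo = [(0, 0)] * 3; lo[axis] = (1, 0)   # term from the previous neighbour
            theta += np.pad(-td, hi) + np.pad(td, lo)
        # X update: KL multiplicative step with the smoothing term in the denominator
        Z = np.einsum('ij,jtk->itk', A, X)
        numer = np.einsum('ij,itk->jtk', A, Y / (Z + eps))
        denom = A.sum(axis=0)[:, None, None] + beta * theta
        X = X * numer / np.maximum(denom, eps)
        # A update on the unfolded matrices, then column normalization
        Xb, Yb = X.reshape(J, T * K), Y.reshape(I, T * K)
        A = A * ((Yb / (A @ Xb + eps)) @ Xb.T) / (Xb.sum(axis=1) + eps)
        A = A / (A.sum(axis=0, keepdims=True) + eps)
    return A, X
```

Setting `beta=0` reduces this to the standard row-unfolded NTF1 algorithm used as the baseline in the experiments below.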
Fig. 1. Volumetric slice plots of the 4 original 3D arrays of nonnegative smooth and focussed objects

Fig. 2. Volumetric slice plots of 2 selected 3D arrays from the 4D tensor of linear mixtures

To separate the original 3D arrays from the mixtures, we used the algorithm presented in Section 3.2. For beta = 0, the algorithm becomes a standard row-unfolded algorithm for NTF1 [5]. The quality of the separation is evaluated with the standard mean-SIR measure, comparing the estimated 3D arrays with
the respective original ones. The algorithm is initialized with random initial approximations of A^(0) and X^(0). Hence, a Monte Carlo (MC) analysis is applied to test the consistency (repeatability) of the results. Fig. 3 shows the histograms of 100 mean-SIR samples for both beta = 0 and beta = 0.2. The results demonstrate that the smoothness constraints considerably improve the repeatability of the results (the standard deviation of the histogram with beta = 0.2 is much smaller than with beta = 0). This is a promising result towards the uniqueness of NTF.

Fig. 3. Histograms of 100 mean-SIR samples generated with: (left) the standard algorithm for NTF1 (beta = 0); (right) our algorithm with Green's function, beta = 0.2 and delta = 0.001

5 Conclusions

In this paper, we derived a new algorithm for NTF1, which may be useful for the estimation of locally smooth signals and images in BSS applications. The algorithm exploits information on pair-wise interactions between adjacent pixels, motivated by MRF models used in tomographic image reconstruction. The proposed approach can be further extended with additional constraints or different updating rules. Another extension may concern the application of data-driven hyperparameter estimation techniques, especially for the regularization parameter.

References

1. Shashua, A., Hazan, T.: Non-negative tensor factorization with applications to statistics and computer vision. In: Proc. of the 22nd International Conference on Machine Learning, Bonn, Germany (2005)
2. Hazan, T., Polak, S., Shashua, A.: Sparse image coding using a 3D non-negative tensor factorization. In: International Conference on Computer Vision (ICCV) (2005)
3. Lee, D.D., Seung, H.S.: Learning the parts of objects by nonnegative matrix factorization. Nature 401 (1999)
4. Bader, B.W., Kolda, T.G.: Algorithm 862: MATLAB tensor classes for fast algorithm prototyping. ACM Trans. Math. Softw. 32 (2006)
5. Cichocki, A., Zdunek, R., Choi, S., Plemmons, R., Amari, S.I.: Novel multi-layer nonnegative tensor factorization with sparsity constraints. In: Beliczynski, B., Dzielinski, A., Iwanowski, M., Ribeiro, B. (eds.) ICANNGA 2007. LNCS, vol. 4432. Springer, Heidelberg (2007)
6. Cichocki, A., Zdunek, R., Choi, S., Plemmons, R., Amari, S.: Nonnegative tensor factorization using Alpha and Beta divergences. In: Proc. IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP 2007), Honolulu, Hawaii, USA, vol. III (2007)
7. Zdunek, R., Cichocki, A.: Gibbs regularized nonnegative matrix factorization for blind separation of locally smooth signals. In: 15th IEEE International Workshop on Nonlinear Dynamics of Electronic Systems (NDES 2007), Tokushima, Japan (2007)
8. Zdunek, R., Cichocki, A.: Blind image separation using nonnegative matrix factorization with Gibbs smoothing. In: Ishikawa, M., Doya, K., Miyamoto, H., Yamakawa, T. (eds.) ICONIP 2007, Part II. LNCS, vol. 4985. Springer, Heidelberg (2008)
9. Dhillon, I., Sra, S.: Generalized nonnegative matrix approximations with Bregman divergences. In: Neural Information Processing Systems, Vancouver, Canada (2005)
10. Cichocki, A., Zdunek, R., Amari, S.: Csiszar's divergences for non-negative matrix factorization: Family of new algorithms. In: Rosca, J.P., Erdogmus, D., Príncipe, J.C., Haykin, S. (eds.) ICA 2006. LNCS, vol. 3889. Springer, Heidelberg (2006)
11. Kompass, R.: A generalized divergence measure for nonnegative matrix factorization. Neural Computation 19 (2006)
12. Geman, S., Geman, D.: Stochastic relaxation, Gibbs distributions and the Bayesian restoration of images. IEEE Transactions on Pattern Analysis and Machine Intelligence PAMI-6 (1984)
13. Besag, J.: Toward Bayesian image analysis. J.
Appl. Stat. 16 (1989)
14. Hebert, T., Leahy, R.: A generalized EM algorithm for 3-D Bayesian reconstruction from Poisson data using Gibbs priors. IEEE Transactions on Medical Imaging 8 (1989)
15. Geman, S., McClure, D.: Statistical methods for tomographic image reconstruction. Bull. Int. Stat. Inst. LII-4, 5-21 (1987)
16. Geman, S., Reynolds, G.: Constrained restoration and the recovery of discontinuities. IEEE Trans. Pattern Anal. Machine Intell. 14 (1992)
17. Stevenson, R., Delp, E.: Fitting curves with discontinuities. In: Proc. 1st Int. Workshop on Robust Computer Vision, Seattle, Washington, USA (1990)
18. Green, P.J.: Bayesian reconstruction from emission tomography data using a modified EM algorithm. IEEE Trans. Medical Imaging 9 (1990)
19. Lange, K., Carson, R.: EM reconstruction algorithms for emission and transmission tomography. J. Comput. Assist. Tomogr. 8 (1984)
More information/16/$ IEEE 1728
Extension of the Semi-Algebraic Framework for Approximate CP Decompositions via Simultaneous Matrix Diagonalization to the Efficient Calculation of Coupled CP Decompositions Kristina Naskovska and Martin
More informationNonnegative Tensor Factorization using a proximal algorithm: application to 3D fluorescence spectroscopy
Nonnegative Tensor Factorization using a proximal algorithm: application to 3D fluorescence spectroscopy Caroline Chaux Joint work with X. Vu, N. Thirion-Moreau and S. Maire (LSIS, Toulon) Aix-Marseille
More informationRegularized NNLS Algorithms for Nonnegative Matrix Factorization with Application to Text Document Clustering
Regularized NNLS Algorithms for Nonnegative Matrix Factorization with Application to Text Document Clustering Rafal Zdunek Abstract. Nonnegative Matrix Factorization (NMF) has recently received much attention
More informationAutomated Segmentation of Low Light Level Imagery using Poisson MAP- MRF Labelling
Automated Segmentation of Low Light Level Imagery using Poisson MAP- MRF Labelling Abstract An automated unsupervised technique, based upon a Bayesian framework, for the segmentation of low light level
More informationENGG5781 Matrix Analysis and Computations Lecture 10: Non-Negative Matrix Factorization and Tensor Decomposition
ENGG5781 Matrix Analysis and Computations Lecture 10: Non-Negative Matrix Factorization and Tensor Decomposition Wing-Kin (Ken) Ma 2017 2018 Term 2 Department of Electronic Engineering The Chinese University
More informationData Mining and Matrices
Data Mining and Matrices 6 Non-Negative Matrix Factorization Rainer Gemulla, Pauli Miettinen May 23, 23 Non-Negative Datasets Some datasets are intrinsically non-negative: Counters (e.g., no. occurrences
More informationMarkov Random Fields
Markov Random Fields Umamahesh Srinivas ipal Group Meeting February 25, 2011 Outline 1 Basic graph-theoretic concepts 2 Markov chain 3 Markov random field (MRF) 4 Gauss-Markov random field (GMRF), and
More informationIndependent Component Analysis. Contents
Contents Preface xvii 1 Introduction 1 1.1 Linear representation of multivariate data 1 1.1.1 The general statistical setting 1 1.1.2 Dimension reduction methods 2 1.1.3 Independence as a guiding principle
More informationSOUND SOURCE SEPARATION BASED ON NON-NEGATIVE TENSOR FACTORIZATION INCORPORATING SPATIAL CUE AS PRIOR KNOWLEDGE
SOUND SOURCE SEPARATION BASED ON NON-NEGATIVE TENSOR FACTORIZATION INCORPORATING SPATIAL CUE AS PRIOR KNOWLEDGE Yuki Mitsufuji Sony Corporation, Tokyo, Japan Axel Roebel 1 IRCAM-CNRS-UPMC UMR 9912, 75004,
More informationNonnegative Matrix Factor 2-D Deconvolution for Blind Single Channel Source Separation
Nonnegative Matrix Factor 2-D Deconvolution for Blind Single Channel Source Separation Mikkel N. Schmidt and Morten Mørup Technical University of Denmark Informatics and Mathematical Modelling Richard
More informationDeep learning / Ian Goodfellow, Yoshua Bengio and Aaron Courville. - Cambridge, MA ; London, Spis treści
Deep learning / Ian Goodfellow, Yoshua Bengio and Aaron Courville. - Cambridge, MA ; London, 2017 Spis treści Website Acknowledgments Notation xiii xv xix 1 Introduction 1 1.1 Who Should Read This Book?
More informationFundamentals of Principal Component Analysis (PCA), Independent Component Analysis (ICA), and Independent Vector Analysis (IVA)
Fundamentals of Principal Component Analysis (PCA),, and Independent Vector Analysis (IVA) Dr Mohsen Naqvi Lecturer in Signal and Information Processing, School of Electrical and Electronic Engineering,
More informationDiscriminative Fields for Modeling Spatial Dependencies in Natural Images
Discriminative Fields for Modeling Spatial Dependencies in Natural Images Sanjiv Kumar and Martial Hebert The Robotics Institute Carnegie Mellon University Pittsburgh, PA 15213 {skumar,hebert}@ri.cmu.edu
More informationA Convex Cauchy-Schwarz Divergence Measure for Blind Source Separation
INTERNATIONAL JOURNAL OF CIRCUITS, SYSTEMS AND SIGNAL PROCESSING Volume, 8 A Convex Cauchy-Schwarz Divergence Measure for Blind Source Separation Zaid Albataineh and Fathi M. Salem Abstract We propose
More informationOn the Estimation of the Mixing Matrix for Underdetermined Blind Source Separation in an Arbitrary Number of Dimensions
On the Estimation of the Mixing Matrix for Underdetermined Blind Source Separation in an Arbitrary Number of Dimensions Luis Vielva 1, Ignacio Santamaría 1,Jesús Ibáñez 1, Deniz Erdogmus 2,andJoséCarlosPríncipe
More informationORTHOGONALITY-REGULARIZED MASKED NMF FOR LEARNING ON WEAKLY LABELED AUDIO DATA. Iwona Sobieraj, Lucas Rencker, Mark D. Plumbley
ORTHOGONALITY-REGULARIZED MASKED NMF FOR LEARNING ON WEAKLY LABELED AUDIO DATA Iwona Sobieraj, Lucas Rencker, Mark D. Plumbley University of Surrey Centre for Vision Speech and Signal Processing Guildford,
More informationNonnegative matrix factorization (NMF) for determined and underdetermined BSS problems
Nonnegative matrix factorization (NMF) for determined and underdetermined BSS problems Ivica Kopriva Ruđer Bošković Institute e-mail: ikopriva@irb.hr ikopriva@gmail.com Web: http://www.lair.irb.hr/ikopriva/
More informationRecurrent Latent Variable Networks for Session-Based Recommendation
Recurrent Latent Variable Networks for Session-Based Recommendation Panayiotis Christodoulou Cyprus University of Technology paa.christodoulou@edu.cut.ac.cy 27/8/2017 Panayiotis Christodoulou (C.U.T.)
More informationSEC: Stochastic ensemble consensus approach to unsupervised SAR sea-ice segmentation
2009 Canadian Conference on Computer and Robot Vision SEC: Stochastic ensemble consensus approach to unsupervised SAR sea-ice segmentation Alexander Wong, David A. Clausi, and Paul Fieguth Vision and Image
More informationarxiv: v4 [math.na] 10 Nov 2014
NEWTON-BASED OPTIMIZATION FOR KULLBACK-LEIBLER NONNEGATIVE TENSOR FACTORIZATIONS SAMANTHA HANSEN, TODD PLANTENGA, TAMARA G. KOLDA arxiv:134.4964v4 [math.na] 1 Nov 214 Abstract. Tensor factorizations with
More informationNon-negative matrix factorization with fixed row and column sums
Available online at www.sciencedirect.com Linear Algebra and its Applications 9 (8) 5 www.elsevier.com/locate/laa Non-negative matrix factorization with fixed row and column sums Ngoc-Diep Ho, Paul Van
More informationCPSC 340: Machine Learning and Data Mining. Sparse Matrix Factorization Fall 2018
CPSC 340: Machine Learning and Data Mining Sparse Matrix Factorization Fall 2018 Last Time: PCA with Orthogonal/Sequential Basis When k = 1, PCA has a scaling problem. When k > 1, have scaling, rotation,
More informationNonnegative matrix factorization with α-divergence
Nonnegative matrix factorization with α-divergence Andrzej Cichocki a, Hyekyoung Lee b, Yong-Deok Kim b, Seungjin Choi b, a Laboratory for Advanced Brain Signal Processing Brain Science Institute, RIKEN
More informationCity Research Online. Permanent City Research Online URL:
Benetos, E. & Kotropoulos, C. (2008). A tensor-based approach for automatic music genre classification. Paper presented at the EUSIPCO 2008: 16th European Signal Processing Conference, 25-29 Aug 2008,
More informationA variational radial basis function approximation for diffusion processes
A variational radial basis function approximation for diffusion processes Michail D. Vrettas, Dan Cornford and Yuan Shen Aston University - Neural Computing Research Group Aston Triangle, Birmingham B4
More informationNonnegative Tensor Factorization for Continuous EEG Classification a
POSTECH-MLG-2007-004 Machine Learning Group Department of Computer Science, POSTECH Nonnegative Tensor Factorization for Continuous EEG Classification a Hyekyoung Lee, Yong-Deok Kim, Andrzej Cichocki,
More informationMachine Learning (BSMC-GA 4439) Wenke Liu
Machine Learning (BSMC-GA 4439) Wenke Liu 02-01-2018 Biomedical data are usually high-dimensional Number of samples (n) is relatively small whereas number of features (p) can be large Sometimes p>>n Problems
More informationNON-NEGATIVE SPARSE CODING
NON-NEGATIVE SPARSE CODING Patrik O. Hoyer Neural Networks Research Centre Helsinki University of Technology P.O. Box 9800, FIN-02015 HUT, Finland patrik.hoyer@hut.fi To appear in: Neural Networks for
More informationIEICE TRANS. FUNDAMENTALS, VOL.Exx??, NO.xx XXXX 200x 1
IEICE TRANS. FUNDAMENTALS, VOL.Exx??, NO.xx XXXX x INVITED PAPER Special Section on Signal Processing Fast Local Algorithms for Large Scale Nonnegative Matrix and Tensor Factorizations Andrze CICHOCKI
More informationWHEN an object is represented using a linear combination
IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL 20, NO 2, FEBRUARY 2009 217 Discriminant Nonnegative Tensor Factorization Algorithms Stefanos Zafeiriou Abstract Nonnegative matrix factorization (NMF) has proven
More informationUsing Hankel structured low-rank approximation for sparse signal recovery
Using Hankel structured low-rank approximation for sparse signal recovery Ivan Markovsky 1 and Pier Luigi Dragotti 2 Department ELEC Vrije Universiteit Brussel (VUB) Pleinlaan 2, Building K, B-1050 Brussels,
More informationNote on Algorithm Differences Between Nonnegative Matrix Factorization And Probabilistic Latent Semantic Indexing
Note on Algorithm Differences Between Nonnegative Matrix Factorization And Probabilistic Latent Semantic Indexing 1 Zhong-Yuan Zhang, 2 Chris Ding, 3 Jie Tang *1, Corresponding Author School of Statistics,
More informationEUSIPCO
EUSIPCO 213 1569744273 GAMMA HIDDEN MARKOV MODEL AS A PROBABILISTIC NONNEGATIVE MATRIX FACTORIZATION Nasser Mohammadiha, W. Bastiaan Kleijn, Arne Leijon KTH Royal Institute of Technology, Department of
More information1 Introduction Independent component analysis (ICA) [10] is a statistical technique whose main applications are blind source separation, blind deconvo
The Fixed-Point Algorithm and Maximum Likelihood Estimation for Independent Component Analysis Aapo Hyvarinen Helsinki University of Technology Laboratory of Computer and Information Science P.O.Box 5400,
More informationBias-Variance Trade-Off in Hierarchical Probabilistic Models Using Higher-Order Feature Interactions
- Trade-Off in Hierarchical Probabilistic Models Using Higher-Order Feature Interactions Simon Luo The University of Sydney Data61, CSIRO simon.luo@data61.csiro.au Mahito Sugiyama National Institute of
More informationComplete Blind Subspace Deconvolution
Complete Blind Subspace Deconvolution Zoltán Szabó Department of Information Systems, Eötvös Loránd University, Pázmány P. sétány 1/C, Budapest H-1117, Hungary szzoli@cs.elte.hu http://nipg.inf.elte.hu
More informationCS598 Machine Learning in Computational Biology (Lecture 5: Matrix - part 2) Professor Jian Peng Teaching Assistant: Rongda Zhu
CS598 Machine Learning in Computational Biology (Lecture 5: Matrix - part 2) Professor Jian Peng Teaching Assistant: Rongda Zhu Feature engineering is hard 1. Extract informative features from domain knowledge
More informationThe Information Bottleneck Revisited or How to Choose a Good Distortion Measure
The Information Bottleneck Revisited or How to Choose a Good Distortion Measure Peter Harremoës Centrum voor Wiskunde en Informatica PO 94079, 1090 GB Amsterdam The Nederlands PHarremoes@cwinl Naftali
More informationA Generative Perspective on MRFs in Low-Level Vision Supplemental Material
A Generative Perspective on MRFs in Low-Level Vision Supplemental Material Uwe Schmidt Qi Gao Stefan Roth Department of Computer Science, TU Darmstadt 1. Derivations 1.1. Sampling the Prior We first rewrite
More informationFactorization of Seperable and Patterned Covariance Matrices for Gibbs Sampling
Monte Carlo Methods Appl, Vol 6, No 3 (2000), pp 205 210 c VSP 2000 Factorization of Seperable and Patterned Covariance Matrices for Gibbs Sampling Daniel B Rowe H & SS, 228-77 California Institute of
More informationSparse Sensing in Colocated MIMO Radar: A Matrix Completion Approach
Sparse Sensing in Colocated MIMO Radar: A Matrix Completion Approach Athina P. Petropulu Department of Electrical and Computer Engineering Rutgers, the State University of New Jersey Acknowledgments Shunqiao
More information