Blind Source Separation Using Artificial Immune System

American Journal of Engineering Research (AJER)
e-ISSN: 2320-0847  p-ISSN: 2320-0936
Volume-03, Issue-02, pp-240-247
www.ajer.org
Research Paper — Open Access

Blind Source Separation Using Artificial Immune System

1 Sunil Kumar S, 2 Arun J S, 3 George Mathew, 4 Varadharaju G
1,2,3,4 PG Students, Department of Electronics and Communication, Akshaya College of Engineering and Technology, Coimbatore, Tamil Nadu

Abstract: Independent component analysis (ICA) is a computational method for solving blind source separation. Blind source separation is the separation of source signals from received signals without any prior knowledge of the sources. ICA looks for components that are statistically independent and non-Gaussian. The most widely used algorithm is FastICA, which is based on an approximate Newton iteration. FastICA has two main disadvantages: the order of the independent components is difficult to determine, and the appropriate source signals are not always isolated. To overcome these disadvantages, an improved ICA algorithm based on the artificial immune system (AIS) is used. AIS is an attractive technique with advantages over other heuristics: it is easy to implement, it has a strong capability of escaping local optima, and it can provide better signal separation than FastICA. The goal of the proposed AIS-ICA algorithm is to use AIS to determine the separating matrix of ICA, by means of which the order of the independent components is determined and the appropriate source signals are isolated. The artificial immune system is modelled on the principles of the natural immune system and offers strong, robust information-processing capabilities for solving complex problems. The variant used here is based on the clonal selection principle (CLONALG), which can be used to optimise functions. The steps of CLONALG are initialisation, cloning, mutation and selection. In initialisation, the population is generated randomly and the function to be maximised is selected. In cloning, the best antibodies are cloned the most. In mutation, the worst antibodies are mutated the most. In the final step, selection, the best antibodies are carried over to the next iteration. CLONALG captures the basic principles of the immune response to an antigenic stimulus; it is a population-based algorithm whose only variation operator is mutation.

Keywords: Independent component analysis, artificial immune system, signal separation, heuristic algorithm

I. INTRODUCTION
Blind source separation (BSS) is the separation of sources from received signals without any prior knowledge of the source signals. Problems related to BSS have become an active research area in statistical signal processing and unsupervised neural learning. Independent component analysis (ICA) is one of the most widely used methods for BSS. The goal of ICA is to recover independent sources given only sensor observations that are unknown linear mixtures of the unobserved independent source signals. ICA has been investigated extensively in image processing, financial time-series analysis and statistical process control. Many effective ICA algorithms have been proposed; the most widely used is the FastICA algorithm, which employs an approximate Newton iteration.
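To make the observation model concrete, here is a minimal sketch (not the authors' code; the two sources, the mixing matrix and all numeric values are assumptions chosen only for illustration) of the linear mixing x(t) = A s(t) that ICA seeks to invert:

```python
import numpy as np

# Illustration of the BSS observation model x(t) = A s(t).
rng = np.random.default_rng(0)
t = np.arange(1000)

# Two statistically independent, non-Gaussian sources (arbitrary example choices).
s1 = np.sign(np.sin(0.05 * t))              # square-like wave
s2 = rng.laplace(size=t.size)               # heavy-tailed noise
S = np.vstack([s1, s2])                     # sources, shape (2, N)

A = np.array([[1.0, 0.6],                   # unknown mixing matrix (assumed here)
              [0.4, 1.0]])
X = A @ S                                   # observed mixtures, shape (2, N)

# BSS/ICA: given only X, estimate a demixing matrix W such that Y = W @ X
# recovers the sources up to permutation and scaling.
```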
In practical applications, however, FastICA often converges to a local minimum, so the appropriate source signals are not isolated; moreover, the order of the independent components (ICs) is difficult to determine. These two problems are the main drawbacks of the FastICA algorithm. To overcome them, an improved ICA algorithm based on the artificial immune system (AIS), called AIS-ICA, is presented. AIS algorithms are one of many families of algorithms inspired by biological systems, alongside evolutionary algorithms, swarm intelligence, neural networks and membrane computing. They take their inspiration from the human immune system, and research within AIS to date has focused primarily on the theories of immune networks, clonal selection and negative selection. AIS is based on the principles of the natural immune system, and it offers strong and robust information-processing capabilities for solving complex problems.

AIS has been applied to data analysis, scheduling, classification, fault detection and the security of information systems. Because AIS has advantages over other heuristic techniques, in particular that it is easy to implement and has a strong capability of escaping local optima, it is used in this study to develop the AIS-ICA algorithm.

II. METHODOLOGY

Fig 1: Block diagram of the existing work (FastICA): data centering → principal component analysis → data whitening → fixed-point iteration for one unit → separated signal.

DATA CENTERING
The input data X is centered by computing the mean of every component of X and subtracting that mean. Centering is defined as the difference between the data vector X and its mean:
Xc = X − E{X}    (1)

PRINCIPAL COMPONENT ANALYSIS
PCA is one of the preprocessing tools for ICA. It is used to find the eigenvalues and eigenvectors needed to whiten the given data. Its drawback is that it relies only on correlation (second-order statistics).

DATA WHITENING
The goal of whitening is to transform the data vector linearly so that the new data vector is white, i.e. its components are uncorrelated and have unit variance (the covariance matrix equals the identity matrix). Whitening can be done with the eigenvalue decomposition of the covariance matrix, E D E^T, where E is the matrix of eigenvectors, D is the diagonal matrix of eigenvalues and E^T is the transpose of E. Since the eigenvalues enter component-wise, the whitened data can be written as
Z = E D^(−1/2) E^T Xc

FASTICA ALGORITHM FOR ONE UNIT
The one-unit FastICA algorithm estimates one row of the demixing matrix as an extremum of a contrast function. It maximises non-Gaussianity as a proxy for statistical independence. Non-Gaussianity can be measured in two ways: by the kurtosis contrast function or by negentropy. Kurtosis is classically defined as
kurt(b) = E{b^4} − 3 (E{b^2})^2    (2)
If b is assumed to have zero mean and unit variance, the right-hand side simplifies to E{b^4} − 3. This shows that kurtosis is simply a normalised version of the fourth moment E{b^4}. For a Gaussian b, the fourth moment equals 3 (E{b^2})^2; thus kurtosis is zero for a Gaussian random variable and non-zero for most non-Gaussian random variables.
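A compact sketch of the preprocessing chain and the kurtosis contrast described above, assuming NumPy and mixtures X shaped (components, samples); the function names are illustrative, not from the paper:

```python
import numpy as np

def center(X):
    """Subtract the mean of every component: Xc = X - E{X} (eq. 1)."""
    return X - X.mean(axis=1, keepdims=True)

def whiten(Xc):
    """Whiten via the eigenvalue decomposition of the covariance: Z = E D^(-1/2) E^T Xc."""
    d, E = np.linalg.eigh(np.cov(Xc))        # eigenvalues d, eigenvectors E
    return E @ np.diag(1.0 / np.sqrt(d)) @ E.T @ Xc

def kurtosis(b):
    """Kurtosis contrast kurt(b) = E{b^4} - 3 (E{b^2})^2 (eq. 2); zero for Gaussian b."""
    return np.mean(b ** 4) - 3.0 * np.mean(b ** 2) ** 2

# Example use, with X as in the mixing sketch above:
# Z = whiten(center(X))
# w = np.random.randn(Z.shape[0]); w /= np.linalg.norm(w)
# print(kurtosis(w @ Z))   # moves away from 0 as the projection w @ Z becomes non-Gaussian
```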

Unlike kurtosis, negentropy is based on the information-theoretic quantity of differential entropy. Entropy is a measure of the average uncertainty in a random variable. The differential entropy H of a random variable b with density p_b is defined as
H(b) = −∫ p_b(η) log p_b(η) dη
According to a fundamental result of information theory, a Gaussian variable has the largest entropy among all random variables of equal variance. To obtain a measure of non-Gaussianity, the negentropy J is therefore defined as
J(b) = H(b_gauss) − H(b)
where b_gauss is a Gaussian variable with the same variance as b. Negentropy is always non-negative and is zero if and only if b has a Gaussian distribution. Since negentropy is very difficult to compute, the following approximation is used:
J(b) ≈ [E{G(b)} − E{G(ν)}]^2    (3)
where ν is a Gaussian variable with zero mean and unit variance, b is a random variable with zero mean and unit variance, and G is a non-quadratic function, here chosen as G(b) = b^4.

Fig 2: Block diagram of the proposed work (AIS-ICA): data centering → principal component analysis → data whitening → initialisation → cloning → hypermutation → selection → signal separation.

The data centering, principal component analysis and data whitening steps of the proposed work are identical to those of the existing work described above.
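Before turning to the clonal selection steps, here is a hedged sketch of how the negentropy approximation (3) above can be turned into a fitness score for a candidate demixing row on whitened data Z; the names `negentropy_approx` and `fitness` are illustrative, not from the paper:

```python
import numpy as np

def negentropy_approx(b):
    """J(b) ~ [E{G(b)} - E{G(nu)}]^2 (eq. 3) with G(u) = u^4.
    For a zero-mean, unit-variance Gaussian nu, E{nu^4} = 3, so that term is a constant."""
    return (np.mean(b ** 4) - 3.0) ** 2

def fitness(w, Z):
    """Score a candidate demixing row w on whitened data Z; larger means more non-Gaussian."""
    w = w / np.linalg.norm(w)   # unit norm keeps the projection at unit variance on whitened data
    return negentropy_approx(w @ Z)
```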

INITIALISATION
The population of candidate solutions (antibodies) is initialised randomly, and the objective function to be maximised or minimised is selected.

CLONING
An antibody's affinity corresponds to its evaluation of the objective function given by the antigen. The antibodies are cloned, the best antibody being cloned the most and the worst the least.

HYPERMUTATION
The clones are mutated in inverse proportion to their affinity: the best antibody's clones are mutated the least and the worst antibody's clones are mutated the most. The mutation can be uniform, Gaussian or exponential.

SELECTION
This is the final step of the CLONALG algorithm: the best antibodies are selected for the next iteration. A minimal code sketch of these four steps applied to the demixing problem is given after the simulated source signals below.

III. SIMULATION RESULTS
The input signals are passed through a mixing matrix, producing mixed signals; the proposed AIS-ICA and the FastICA algorithm are then used to separate the original signals. Artificially generated mixtures are used to evaluate the effectiveness of the proposed AIS-ICA algorithm against FastICA. Six simulated signals are used as input source signals:
a. Modulated sinusoid: P(t) = 2*sin(t/149)*cos(t/8) + 0.2*rand()
b. Square wave: Q(t) = sign(sin(12*t + 9*cos(2/29))) + 0.1*rand()
c. Sawtooth: R(t) = (rem(t, 79) - 17)/23 + 0.1*rand()
d. Impulsive curve: S(t) = ((rem(t, 23) - 11)/9)^5 + 0.1*rand()
e. Exponential: U(t) = 5*exp(-t/121)*cos(37*t) + 0.1*rand()
f. Spiky noise
Here rem(x, y) returns the remainder after division, a value between 0 and sign(x)*abs(y); if y is zero, rem returns NaN. The rand function generates uniformly distributed pseudo-random numbers.

Fig 3: Block diagram of separating the six signals, where s1, s2, s3, s4, s5 and s6 denote the separated signals.
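As noted above, the following is a minimal, hedged sketch of the initialisation-cloning-hypermutation-selection loop applied to the search for one demixing row on whitened data Z, reusing the illustrative `fitness` function from the earlier sketch; the population size, clone counts and mutation scale are arbitrary assumptions, not values from the paper:

```python
import numpy as np

def clonalg_demix_row(Z, fitness, pop_size=20, n_generations=50,
                      clone_factor=2, sigma=0.3, seed=0):
    """Clonal-selection search for one unit-norm demixing row w that maximises fitness(w, Z)."""
    rng = np.random.default_rng(seed)
    n = Z.shape[0]

    # Initialisation: a random antibody population of candidate rows.
    pop = rng.standard_normal((pop_size, n))
    pop /= np.linalg.norm(pop, axis=1, keepdims=True)

    for _ in range(n_generations):
        affinity = np.array([fitness(w, Z) for w in pop])
        order = np.argsort(-affinity)                    # best antibodies first
        pop, affinity = pop[order], affinity[order]

        clones = []
        for rank, w in enumerate(pop):
            # Cloning: the best antibody is cloned the most.
            n_clones = max(1, int(round(clone_factor * pop_size / (rank + 1))))
            # Hypermutation: worse antibodies receive larger mutations.
            scale = sigma * (rank + 1) / pop_size
            for _ in range(n_clones):
                c = w + scale * rng.standard_normal(n)
                clones.append(c / np.linalg.norm(c))

        clone_affinity = np.array([fitness(c, Z) for c in clones])

        # Selection: keep the best pop_size antibodies for the next iteration.
        combined = np.vstack([pop, np.asarray(clones)])
        combined_affinity = np.concatenate([affinity, clone_affinity])
        pop = combined[np.argsort(-combined_affinity)[:pop_size]]

    return pop[0]    # best candidate demixing row found
```

Further rows of the demixing matrix can be obtained in the same way, with a decorrelation step against the rows already found, as in deflationary ICA schemes.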

The input signal waveforms are shown below.

Fig 6: Input signals.

The first to sixth inputs are the modulated sinusoid, square wave, sawtooth wave, impulsive wave, exponential and spiky noise, respectively. The input signals are passed through a 6 × 6 mixing matrix, which generates the mixed signals. The mixed signals are then passed through the FastICA and AIS-ICA algorithms, by means of which the source signals are separated. The mixed-signal waveforms are shown below.

Fig 7: Mixed signals.
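For reference, here is a hedged NumPy sketch of how the six simulated sources and their 6 × 6 mixture could be generated. It follows the formulas listed above, but the signal length, the random mixing matrix and the stand-in formula for the spiky-noise source (which the paper does not specify) are assumptions made only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500
t = np.arange(1, N + 1, dtype=float)
u = lambda: rng.random(N)            # uniform pseudo-random numbers, like MATLAB's rand()

# Sources a-e follow the formulas given above; the sixth source (spiky noise)
# is not specified in the paper, so a generic spiky stand-in is used here.
P = 2 * np.sin(t / 149) * np.cos(t / 8) + 0.2 * u()             # a. modulated sinusoid
Q = np.sign(np.sin(12 * t + 9 * np.cos(2 / 29))) + 0.1 * u()    # b. square wave
R = (np.remainder(t, 79) - 17) / 23 + 0.1 * u()                 # c. sawtooth
S = ((np.remainder(t, 23) - 11) / 9) ** 5 + 0.1 * u()           # d. impulsive curve
U = 5 * np.exp(-t / 121) * np.cos(37 * t) + 0.1 * u()           # e. exponential
V = np.where(u() < 0.5, 1.0, -1.0) * np.log(u())                # f. spiky noise (stand-in)

sources = np.vstack([P, Q, R, S, U, V])          # shape (6, N)

A = rng.random((6, 6)) + 0.1 * np.eye(6)         # an arbitrary 6 x 6 mixing matrix (assumed)
mixed = A @ sources                              # the mixed signals fed to FastICA / AIS-ICA
```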

The signals separated by FastICA and by AIS-ICA are shown below.

Figure 8: FastICA.

Figure 9: AIS-ICA.

It can be seen that FastICA does not separate the independent components clearly, whereas the proposed AIS-ICA separates them clearly. That is, the source signals separated by the AIS-ICA algorithm are more accurate than those separated by the FastICA algorithm.
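The comparison above is visual; as an additional, hedged sanity check (not part of the paper), each separated component can be correlated with each true source, since ICA recovers sources only up to permutation, sign and scale. The function name is illustrative:

```python
import numpy as np

def match_matrix(separated, sources):
    """Absolute correlation between every separated component and every true source.
    Good separation shows one value close to 1 in each row and each column."""
    k = sources.shape[0]
    C = np.corrcoef(np.vstack([separated, sources]))[:k, k:]
    return np.abs(C)

# Example: print(np.round(match_matrix(estimated, sources), 2))
```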

Figure 11: FastICA, first round.

Fig 9(a): AIS-ICA, first round.

From these rounds of FastICA and AIS-ICA, the following can be inferred: AIS-ICA provides better separation results than FastICA, and the order of the ICs can be easily determined by AIS-ICA.

IV. CONCLUSION AND FUTURE WORK
Independent component analysis separates mixed signals blindly, without any information about the mixing system. FastICA is the most popular gradient-based ICA algorithm, but it has two disadvantages: the order of the ICs is difficult to determine, and it easily converges to a local minimum. To overcome these two disadvantages, an improved ICA algorithm based on AIS is used; in the proposed AIS-ICA algorithm, AIS determines the demixing matrix of ICA. Simulation results on the artificial signal data show that the AIS-ICA algorithm provides better separation results than the FastICA algorithm, and that the order of the ICs can be determined by the proposed AIS-ICA algorithm.

4.1 FUTURE WORK
AIS-ICA can be applied to problems in classification, fault detection and the security of information systems. Open issues include the control of the growth of memory cells and the fact that mutation and cloning are performed randomly, so a particle swarm optimisation based classifier is proposed to attain an optimal value. Future work also aims at EEG signal classification, by means of which normal and abnormal signals can be classified with high accuracy.