Nonlinear Blind Source Separation Using Hybrid Neural Networks*


Chun-Hou Zheng 1,2, Zhi-Kai Huang 1,2, Michael R. Lyu 3, and Tat-Ming Lok 4

1 Intelligent Computing Lab, Institute of Intelligent Machines, Chinese Academy of Sciences, P.O. Box 1130, Hefei, Anhui, China
2 Department of Automation, University of Science and Technology of China
3 Computer Science & Engineering Dept., The Chinese University of Hong Kong, Hong Kong
4 Information Engineering Dept., The Chinese University of Hong Kong, Shatin, Hong Kong
zhengch@iim.ac.cn

Abstract. This paper proposes a novel algorithm, based on minimizing mutual information, for a special case of nonlinear blind source separation: post-nonlinear blind source separation. A network composed of a set of radial basis function (RBF) networks, a set of multilayer perceptrons and a linear network is used as a demixing system to separate sources in post-nonlinear mixtures. The experimental results show that our proposed method is effective, and they also show that the local character of the RBF network's units allows a significant speedup in the training of the system.

1 Introduction

Blind source separation (BSS) in instantaneous and convolutive linear mixtures has been intensively studied over the last decade. Most blind separation algorithms are based on the theory of independent component analysis (ICA) when the mixture model is linear [1,2]. However, in general real-world situations, nonlinear mixtures of signals are more prevalent. For nonlinear demixing [6,7], many difficulties arise, and linear ICA is no longer applicable because of the complexity of the nonlinear parameters. In this paper, we investigate in depth a special but important instance of nonlinear mixtures, i.e., post-nonlinear (PNL) mixtures, and give a novel algorithm for it.

2 Post-nonlinear Mixtures

An important special case of the general nonlinear mixing model, the so-called post-nonlinear mixtures introduced by Taleb and Jutten [5], can be seen as a hybrid of a linear stage followed by a nonlinear stage.

* This work was supported by the National Science Foundation of China (Nos. 6472, 37368 and 642).

J. Wang et al. (Eds.): ISNN 2006, LNCS 3971, pp. 65-70, 2006. © Springer-Verlag Berlin Heidelberg 2006
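The PNL model just described, a linear mixing stage followed by component-wise nonlinear distortions, can be sketched numerically as follows. This is a minimal illustration, not the paper's code; the names (`make_pnl_mixture`, the toy sources, the example matrix) are assumptions chosen for the demo.

```python
import numpy as np

def make_pnl_mixture(S, A, nonlinearities):
    """Apply the PNL model: x_i = f_i(sum_j a_ij * s_j)."""
    U = A @ S                                   # linear mixing stage
    X = np.vstack([f(u) for f, u in zip(nonlinearities, U)])
    return X

# Two toy sources: a sinusoid and a sawtooth-like signal.
t = np.arange(1000)
S = np.vstack([np.sin(2 * np.pi * t / 50),
               (t % 27) / 27.0 - 0.5])

A = np.array([[1.0, 0.6],
              [0.5, 1.0]])                      # example mixing matrix
f = [np.tanh, np.tanh]                          # component-wise distortions

X = make_pnl_mixture(S, A, f)
print(X.shape)                                  # one mixture per source
```

Each observation channel is a nonlinearly distorted linear combination of the sources, exactly the structure the separating system must invert.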

Fig. 1. The mixing-separating system for PNL

In the post-nonlinear mixture model, the observations x = (x_1, x_2, \ldots, x_n)^T have the following specific form (as shown in Fig. 1):

x_i = f_i\Big( \sum_{j=1}^{n} a_{ij} s_j \Big), \quad i = 1, \ldots, n    (1)

The corresponding vector-matrix form can be written as:

x = f(As)    (2)

Contrary to general nonlinear mixtures, PNL mixtures have a favorable separability property. In fact, if the corresponding separating model for post-nonlinear mixtures, as shown in Fig. 1, is written as:

y_i = \sum_{j=1}^{n} b_{ij} g_j(x_j)    (3)

then it can be demonstrated [5] that, under weak conditions on the mixing matrix A and on the source distributions, output independence can be obtained if and only if the h_i = g_i \circ f_i, i = 1, \ldots, n, are linear. For more details, please refer to [5].

3 Contrast Function

In this paper, we use Shannon's mutual information as the measure of mutual dependence. It can be defined as:

I(y) = \sum_{i=1}^{n} H(y_i) - H(y)    (4)

where H(y) = -\int p(y) \log p(y) \, dy denotes Shannon's differential entropy. According to the theory given above, the separating system for PNL that we propose in this paper is shown in Fig. 2, where B and the g_i form the unmixing structure for PNL, the y_i are the extracted independent components, and the \psi_i are nonlinear mappings used only for the optimization of the network.
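The separability property above can be checked numerically: in the separating model y = B g(x), if each g_i inverts f_i and B inverts A, then each h_i = g_i \circ f_i is linear (here the identity) and the sources are recovered exactly. The choice of tanh as the distortion and arctanh as its inverse is illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
S = rng.uniform(-0.5, 0.5, size=(2, 500))       # bounded toy sources
A = np.array([[0.8, 0.3],
              [0.2, 0.9]])                      # example mixing matrix

X = np.tanh(A @ S)                              # PNL mixture, f_i = tanh
Y = np.linalg.inv(A) @ np.arctanh(X)            # g_i = arctanh, B = A^{-1}

err = np.max(np.abs(Y - S))
print(err)                                      # recovery error near machine precision
```

With any other g_i, the composition h_i would be nonlinear and the outputs could not all be made independent, which is what Taleb and Jutten's condition captures.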

Assume that each function \psi_i(\phi_i, y_i) is the cumulative probability function (CPF) of the corresponding component y_i; then the z_i are uniformly distributed in [0, 1], and consequently H(z_i) = 0 [7]. Moreover, because the \psi_i(\phi_i, y_i) are all continuous and monotonically increasing transformations (thus also invertible), it can easily be shown that I(z) = I(y) [7]. Consequently, we obtain

I(y) = I(z) = \sum_{i=1}^{n} H(z_i) - H(z) = -H(z)    (5)

Therefore, maximizing H(z) is equivalent to minimizing I(y).

Fig. 2. The particular structure of the unmixing network

It has been proved in the literature [7] that, given the constraints placed on \psi_i(\phi_i, y_i) (the z_i are bounded to [0, 1], and each \psi_i(\phi_i, y_i) is constrained to be a continuous increasing function), maximizing H(z) will lead the \psi_i(\phi_i, y_i) to become estimates of the CPFs of the y_i. Consequently, y should be a duplicate of s up to sign and scale ambiguities. Now, the fundamental problem to solve is to optimize the network (formed by the g, B and \psi blocks) by maximizing H(z).

4 Unsupervised Learning of the Separating System

With respect to the separation structure of this paper, the joint probability density function (PDF) of the output vector z can be calculated as:

p(z) = \frac{p(x)}{|\det(B)| \prod_{i=1}^{n} g_i'(\theta_i, x_i) \prod_{i=1}^{n} \psi_i'(\phi_i, y_i)}    (6)

which leads to the following expression for the joint entropy:

H(z) = H(x) + \log|\det(B)| + \sum_{i=1}^{n} E\big[\log g_i'(\theta_i, x_i)\big] + \sum_{i=1}^{n} E\big[\log \psi_i'(\phi_i, y_i)\big]    (7)
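The key fact behind equation (5), that passing a component through its own CPF yields a uniform variable with zero marginal entropy, can be illustrated with an empirical CPF. Here the rank transform stands in for the parametric \psi networks of the paper; this is a sketch of the principle, not of the method's optimization.

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.standard_normal(10000)          # an arbitrary output component

# Empirical CPF: the fraction of samples lying below each point,
# computed via the rank transform (double argsort gives ranks).
ranks = np.argsort(np.argsort(y))
z = (ranks + 0.5) / y.size              # z = psi(y), uniform on (0, 1)

# Uniformity check: a 10-bin histogram of z should be exactly flat,
# since the ranks are a permutation of 0..9999.
hist, _ = np.histogram(z, bins=10, range=(0.0, 1.0))
print(hist)
```

Because each z_i is uniform on [0, 1], H(z_i) = 0, so minimizing I(y) reduces to maximizing the joint entropy H(z), with no marginal entropies to estimate.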

The minimization of I(y), which here is equivalent to maximizing H(z), requires the computation of its gradient with respect to the separation structure parameters B, \theta and \phi. In this paper, we use RBF networks [3,4] to model the nonlinear parametric functions g_k(\theta_k, x_k), and choose the Gaussian kernel function as the activation function of the hidden neurons. In order to implement the constraints on the \psi functions easily, we use multilayer perceptrons to model the nonlinear parametric functions \psi_k(\phi_k, y_k).

5 Experiment Results

5.1 Extracting Sources from Mixtures of Simulated Signals

In the first experiment, the source signals consist of a sinusoid signal and a "funny curve" signal [1], i.e.

s(t) = \big[ (\mathrm{rem}(t, 27) - 13)/9, \; ((\mathrm{rem}(t, 23) - 11)/9)^5 \big]^T,

which are shown in Fig. 3(a). The two source signals are first linearly mixed with the (randomly chosen) mixture matrix:

A = \begin{pmatrix} -.389 & .38 \\ .464 & -.22 \end{pmatrix}    (8)

Then, the two nonlinear distortion functions

f_1(u) = f_2(u) = \tanh(u)    (9)

are applied to each mixture to produce a PNL mixture. Fig. 3(b) shows the separated signals. To compare the performance of our proposed method with others, we also used the MISEP method [7] to conduct the related experiments on the same data. The correlations between the two recovered signals separated by the two methods and the two original sources are reported in Table 1. Clearly, the signals separated using the method proposed in this paper are more similar to the original signals than the others.

Fig. 3. The two sets of signals. (a) Source signals. (b) Separated signals.
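A minimal sketch of the RBF block used to model each nonlinearity g_k: a one-dimensional input, Gaussian kernels in the hidden layer, and a linear output. The parameter names (`centers`, `widths`, `weights`) are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def rbf_forward(x, centers, widths, weights, bias=0.0):
    """g(x) = bias + sum_j w_j * exp(-(x - c_j)^2 / (2 s_j^2))."""
    phi = np.exp(-(x[:, None] - centers[None, :]) ** 2
                 / (2.0 * widths[None, :] ** 2))
    return bias + phi @ weights

centers = np.linspace(-1.0, 1.0, 7)      # kernel centers across the input range
widths = np.full(7, 0.4)                 # shared kernel width
weights = np.zeros(7)
weights[3] = 1.0                         # activate only the middle unit (center 0.0)

x = np.array([0.0, 0.4, 2.0])
y_out = rbf_forward(x, centers, widths, weights)
print(y_out)
```

The localized response is visible here: an input far from every center (x = 2.0) produces a nearly zero output, so each training sample updates only a few hidden units. This local character is what the paper credits for the faster training reported in Section 5.3.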

Table 1. Correlations between the two original sources and the two recovered signals

                            simulated signals        speech signals
  Experiment                y_1        y_2           y_1        y_2
  MISEP              s_1    .98        .27           .94        .28
                     s_2    .3         .9879         .829       .9639
  Method in          s_1    .993       .84           .997       .73
  this paper         s_2    .83        .99           .72        .97

5.2 Extracting Sources from Mixtures of Speech Signals

To further test the validity of the algorithm proposed in this paper, we also experimented with real-life speech signals. In this experiment, two speech signals (with 3 samples, sampling rate 8 kHz, obtained from http://www.ece.mcmaster.ca/~reilly/kamran/d8.htm) are post-nonlinearly mixed by:

A = \begin{pmatrix} -.42 & .43 \\ .864 & -.2 \end{pmatrix}    (10)

f_1(u) = \frac{1}{2}(u + u^3), \quad f_2(u) = \frac{1}{6} u^3 + \tanh(u)    (11)

The experimental results are shown in Fig. 4 and Table 1, and they confirm the conclusion drawn from the first experiment.

Fig. 4. The two sets of speech signals. (a) Source signals. (b) Separated signals.

5.3 Training Speed

We also performed tests in which we compared, on the same post-nonlinear BSS problems, networks in which the g blocks had MLP structures. Table 2 shows the means and standard deviations of the number of epochs required to reach the stopping criterion, which was based on the value of the objective function H(z), for MLP-based networks and RBF-based networks.
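The evaluation used in Table 1 can be sketched as an absolute correlation matrix between each original source and each recovered output: a good separation yields a matrix close to a permutation matrix, with the off-permutation entries near zero. The helper name and the toy signals are illustrative, not the paper's data.

```python
import numpy as np

def correlation_table(S, Y):
    """Return |corr(s_i, y_j)| for all source/output pairs."""
    n = S.shape[0]
    C = np.corrcoef(np.vstack([S, Y]))[:n, n:]   # source rows vs output cols
    return np.abs(C)

t = np.linspace(0, 10, 1000)
S = np.vstack([np.sin(t), np.cos(3 * t)])        # two toy "sources"

# Perfectly recovered outputs, up to the method's intrinsic sign and
# scale ambiguities, and with the component order swapped:
Y = np.vstack([-2.0 * np.cos(3 * t), 0.5 * np.sin(t)])

C = correlation_table(S, Y)
print(C)
```

Taking the absolute value makes the measure insensitive to the sign ambiguity, and correlation itself is scale-invariant, so the matrix directly reflects separation quality despite the permutation and scaling indeterminacies of BSS.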

Table 2. Comparison of training speeds between MLP-based and RBF-based networks

              Two supergaussians       Supergaussian and subgaussian
              RBF        MLP           RBF        MLP
  Mean        3          8             369        68
  St. dev     4          2             8          38

From the two tables we can see that the separation results of the two methods are very similar, but the RBF-based implementations trained faster and showed a smaller variation in training times (one epoch took approximately the same time in both kinds of network). This is mainly caused by the local character of RBF networks.

6 Conclusions

In this paper we proposed a novel algorithm for post-nonlinear blind source separation. The new method works by optimizing a network with a specialized architecture, using the output entropy as the objective function, which is equivalent to the mutual information criterion but does not require calculating the marginal entropies of the outputs. Finally, the experimental results showed that this method is competitive with other existing ones.

References

1. Hyvärinen, A., Karhunen, J., Oja, E.: Independent Component Analysis. J. Wiley, New York (2001)
2. Hyvärinen, A., Pajunen, P.: Nonlinear Independent Component Analysis: Existence and Uniqueness Results. Neural Networks 12(3) (1999) 429-439
3. Huang, D.S.: Systematic Theory of Neural Networks for Pattern Recognition. Publishing House of Electronic Industry of China, Beijing (1996)
4. Huang, D.S.: The United Adaptive Learning Algorithm for the Link Weights and the Shape Parameters in RBFN for Pattern Recognition. International Journal of Pattern Recognition and Artificial Intelligence 11(6) (1997) 873-888
5. Taleb, A., Jutten, C.: Source Separation in Post-Nonlinear Mixtures. IEEE Trans. Signal Processing 47 (1999) 2807-2820
6. Martinez, D., Bray, A.: Nonlinear Blind Source Separation Using Kernels. IEEE Trans. Neural Networks 14(1) (2003) 228-235
7. Almeida, L.B.: MISEP - Linear and Nonlinear ICA Based on Mutual Information. Journal of Machine Learning Research 4 (2003) 1297-1318