A Unified Approach on Fast Training of Feedforward and Recurrent Networks Using EM Algorithm


Sheng Ma and Chuanyi Ji

Abstract: In this work, we provide a theoretical framework that unifies the notions of hidden representations and moving targets through the expectation-maximization (EM) algorithm. Based on such a framework, two fast training algorithms can be derived consistently for both feedforward networks and recurrent networks.

I. INTRODUCTION

Although multilayer feedforward and recurrent neural networks have found many important applications in signal processing, training such networks is usually slow due to the nonlinearity and, possibly, the compact structure of these networks. This hinders wide application of these networks in such important signal processing areas as speech and communication, where the networks need to be adapted rapidly to a nonstationary environment [4].

One of the interesting approaches used to reduce the training time was based on hidden representations or moving targets [3], [6], [9]. Targets were obtained adaptively for either hidden or recurrent units and then used to decompose training an entire (multilayer or recurrent) network into training several single-layer feedforward networks. The approach resulted in fast training algorithms for feedforward networks with hard-threshold units [3] but was not effective in training networks with sigmoidal units [6], [9]. In addition, the algorithms were heuristic, which was less desirable.

On the other hand, the expectation-maximization (EM) algorithm [2] in statistics was developed based on so-called hidden (random) variables. Although similar to moving targets, hidden (random) variables were defined through a sound theoretical framework, and the EM algorithms were developed on a theoretical foundation for maximum likelihood estimation (MLE). When applied to training tree-structured networks with localized modules [5], the EM algorithm reduced the training time significantly.

Motivated by the idea of moving targets and the EM algorithms, we have developed in our previous work two fast training algorithms for feedforward [7] and recurrent networks [8]. These algorithms have demonstrated that EM algorithms can also be applied effectively to layered and recurrent networks to speed up training. However, since the two algorithms have been developed in separate contexts, their inter-relationships have not been carefully investigated.

Manuscript received July 23, 1997; revised January 5. This work was supported by the National Science Foundation under Grant ECS and CAREER Grant IRI. The associate editor coordinating the review of this paper and approving it for publication was Prof. Yu-Hen Hu. The authors are with the Department of Electrical, Computer, and Systems Engineering, Rensselaer Polytechnic Institute, Troy, NY, USA.

In this correspondence, we will focus on providing a unified framework, based on the EM algorithm, from which both algorithms can be derived in a similar fashion. We will show that the two algorithms are based on a similar probabilistic formulation of hidden targets. Such a formulation will result in an EM algorithm for feedforward networks that reduces training a nonlinear network to training individual neurons.
By the Markov properties of recurrent networks and the mean-field approximation [11], a recurrent network can be unfolded at each EM step into an equivalent feedforward network. Then, the EM algorithm developed for feedforward networks can be directly applied to improve the training time.

II. UNIFIED NOTATIONS

Consider the discrete-time recurrent networks with a fully connected hidden layer as shown in Fig. 1. The network has one linear output, d inputs, and L fully connected recurrent (hidden) sigmoid units. Let w̃_j^(1) and w̃_j^(3) denote the weight vectors connecting the external inputs and the other recurrent units to the jth hidden unit, respectively, for 1 ≤ j ≤ L. Let W^(1) and W^(3) be the weight matrices containing all w̃_j^(1) and w̃_j^(3), respectively. Let w̃^(2) be the weight vector connecting the recurrent hidden units to the output. {x̃(n), t(n+1)}_{n=1}^{N} is a set of training samples, where x̃(n) is a (d+1) × 1 input vector at the nth time unit,¹ and t(n+1) is the corresponding desired output. Let h_j(n+1) be the output of the jth recurrent hidden unit. Then, h_j(n+1) = g((w̃_j^(1))^T x̃(n) + (w̃_j^(3))^T h̃(n)), where g(·) is a sigmoid function. The output of the network y(n+1) corresponding to an input x̃(n) can be expressed as y(n+1) = (w̃^(2))^T h̃(n+1). The recurrent network reduces to a feedforward network when the recurrent connections do not exist, i.e., W^(3) = 0. Therefore, we can use the above notation for both feedforward and recurrent networks.²

Fig. 1. Recurrent network.

¹The bias corresponds to an input x₀(n) = 1.
²No delays exist for feedforward neurons, i.e., the time n+1 reduces to n.

III. THE EM ALGORITHM

The method we use to develop our algorithms is based on the EM algorithm for MLE [2]. Let D_I be a set of data observed directly, and let Θ be a set of parameters characterizing the corresponding distribution of the random variables. The maximum likelihood problem is to find an optimal set of parameters Θ_opt by maximizing the log likelihood of the data. Since it can be very difficult to maximize the original log-likelihood function directly, the EM algorithm has been introduced to simplify this process. A set of hidden (random) variables is introduced. The data for the hidden variables are called missing data. Together with the missing data, the so-called incomplete data D_I form a complete data set D_c. The EM algorithm converts the original maximum likelihood procedure into a two-step process: the E-step (expectation) and the M-step (maximization). In the E-step, the following conditional expectation (taken with respect to the hidden variables) of the complete-data log likelihood is computed, given the incomplete data D_I and the previous parameters Θ^(p) at the pth step:

    Q(Θ | Θ^(p)) = E{ln P(D_c | Θ) | D_I, Θ^(p)}    (1)

where Θ are the parameters to be obtained in the (next) M-step. The notation Q(Θ | Θ^(p)) indicates that the conditional expectation is a function of Θ and that the expectation is evaluated using Θ^(p) as the parameters in the probability density function. P(·) represents a probability density function, and E stands for the expectation operation. In the M-step, new parameters Θ^(p+1) are found by maximizing the expected log likelihood Q(Θ | Θ^(p)). The algorithm alternates between the E- and M-steps and terminates when a certain convergence criterion is satisfied. Local convergence of the EM algorithm is guaranteed [2]. There is an extended survey of the EM algorithm in [1].

IV. PROBABILISTIC MODELS

A. Choice of Hidden Variables

For feedforward and recurrent networks, we choose the hidden variables z̃ to be either the desired hidden targets at the hidden layer for the feedforward networks or the desired hidden states of the recurrent neurons for the recurrent networks. Then, z̃(n) and z̃(n+1) denote the corresponding missing data for the feedforward and recurrent networks, respectively, corresponding to the input x̃(n) (for 1 ≤ n ≤ N). The complete data set therefore contains the labeled training samples and the desired hidden targets, which is {x̃(n), t(n), z̃(n)}_{n=1}^{N} for feedforward networks and {x̃(n), t(n+1), z̃(n+1)}_{n=1}^{N} for recurrent networks. The incomplete data set is simply the training set {x̃(n), t(n)}_{n=1}^{N}. For simplicity, we use the notation {x̃}, {t}, and {z̃} to represent {x̃(n)}_{n=1}^{N}, {t(n)}_{n=1}^{N} or {t(n+1)}_{n=1}^{N}, and {z̃(n)}_{n=1}^{N} or {z̃(n+1)}_{n=1}^{N}, respectively, and we use Θ to represent all the weights in a network. We assume that the training samples (sequences for recurrent networks) are drawn independently, and so are the desired hidden targets. Using these notations, the conditional expectation of the complete-data log likelihood defined in (1) can then be expressed as

    Q(Θ | Θ^(p)) = ln P({x̃} | Θ) + ∫ P({z̃} | {t}, {x̃}, Θ^(p)) ln P({z̃}, {t} | {x̃}, Θ) d{z̃}.

B. Choice of Probabilistic Models

To evaluate the expected log likelihood Q(Θ | Θ^(p)), we need to obtain P({z̃} | {t}, {x̃}, Θ) and P({z̃}, {t} | {x̃}, Θ). In this work, Gaussian distributions are chosen to facilitate the analysis.

1) A Probabilistic Model for Two-Layer Feedforward Networks: For two-layer feedforward networks, we choose

    P({z̃} | {x̃}, Θ) = B₁ exp(−β₁E₁)    (2)
    P({t} | {z̃}, Θ) = B₂ exp(−β₂E₂)    (3)

where E₁ = Σ_{n=1}^{N} ‖z̃(n) − h̃(n)‖², which measures the error between the desired and the actual outputs of the hidden units, and E₂ = Σ_{n=1}^{N} (t(n) − z̃(n)^T w̃^(2))², which characterizes the error between the desired and the actual outputs of the network when the desired hidden values are used as inputs to the second layer. B₁ and B₂ are normalization constants, and β₁ and β₂ are weighting factors. Then, we can obtain [7]

    P({t}, {z̃} | {x̃}, Θ) = A_tz exp(−β₁E₁ − β₂E₂)    (4)
    P({z̃} | {t}, {x̃}, Θ^(p)) = A_z^{N/2} Π_{n=1}^{N} exp(−(1/2)(z̃(n) − ẑ(n))^T Σ^{-1} (z̃(n) − ẑ(n)))    (5)

where A_tz and A_z are normalization constants, and Σ is an (L × L) covariance matrix whose inverse is Σ^{-1} = 2(β₁I + β₂ w̃^(2)(w̃^(2))^T). The superscript p indicates the corresponding value at the pth step, and I is the (L × L) identity matrix. ẑ(n) is the expectation of the hidden targets z̃(n) given the nth training pair and the previous parameter set Θ^(p). The jth (j ≤ L) element of ẑ(n) can be computed as [7]

    ẑ_j(n) = h_j^(p)(n) + [β₂ w_j^(2) / (β₁ + β₂‖w̃^(2)‖²)] e(n)    (6)

where h_j^(p)(n) represents the jth element of h̃(n), and e(n) is the training error for the nth training sample, satisfying e(n) = t(n) − y^(p)(n).
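To make (6) concrete, the following is a minimal numerical sketch of the E-step for a two-layer feedforward network. It is not the authors' code: the variable names, the logistic choice for g, and the random data at the end are assumptions made only for illustration. Given the current weights, it computes the hidden outputs h̃(n), the training errors e(n), and the expected hidden targets ẑ(n) of (6) for all samples at once.

    import numpy as np

    def sigmoid(a):
        # Logistic sigmoid; the paper only requires g to be a sigmoid function.
        return 1.0 / (1.0 + np.exp(-a))

    def e_step_feedforward(X, t, W1, w2, beta1, beta2):
        """Expected hidden targets of (6) under the Gaussian models (2)-(5).

        X  : (N, d+1) inputs; each row is x(n), including the bias x_0(n) = 1
        t  : (N,) desired outputs t(n)
        W1 : (L, d+1) first-layer weights; row j is w_j^(1)
        w2 : (L,) output weights w^(2)
        """
        H = sigmoid(X @ W1.T)                   # hidden outputs h(n), shape (N, L)
        y = H @ w2                              # network outputs y(n)
        e = t - y                               # training errors e(n)
        gain = beta2 * w2 / (beta1 + beta2 * np.dot(w2, w2))   # factor in (6)
        Z_hat = H + np.outer(e, gain)           # expected hidden targets z_hat(n)
        return H, e, Z_hat

    # Tiny usage example with random data (illustration only).
    rng = np.random.default_rng(0)
    N, d, L = 20, 3, 4
    X = np.hstack([rng.normal(size=(N, d)), np.ones((N, 1))])
    t = rng.normal(size=N)
    W1 = rng.normal(scale=0.5, size=(L, d + 1))
    w2 = rng.normal(scale=0.5, size=L)
    H, e, Z_hat = e_step_feedforward(X, t, W1, w2, beta1=1.0, beta2=1.0)
    print(Z_hat.shape)   # (20, 4)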

2) A Probabilistic Model for Recurrent Networks: For recurrent networks, we note that the output of the recurrent network and the hidden states at the current time step depend only on the hidden states at the previous time step. This results in the Markov property that leads to the conditional probability density functions [8]

    P({z̃} | {t}, {x̃}, Θ) = Π_{n=1}^{N} P(z̃(n+1) | z̃(n), t(n+1), x̃(n), Θ)    (7)
    P({t}, {z̃} | {x̃}, Θ) = Π_{n=1}^{N} P(t(n+1), z̃(n+1) | z̃(n), x̃(n), Θ).    (8)

Gaussian models similar to those used for the feedforward networks are chosen as the conditional distributions

    P(z̃(n+1) | z̃(n), x̃(n), Θ) = B₁ exp(−β₁E₁(n+1))    (9)
    P(t(n+1) | z̃(n+1), Θ) = B₂ exp(−β₂E₂(n+1))    (10)

where B₁, B₂, β₁, and β₂ are the same as for the feedforward networks. Here, E₁(n+1) = ‖z̃(n+1) − h̃′(n+1)‖², where the jth element of h̃′(n+1) for 1 ≤ j ≤ L is h′_j(n+1) = g(x̃(n)^T w̃_j^(1) + z̃(n)^T w̃_j^(3)). Therefore, E₁(n+1) measures the error between the desired hidden states z̃(n+1) and h̃′(n+1). Likewise, E₂(n+1) = (t(n+1) − z̃(n+1)^T w̃^(2))², which characterizes the error between the desired and the actual output of the network when the desired hidden states are used as inputs to the output layer. These Gaussian assumptions lead to

    P(t(n+1), z̃(n+1) | z̃(n), x̃(n), Θ) = A_tz exp(−β₁E₁(n+1) − β₂E₂(n+1))    (11)
    P(z̃(n+1) | t(n+1), z̃(n), x̃(n), Θ) = A_z exp(−(1/2)(z̃(n+1) − ẑ(n+1))^T Σ^{-1} (z̃(n+1) − ẑ(n+1)))    (12)

where A_tz, A_z, Σ, and I are the same as for the feedforward networks. ẑ(n+1) is the conditional expectation of z̃(n+1) conditioned on t(n+1), z̃(n), and x̃(n). The jth element of ẑ(n+1) can be expressed as

    ẑ_j(n+1) = h′_j(n+1) + [β₂ w_j^(2) / (β₁ + β₂‖w̃^(2)‖²)] e(n+1)    (13)

with e(n+1) = t(n+1) − h̃′(n+1)^T w̃^(2). Then, we can obtain the two distributions P({z̃} | {t}, {x̃}, Θ) and P({t}, {z̃} | {x̃}, Θ), similar to those for the feedforward networks.

V. APPLYING THE EM ALGORITHM TO FEEDFORWARD AND RECURRENT NETWORKS

A. The E-Step for Feedforward Networks

For feedforward networks, the expected log likelihood can be evaluated by using the established probability models [7] as

    Q(Θ | Θ^(p)) = E{ln P({t}, {z̃}, {x̃} | Θ) | {t}, {x̃}, Θ^(p)} = E_c − E_p − E_h − E_o    (14)

where E_c is a constant, E_p = Nβ₂(w̃^(2))^T Σ w̃^(2), and E_h and E_o can be expressed as

    E_h = β₁ Σ_{n=1}^{N} Σ_{j=1}^{L} (ẑ_j(n) − g(x̃(n)^T w̃_j^(1)))²    (15)
    E_o = β₂ Σ_{n=1}^{N} ((w̃^(2))^T ẑ(n) − t(n))².    (16)

Therefore, the evaluation of the expectation of the log likelihood in the E-step is reduced to finding the expected hidden targets through (6).
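As a small illustration of how the E-step reduces to the targets of (6), the sketch below (my own, not from [7]; it assumes Z_hat was produced by the e_step_feedforward sketch above and reuses those assumed names) evaluates the data-dependent terms E_h and E_o of (15) and (16). Up to the constant E_c and the penalty E_p, their negative sum is the expected log likelihood that the M-step maximizes.

    import numpy as np

    def expected_loglik_terms(X, t, Z_hat, W1, w2, beta1, beta2):
        """E_h and E_o of (15)-(16) for a two-layer feedforward network.

        Z_hat holds the expected hidden targets from the E-step, e.g. (6).
        E_h splits into L decoupled sums, one per hidden unit, which is what
        lets the M-step train each first-layer neuron separately.
        """
        G = 1.0 / (1.0 + np.exp(-(X @ W1.T)))       # g(x(n)^T w_j^(1)) for all n, j
        E_h = beta1 * np.sum((Z_hat - G) ** 2)
        E_o = beta2 * np.sum((Z_hat @ w2 - t) ** 2)
        return E_h, E_o

One simple convergence check (my suggestion, not prescribed by the correspondence) is to monitor E_h + E_o across EM iterations, in the spirit of the convergence criterion mentioned in Section III.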
B. The E-Step and the Mean-Field Approximation for Recurrent Networks

To evaluate Q(Θ | Θ^(p)) for recurrent networks, we also need to compute the expectation of the hidden targets z̃(n). Since z̃(n) now depends on z̃(n−1), and because of the nonlinearity of the sigmoidal functions, computing the expectation directly is impossible. Therefore, the self-consistent mean-field approximation [11] is used to approximate the expectation.³ This approximation replaces the condition on z̃(n) by its expected value

    P(z̃(n+1) | t(n+1), z̃(n), x̃(n), Θ^(p)) ≈ P(z̃(n+1) | t(n+1), z̄(n), x̃(n), Θ^(p))    (17)

where the jth element z̄_j(n) of the mean-field estimate z̄(n) can be computed recursively through

    h̃_j(n+1) = g(x̃(n)^T w̃_j^(1) + z̄(n)^T w̃_j^(3))    (18)
    z̄_j(n+1) ≈ h̃_j(n+1) + [β₂ w_j^(2) / (β₁ + β₂‖w̃^(2)‖²)] ẽ(n+1)    (19)

for every 1 ≤ j ≤ L and 1 ≤ n ≤ N, where ẽ(n+1) = t(n+1) − h̃(n+1)^T w̃^(2), and h̃(n+1) is an L × 1 vector whose jth element is defined by (18). It is easy to observe that (19) is strikingly similar to (6). If the w̃_j^(3) are equal to zero, so that the recurrent networks reduce to feedforward networks, z̄(n) will equal ẑ(n) for the feedforward networks.

The mean-field approximation essentially unfolds a recurrent network into a two-layer feedforward network, as shown in Fig. 2. The weights at the first layer of the unfolded network consist of W^(1) and W^(3). The total inputs to the two-layer network consist of the original input at time n and the estimated mean value of the previous hidden targets. Then, the rest of the results follow those for two-layer feedforward networks, except that

    E_h = β₁ Σ_{n=1}^{N} Σ_{j=1}^{L} (z̄_j(n+1) − g(x̃(n)^T w̃_j^(1) + z̄(n)^T w̃_j^(3)))²    (20)
    E_o = β₂ Σ_{n=1}^{N} ((w̃^(2))^T z̄(n+1) − t(n+1))².    (21)

³Please see [8] for more references.
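The recursion (18)-(19) is straightforward to state in code. Below is a minimal sketch (again my own illustration rather than the authors' implementation; the names, the logistic g, and the zero initial state are assumptions) that sweeps forward in time and builds the mean-field estimates z̄(n), which the unfolded network of Fig. 2 then uses both as part of its inputs and as its hidden-layer targets.

    import numpy as np

    def sigmoid(a):
        return 1.0 / (1.0 + np.exp(-a))

    def mean_field_targets(X, t, W1, W3, w2, beta1, beta2, z0=None):
        """Mean-field estimates z_bar(n) via (18)-(19).

        X  : (N, d+1) inputs x(n), bias included; t : (N,) desired outputs t(n+1)
        W1 : (L, d+1) input weights; W3 : (L, L) recurrent weights; w2 : (L,)
        Returns Z_bar of shape (N+1, L); row n holds z_bar(n), row 0 the initial state.
        """
        L, N = W1.shape[0], X.shape[0]
        Z_bar = np.zeros((N + 1, L))
        if z0 is not None:
            Z_bar[0] = z0
        gain = beta2 * w2 / (beta1 + beta2 * np.dot(w2, w2))
        for n in range(N):
            h_tilde = sigmoid(W1 @ X[n] + W3 @ Z_bar[n])    # (18)
            e_tilde = t[n] - np.dot(h_tilde, w2)            # error at time n+1
            Z_bar[n + 1] = h_tilde + gain * e_tilde         # (19)
        return Z_bar

When W3 is zero, each Z_bar[n + 1] reduces to the feedforward target of (6) computed from x̃(n) alone, matching the remark after (19).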

Fig. 2. Unfolded two-layer feedforward network.

C. The M-Step

New weights are obtained by maximizing the expected log likelihood Q(Θ | Θ^(p)) in the M-step, which involves solving two separate quadratic minimization problems. For the hidden units,

    (w̃_j^(1)(p+1), w̃_j^(3)(p+1)) = arg min_{w̃_j^(1), w̃_j^(3)} Σ_{n=1}^{N} (z̄_j(n+1) − g(x̃(n)^T w̃_j^(1) + z̄(n)^T w̃_j^(3)))²    (22)

where w̃_j^(1) and w̃_j^(3) stand for the new weight vectors connecting the jth unit to the inputs and to the other recurrent hidden units, for 1 ≤ j ≤ L. Solving this minimization problem is equivalent to training the individual (nonrecurrent) hidden units of the unfolded network in parallel. In addition, we have

    w̃^(2) = arg min_{w̃^(2)} Σ_{n=1}^{N} ((w̃^(2))^T z̄(n+1) − t(n+1))² + E_p    (23)

where w̃^(2) are the new weights connecting the hidden units to the output unit, and the penalty E_p of (14) can be written out as

    E_p = Nβ₂‖w̃^(2)‖² / (2(β₁ + β₂‖w̃^(2)(p)‖²)) + Nβ₂²(‖w̃^(2)(p)‖²‖w̃^(2)‖² − ((w̃^(2)(p))^T w̃^(2))²) / (2β₁(β₁ + β₂‖w̃^(2)(p)‖²)).

This step is equivalent to training the output neuron in the unfolded two-layer feedforward network. When the w̃_j^(3) are set to zero, we have the M-step for feedforward networks. Therefore, the major computational cost of the M-step for both the feedforward and the recurrent networks is in training individual neurons. This can be done rapidly through linear weighted regressions [7].

Fig. 3. Flowchart of the algorithm.

D. Summary of the Algorithm

In this work, we have established a unified framework for deriving fast training algorithms for both feedforward and recurrent networks. The key steps can be summarized as follows. Probabilistic models are chosen for the hidden states of both two-layer feedforward networks and recurrent networks. Based on these models, the idea of moving targets has been formulated through the EM algorithm.

For the feedforward networks, the expectations of the hidden targets are computed at each E-step. In the M-step, they then serve as the targets at the hidden layer to train the first-layer weights. They are also used as the inputs to the output unit to train the second-layer weights. Therefore, training a two-layer feedforward network is decomposed into training individual neurons.

The mean-field approximation effectively unfolds a recurrent network into an equivalent feedforward network at each EM step. Such a feedforward network uses the expected hidden states at the previous time step as a part of its inputs and the expected hidden states at the current time step as the targets at the hidden layer. Then, the maximization method developed for the feedforward network can be applied to achieve fast training for each equivalent feedforward network.

The major steps of the unified EM algorithms are summarized in the flowchart given in Fig. 3. The training time has been reduced 5 to 15 times by both algorithms on various benchmark problems [7], [8]. Further investigation is needed on how widely our simple models for the desired hidden states are applicable to real problems.

ACKNOWLEDGMENT

The authors would like to thank the anonymous reviewers for their helpful comments.
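To make the flowchart of Fig. 3 concrete, the following sketch strings the pieces together for the recurrent case. It is an illustration under my own assumptions rather than a reproduction of [7], [8]: it reuses mean_field_targets from the sketch in Section V-B, it linearizes the per-unit fit of (22) by passing the targets through the inverse sigmoid, and it replaces the exact penalty E_p of (23) and the weighted regressions of [7] with plain ridge-regularized least squares.

    import numpy as np

    def inv_sigmoid(z, eps=1e-6):
        # Map targets back through g so that (22) becomes a linear fit (an assumed
        # simplification, not the exact weighted regression of [7]).
        z = np.clip(z, eps, 1.0 - eps)
        return np.log(z / (1.0 - z))

    def m_step(X, t, Z_bar, lam=1e-3):
        """Approximate M-step for the unfolded network of Fig. 2."""
        U = np.hstack([X, Z_bar[:-1]])             # inputs to the unfolded first layer
        T1 = inv_sigmoid(Z_bar[1:])                # per-unit targets for (22)
        A = U.T @ U + lam * np.eye(U.shape[1])
        W13 = np.linalg.solve(A, U.T @ T1).T       # row j holds [w_j^(1), w_j^(3)]
        H = Z_bar[1:]                              # expected states feed the output unit
        B = H.T @ H + lam * np.eye(H.shape[1])
        w2 = np.linalg.solve(B, H.T @ t)           # (23), with a simplified penalty
        d1 = X.shape[1]
        return W13[:, :d1], W13[:, d1:], w2

    def train_em(X, t, L, n_iter=20, beta1=1.0, beta2=1.0, seed=0):
        """Alternate the mean-field E-step and the M-step, as in Fig. 3."""
        rng = np.random.default_rng(seed)
        W1 = rng.normal(scale=0.3, size=(L, X.shape[1]))
        W3 = rng.normal(scale=0.3, size=(L, L))
        w2 = rng.normal(scale=0.3, size=L)
        for _ in range(n_iter):   # fixed iteration count stands in for a convergence test
            Z = mean_field_targets(X, t, W1, W3, w2, beta1, beta2)   # E-step
            W1, W3, w2 = m_step(X, t, Z)                             # M-step
        return W1, W3, w2

A feedforward version is obtained by dropping the Z_bar[:-1] columns from U and the W3 output, mirroring the W^(3) = 0 case described in the text.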

REFERENCES

[1] S. Amari, "Information geometry of EM algorithms for neural networks," Neural Networks, vol. 8, no. 9, 1995.
[2] A. P. Dempster, N. M. Laird, and D. B. Rubin, "Maximum likelihood from incomplete data via the EM algorithm," J. R. Statist. Soc., vol. B39, pp. 1–38, 1977.
[3] T. Grossman, "Learning, capacity and implementation of neural network models," Ph.D. dissertation, Weizmann Inst. Sci., Rehovot, Israel.
[4] S. Haykin and L. Li, "Nonlinear adaptive prediction of nonstationary signals," IEEE Trans. Signal Processing, vol. 43, Feb. 1995.
[5] M. Jordan and R. A. Jacobs, "Hierarchical mixtures of experts and the EM algorithm," Neural Comput., vol. 6, 1994.
[6] A. Krogh, G. I. Thorbergsson, and J. A. Hertz, "A cost function for internal representations," in Advances in Neural Information Processing Systems, D. Touretzky, Ed. San Mateo, CA: Morgan Kaufmann, 1990.
[7] S. Ma, C. Ji, and J. Farmer, "An efficient EM-based training algorithm for feedforward neural networks," Neural Networks, vol. 10, no. 2, 1997.
[8] S. Ma and C. Ji, "Fast training of recurrent networks based on the EM algorithm," submitted for publication.
[9] R. Rohwer, "The moving targets training algorithm," Adv. Neural Inform. Process. Syst., vol. 2, 1990.
[10] D. Saad and E. Marom, "Learning by choice of internal representations: An energy minimization approach," Complex Syst., vol. 4, 1990.
[11] J. Zhang, "The mean-field theory in EM procedures for Markov random fields," IEEE Trans. Signal Processing, vol. 40, Oct. 1992.
