Sparse Kernel Ridge Regression Using Backward Deletion


Ling Wang, Liefeng Bo, and Licheng Jiao
Institute of Intelligent Information Processing, Xidian University, Xi'an 710071, China
{wp, bf018}@163.com

Abstract. Based on the feature map principle, the Sparse Kernel Ridge Regression (SKRR) model is proposed. SKRR obtains sparseness by a backward deletion feature selection procedure that recursively removes the feature with the smallest leave-one-out score until the stop criterion is satisfied. Besides good generalization performance, the most compelling property of SKRR is that it is rather sparse; moreover, the kernel function need not be positive definite. Experiments on synthetic and benchmark data sets validate the feasibility and validity of SKRR.

1 Introduction

The regression problem is one of the fundamental problems in the field of supervised learning. It can be thought of as estimating a real-valued function from a sample set of noisy observations. A very successful approach to regression is Support Vector Regression (SVR) [1-2], which attempts to simultaneously minimize the empirical risk and the confidence interval, leading to good generalization. Thanks to the ε-insensitive loss function, SVR obtains a sparse model (prediction for a new input only needs a subset of the training samples). Though very successful, SVR also has some disadvantages [3]: the solution is usually not very sparse, and the prediction speed for a new input is significantly slower than that of some other learning machines such as Neural Networks [4-5]. Furthermore, the kernel function must satisfy Mercer's condition. It is well known that different kernel functions induce different algorithms that achieve different performance; however, kernel functions in SVR must satisfy Mercer's positive-definite condition, which limits the usable kernels.

In order to deal with the problems mentioned above, the Relevance Vector Machine (RVM) [3] was proposed, which is very elegant and obtains a highly sparse solution. However, RVM needs to solve systems of linear equations, whose cost is very expensive, and it is therefore not applicable to large-scale problems.

In this paper, we propose a new learning model, Sparse Kernel Ridge Regression (SKRR), to overcome the above problems. In SKRR, samples are mapped into a feature space whose dimension is equal to the sample size, and then Ridge Regression (RR) [6] is implemented in the feature space. When the training process is completed, a backward deletion feature selection procedure is applied to obtain a sparse model.

Q. Yang and G. Webb (Eds.): PRICAI 2006, LNAI 4099, pp. 365-374, 2006. Springer-Verlag Berlin Heidelberg 2006

Besides good generalization performance, the most compelling property of SKRR is its sparseness, which is comparable with that of RVM. Another advantage of SKRR is that the kernel function need not be positive definite. Experiments on synthetic and benchmark data sets assess the feasibility and validity of SKRR.

2 Kernel Ridge Regression

Let $\{(\mathbf{x}_1, y_1), \ldots, (\mathbf{x}_l, y_l)\}$ be an empirical sample set drawn from

$$y_i = f(\mathbf{x}_i) + \varepsilon_i, \quad i = 1, \ldots, l, \tag{1}$$

where $y_i$ is corrupted by additive noise $\varepsilon_i$ whose distribution is usually unknown. Learning aims to infer the function $f(\mathbf{x})$ from the finite data set $\{(\mathbf{x}_1, y_1), \ldots, (\mathbf{x}_l, y_l)\}$. A classical method for this problem is Ridge Regression (RR), which is an extension of linear regression obtained by adding a quadratic penalty term:

$$\hat{\mathbf{w}} = \arg\min_{\mathbf{w}} \left( \sum_{i=1}^{l} \left( \mathbf{w}^{T}\mathbf{x}_i + w_{l+1} - y_i \right)^{2} + \lambda \|\mathbf{w}\|^{2} \right), \tag{2}$$

where $\lambda$ is a fixed positive constant called the regularization coefficient. Equation (2) can be rewritten as

$$\hat{\mathbf{w}} = \arg\min_{\mathbf{w}} \left( \mathbf{w}^{T}\left(\mathbf{X}^{T}\mathbf{X} + \lambda\mathbf{I}\right)\mathbf{w} - 2\,\mathbf{w}^{T}\mathbf{X}^{T}\mathbf{y} \right), \tag{3}$$

where $\mathbf{X}$ stacks the augmented inputs $[\mathbf{x}_i^{T}, 1]$ row-wise and $\mathbf{w} = [w_1, \ldots, w_{l+1}]^{T}$.

Ridge regression [6] is a well-known approach for the solution of regression problems: it has generalization performance as good as SVR, the model does not need the kernel function to satisfy Mercer's condition, and there is an efficient leave-one-out cross-validation model selection method for it. In order to make RR applicable to nonlinear problems, we generalize it by a feature map. Define a vector made up of a set of real-valued functions $\{k(\mathbf{x}, \mathbf{x}_i),\ i = 1, \ldots, l\}$:

$$\mathbf{z} = \left[ k(\mathbf{x}, \mathbf{x}_1), \ldots, k(\mathbf{x}, \mathbf{x}_l) \right]^{T}, \tag{4}$$

where $k(\mathbf{x}, \mathbf{x}_i)$ is a kernel function that need not be positive definite. We call

$$F = \left\{ \mathbf{z} \mid \mathbf{z} = \left[ k(\mathbf{x}, \mathbf{x}_1), \ldots, k(\mathbf{x}, \mathbf{x}_l) \right]^{T},\ \mathbf{x} \in \mathbb{R}^{n} \right\} \tag{5}$$

the feature space. In particular, $\{(\mathbf{z}_1, y_1), \ldots, (\mathbf{z}_l, y_l)\}$ with $\mathbf{z}_i = \left[ k(\mathbf{x}_i, \mathbf{x}_1), \ldots, k(\mathbf{x}_i, \mathbf{x}_l) \right]^{T}$ are the empirical samples in the feature space. Substituting $\mathbf{z}$ for $\mathbf{x}$, we have Kernel Ridge Regression (KRR) [7] in the feature space $F$:

$$\hat{\mathbf{w}} = \arg\min_{\mathbf{w}} \left( \mathbf{w}^{T}\left(\mathbf{Z}^{T}\mathbf{Z} + \lambda\mathbf{I}\right)\mathbf{w} - 2\,\mathbf{w}^{T}\mathbf{Z}^{T}\mathbf{y} \right), \tag{6}$$

where $\mathbf{Z}$ stacks the augmented feature vectors $[\mathbf{z}_i^{T}, 1]$ row-wise and $\mathbf{w} = [w_1, \ldots, w_{l+1}]^{T}$. The corresponding decision function is

$$f(\mathbf{x}) = \sum_{i=1}^{l} w_i\, k(\mathbf{x}, \mathbf{x}_i) + w_{l+1}. \tag{7}$$

From equations (6) and (7) we obtain

$$\hat{\mathbf{w}} = \left(\mathbf{Z}^{T}\mathbf{Z} + \lambda\mathbf{I}\right)^{-1}\mathbf{Z}^{T}\mathbf{y}. \tag{8}$$

3 Sparse Kernel Ridge Regression

For a suitable $\lambda$, KRR has good generalization performance. Its key disadvantage is the lack of sparseness, which seems to prohibit its application in some fields. In this paper, we consider how to delete redundant features while simultaneously keeping good generalization performance at an acceptable computational cost.

3.1 Feature Selection by Backward Deletion

In order to obtain sparseness, a backward deletion procedure is implemented after the training process. The backward deletion procedure recursively removes the feature with the smallest leave-one-out score until the stop criterion is satisfied. Let $\mathbf{H} = \mathbf{Z}^{T}\mathbf{Z} + \lambda\mathbf{I}$ and $\mathbf{b} = \mathbf{Z}^{T}\mathbf{y}$; then equation (6) can be rewritten as

$$\hat{\mathbf{w}} = \arg\min_{\mathbf{w}} \left( L(\mathbf{w}) \right) = \arg\min_{\mathbf{w}} \left( \mathbf{w}^{T}\mathbf{H}\mathbf{w} - 2\,\mathbf{w}^{T}\mathbf{b} \right). \tag{9}$$

When the $i$-th feature is deleted at the $t$-th iteration, let $\mathbf{H}^{t,-i}$ represent the sub-matrix formed by omitting the $i$-th row and column of $\mathbf{H}^{t}$. Let $\mathbf{R}^{t}$ represent the inverse of $\mathbf{H}^{t}$, $\mathbf{w}^{t}$ the weights, and $f^{t}$ the optimal value of $L$. According to equation (8), we have

$$f^{t} = -\,\mathbf{b}_{P_t}^{T}\mathbf{R}^{t}\mathbf{b}_{P_t}, \tag{10}$$

where $P_t$ is the set of remaining features (variables) at the $t$-th iteration. In terms of a rank-one update [8-9], $\mathbf{R}^{t,-i}$ and $\mathbf{w}^{t,-i}$ can be formulated as in equations (11) and (12) (see the Appendix for details), where $\mathbf{R}^{t,-i}$ represents the inverse of $\mathbf{H}^{t,-i}$:

$$\mathbf{R}^{t,-i}_{jk} = \mathbf{R}^{t}_{jk} - \frac{\mathbf{R}^{t}_{ji}\,\mathbf{R}^{t}_{ik}}{\mathbf{R}^{t}_{ii}}, \quad j, k \in P_t \setminus \{i\}. \tag{11}$$

$$\left(\mathbf{w}^{t,-i}\right)_{j} = \left(\mathbf{R}^{t,-i}\,\mathbf{b}_{P_t \setminus \{i\}}\right)_{j}, \quad j \in P_t \setminus \{i\}. \tag{12}$$

Together with $\mathbf{w}^{t} = \mathbf{R}^{t}\mathbf{b}_{P_t}$, equation (12) simplifies to

$$\left(\mathbf{w}^{t,-i}\right)_{j} = w^{t}_{j} - \frac{\mathbf{R}^{t}_{ji}}{\mathbf{R}^{t}_{ii}}\, w^{t}_{i}, \quad j \in P_t \setminus \{i\}. \tag{13}$$

Substituting equation (11) into (10), we obtain

$$f^{t,-i} = -\,\mathbf{b}_{P_t\setminus\{i\}}^{T}\,\mathbf{R}^{t,-i}\,\mathbf{b}_{P_t\setminus\{i\}} = f^{t} + \frac{\left(\mathbf{b}_{P_t}^{T}\mathbf{R}^{t}_{\cdot i}\right)^{2}}{\mathbf{R}^{t}_{ii}}. \tag{14}$$

By virtue of $\mathbf{b}_{P_t}^{T}\mathbf{R}^{t}_{\cdot i} = w^{t}_{i}$, equation (14) translates into

$$\Delta f_{i} = f^{t,-i} - f^{t} = \frac{\left(w^{t}_{i}\right)^{2}}{\mathbf{R}^{t}_{ii}}. \tag{15}$$

We call $\Delta f_{i}$ the leave-one-out score. At each iteration, we remove the feature with the smallest leave-one-out score; the feature to be deleted is therefore obtained by

$$s = \arg\min_{i \in P_t \setminus \{l+1\}} \Delta f_{i}. \tag{16}$$

Fig. 1. Distribution of the parameters $\mathbf{Z}$, $\mathbf{w}$ and $\mathbf{H}$ after the $s$-th feature was deleted

Figure 1 shows the distribution of the parameters after the $s$-th feature was deleted. Note that the $(l+1)$-th variable is the bias, which is kept throughout the feature selection process.
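As a concrete illustration of the quantities above, the following sketch (in NumPy) forms the kernel feature matrix of equation (4), solves (8), and evaluates the leave-one-out scores (15) that drive the selection rule (16). The helper names are ours and the Gaussian kernel is the one used later in Section 4; none of this code is from the paper.

```python
# Illustrative NumPy sketch: feature map (4), KRR solution (8), and deletion scores (15)/(16).
import numpy as np

def gaussian_gram(X, sigma):
    """k(x_i, x_j) = exp(-||x_i - x_j||^2 / sigma^2), the kernel used in Section 4."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / sigma**2)

def krr_train(X, y, sigma, lam):
    """Equation (8): w = (Z^T Z + lam*I)^{-1} Z^T y, with a constant column for the bias."""
    l = X.shape[0]
    Z = np.hstack([gaussian_gram(X, sigma), np.ones((l, 1))])   # z_i of equation (4) plus bias
    H = Z.T @ Z + lam * np.eye(l + 1)
    b = Z.T @ y
    R = np.linalg.inv(H)      # R = H^{-1}
    w = R @ b                 # optimal weights
    return Z, H, b, R, w

def loo_scores(R, w):
    """Delta f_i = w_i^2 / R_ii, equation (15); the smallest score is deleted first, (16)."""
    return w**2 / np.diag(R)
```

Deleting the selected feature and downdating R and w with equations (11) and (13) then costs only O(m^2) per deletion (m being the number of remaining features), instead of the O(m^3) needed to re-invert H from scratch.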

At the $t$-th iteration, the total increase of the loss function $L$ is

$$\Delta f^{t} = -\,\mathbf{b}_{P_t}^{T}\mathbf{w}^{t} - L_{opt}, \tag{17}$$

where $L_{opt}$ is the minimum of equation (9). We terminate the algorithm if

$$\Delta f^{t} \geq \varepsilon \left| L_{opt} \right|, \tag{18}$$

where $\varepsilon$ is a small positive number. According to the derivation above, Backward Deletion Feature Selection (BDFS) can be described as the following Algorithm 1:

Algorithm 1. BDFS
1. Set $P_1 = \{1, \ldots, l+1\}$, $\mathbf{R}^{1} = \mathbf{H}^{-1}$, $\mathbf{w}^{1} = \mathbf{R}^{1}\mathbf{b}$;
2. For $t = 1$ to $l$, do:
3.   (a) $s = \arg\min_{i \in P_t \setminus \{l+1\}} \frac{(w^{t}_{i})^{2}}{\mathbf{R}^{t}_{ii}}$;
4.   (b) $\mathbf{R}^{t,-s}_{ij} = \mathbf{R}^{t}_{ij} - \frac{\mathbf{R}^{t}_{is}\,\mathbf{R}^{t}_{sj}}{\mathbf{R}^{t}_{ss}}$, $i, j \in P_t \setminus \{s\}$;
5.   (c) $\left(\mathbf{w}^{t,-s}\right)_{i} = w^{t}_{i} - \frac{\mathbf{R}^{t}_{is}}{\mathbf{R}^{t}_{ss}}\, w^{t}_{s}$, $i \in P_t \setminus \{s\}$;
6.   (d) $P_{t+1} = P_t \setminus \{s\}$, $\mathbf{R}^{t+1} = \mathbf{R}^{t,-s}$, $\mathbf{w}^{t+1} = \mathbf{w}^{t,-s}$;
7.   (e) IF $\left(\mathbf{b}^{T}\mathbf{w}^{1} - \mathbf{b}_{P_{t+1}}^{T}\mathbf{w}^{t+1}\right) \geq \varepsilon\, \mathbf{b}^{T}\mathbf{w}^{1}$, Stop.
8. End For
9. End Algorithm

3.2 Model Selection

There are free parameters in SKRR, namely the kernel parameter and the regularization parameter. In order to obtain good generalization, it is necessary to choose suitable values for them. Cross-validation is a model selection method often used to estimate the generalization performance of statistical classifiers; 10-fold cross-validation is often used in kernel-based learning algorithms such as SVMs. Leave-one-out cross-validation is the most extreme form of cross-validation. The leave-one-out cross-validation error is an attractive model selection criterion since it provides an almost unbiased estimator of generalization performance [11]. However, this method is rarely adopted for kernel machines because it is computationally expensive. Fortunately, for SKRR there is an efficient implementation of leave-one-out cross-validation that only incurs a computational cost of $O(l^{3})$ [10]. Let $E(\Gamma)$ be the leave-one-out cross-validation error, where $\Gamma$ denotes the free parameters.
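Before moving on to model selection, here is a minimal sketch of the complete loop of Algorithm 1. It assumes that H = Z^T Z + λI and b = Z^T y have been formed as in Section 2 and uses the stop rule (18) as stated above; function and variable names are illustrative only, not the authors' code.

```python
# Minimal sketch of the BDFS loop (Algorithm 1).  The last feature (index l) is the bias
# and is never deleted.
import numpy as np

def bdfs(H, b, eps=0.01):
    m = H.shape[0]                        # m = l + 1 (kernel features plus bias)
    P = np.arange(m)                      # original indices of the remaining features
    R = np.linalg.inv(H)                  # step 1: R^1 = H^{-1}
    w = R @ b                             # w^1 = R^1 b
    loss0 = b @ w                         # b^T w^1, reference value for the stop rule (18)
    for _ in range(m - 1):
        scores = w**2 / np.diag(R)        # leave-one-out scores, equation (15)
        scores[P == m - 1] = np.inf       # never delete the bias
        s = int(np.argmin(scores))        # step (a), equation (16)
        Rs = R[:, s].copy()
        w = w - Rs * (w[s] / Rs[s])       # step (c), equation (13)
        R = R - np.outer(Rs, Rs) / Rs[s]  # step (b), equation (11)
        keep = np.arange(len(P)) != s
        P, R, w = P[keep], R[np.ix_(keep, keep)], w[keep]   # step (d)
        if loss0 - b[P] @ w >= eps * loss0:                  # step (e), stop rule (18)
            break
    return P, w                           # surviving feature indices and their weights
```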

Lemma 1 [11-12]. $E(\Gamma) = \left\| \mathbf{B}(\Gamma)\left(\mathbf{I} - \mathbf{A}(\Gamma)\right)\mathbf{y} \right\|^{2}$, where $\mathbf{B}(\Gamma)$ is a diagonal matrix with $\alpha_{i}(\Gamma) = \frac{1}{1 - A_{ii}(\Gamma)}$ being the $i$-th entry and

$$\mathbf{A}(\Gamma) = \mathbf{Z}\left(\mathbf{Z}^{T}\mathbf{Z} + \lambda\mathbf{I}\right)^{-1}\mathbf{Z}^{T}.$$

Let the singular value decomposition of $\mathbf{Z}$ be

$$\mathbf{Z} = \mathbf{U}\mathbf{D}\mathbf{V}^{T}, \tag{19}$$

where $\mathbf{U}$ and $\mathbf{V}$ are orthogonal matrices and $\mathbf{D}$ is a diagonal matrix. Substituting equation (19) into $\mathbf{A}(\Gamma)$, it can be simplified to

$$\mathbf{A}(\Gamma) = \mathbf{U}\mathbf{D}\left(\mathbf{D}^{T}\mathbf{D} + \lambda\mathbf{I}\right)^{-1}\mathbf{D}^{T}\mathbf{U}^{T}, \tag{20}$$

where $\mathbf{D}^{T}\mathbf{D} + \lambda\mathbf{I}$ is a diagonal matrix. Hence, for different $\lambda$ we only need to perform the matrix decomposition once. For clarity, we give the detailed steps of the leave-one-out model selection algorithm, in which $\sigma$ represents the kernel parameter:

Algorithm 2. LOO Model Selection
For $\sigma_{i}$, $i = 1, \ldots, q$
  $\mathbf{Z} = \mathbf{U}\mathbf{D}\mathbf{V}^{T}$;
  For $\lambda_{j}$, $j = 1, \ldots, p$
    $\mathbf{A} = \mathbf{U}\mathbf{D}\left(\mathbf{D}^{T}\mathbf{D} + \lambda_{j}\mathbf{I}\right)^{-1}\mathbf{D}^{T}\mathbf{U}^{T}$;
    $\mathbf{B} = \mathrm{diag}\left(\alpha_{k}\right)$, $\alpha_{k} = \frac{1}{1 - A_{kk}}$;
    $r(k) = B_{kk}\left(y_{k} - (\mathbf{A}\mathbf{y})_{k}\right)$, $k = 1, \ldots, l$;
    $E_{ij} = \sum_{k=1}^{l} r(k)^{2}$;
  End For
End For
$(\lambda_{opt}, \sigma_{opt}) = \arg\min_{i,j}\left(E_{ij}\right)$.

Thus, according to the analysis in Sections 3.1 and 3.2, SKRR can be described as the following Algorithm 3:

Algorithm 3. SKRR
1. Let $\lambda$ be the noise variance, and choose the kernel parameters using the LOO model selection algorithm;
2. Train KRR using the selected parameters;
3. Implement BDFS for KRR;
4. Re-estimate $\lambda$ using the LOO model selection algorithm;
5. Re-train KRR on the simplified features.
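A minimal sketch of Algorithm 2 follows. It relies on the leave-one-out identity of Lemma 1 and the Gaussian kernel of Section 4; the economy-size SVD, the parameter grids, and all helper names are our choices for illustration, not the authors' code.

```python
# Sketch of the LOO model-selection loop (Lemma 1, equations (19)-(20), Algorithm 2).
import numpy as np

def gaussian_gram(X, sigma):
    sq = np.sum(X**2, axis=1)
    return np.exp(-(sq[:, None] + sq[None, :] - 2.0 * X @ X.T) / sigma**2)

def loo_error(U, d, y, lam):
    """E = sum_k [ (y_k - (A y)_k) / (1 - A_kk) ]^2, with A as in equation (20)."""
    g = d**2 / (d**2 + lam)              # diagonal of D (D^T D + lam*I)^{-1} D^T
    Ay = U @ (g * (U.T @ y))             # A y
    Akk = (U**2) @ g                     # diagonal entries of A
    r = (y - Ay) / (1.0 - Akk)           # leave-one-out residuals of Lemma 1
    return np.sum(r**2)

def loo_model_selection(X, y, sigmas, lambdas):
    best = (np.inf, None, None)
    l = X.shape[0]
    for sigma in sigmas:
        Z = np.hstack([gaussian_gram(X, sigma), np.ones((l, 1))])
        U, d, _ = np.linalg.svd(Z, full_matrices=False)      # equation (19), once per sigma
        for lam in lambdas:
            E = loo_error(U, d, y, lam)
            if E < best[0]:
                best = (E, sigma, lam)
    return best                          # (E_opt, sigma_opt, lambda_opt)
```

Note that the SVD is computed once per σ and reused for every candidate λ, which is exactly the saving that equation (20) provides.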

4 Simulation

In order to evaluate the performance of the proposed algorithm, we ran SKRR on three data sets, Sinc, Boston Housing and Abalone, and compared its performance with that of KRR, SVR, and RVM. For the sake of comparison, the different algorithms use the same input sequence. The elements of the Gram matrix are constructed using a Gaussian kernel of the form $k(\mathbf{x}, \mathbf{y}) = \exp\left(-\frac{\|\mathbf{x} - \mathbf{y}\|^{2}}{\sigma^{2}}\right)$. For SVR and RVM, we utilize a 10-fold cross-validation procedure to choose the free parameters. For all data sets, the parameter $\varepsilon$ in BDFS is set to 0.01; a large number of experiments shows that this is a good choice.

4.1 Toy Experiment

The Sinc function $y = \sin(x)/x + \sigma N(0,1)$ is a popular choice to illustrate support vector machine regression. We generate training samples from the Sinc function at 100 equally spaced $x$-values in $[-10, 10]$ with added Gaussian noise of standard deviation 0.1. Results are averaged over 100 random instantiations of the noise, with the error being measured over 1000 noise-free test samples in $[-10, 10]$. The decision functions and support vectors obtained by SKRR and SVR with $\varepsilon = 0.1$ are shown in Figure 2. The numbers of support vectors and the mean square errors are summarized in Table 1. For the Sinc data set, SKRR outperforms the other three algorithms. SKRR and RVM obtain similar numbers of support vectors, significantly fewer than SVR.

Table 1. Generalization error obtained by the four algorithms on the Sinc data set. MSE denotes mean square error and NSV denotes the number of support vectors.

        SKRR            KRR             SVR             RVM
MSE     0.85 ±          ±               ±               ± 0.47
NSV     7.0 ±           ±               ±               ± 0.38

Fig. 2. Support vectors and decision functions obtained by SKRR and SVR. Solid red lines denote the decision function and dotted lines denote the Sinc function; circles denote the support vectors.
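The toy setup just described can be reproduced in a few lines; the random seed and variable names below are our own choices for illustration, not from the paper.

```python
# Sketch of the Sinc toy setup of Section 4.1: 100 equally spaced training inputs in
# [-10, 10] with Gaussian noise of standard deviation 0.1, and 1000 noise-free test points.
import numpy as np

rng = np.random.default_rng(0)
x_train = np.linspace(-10.0, 10.0, 100).reshape(-1, 1)
y_train = np.sinc(x_train / np.pi).ravel() + 0.1 * rng.standard_normal(100)  # sin(x)/x + noise
x_test = np.linspace(-10.0, 10.0, 1000).reshape(-1, 1)
y_test = np.sinc(x_test / np.pi).ravel()                                     # noise-free targets

# mse = np.mean((y_pred - y_test) ** 2)   # MSE as reported in Table 1, given model predictions y_pred
```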

4.2 Benchmark Comparison

The Boston Housing and Abalone data sets, which come from the STATLOG COLLECTION [13], are popular choices for evaluating the performance of algorithms. The Boston Housing data set includes 506 examples with 13 attributes each, and the Abalone data set includes 4177 examples with 7 attributes each. For the Boston Housing data set, we average our results over 100 random splits of the full data set into 481 training samples and 25 testing samples. For the Abalone data set, we average our results over 10 random splits of the mother data set into 1000 training samples and 3177 testing samples. Before the experiments, we scale all training samples into [-1, 1] and then adjust the testing samples using the same linear transformation. The results are summarized in Tables 2 and 3.

Table 2. Generalization error obtained by the four algorithms on the Boston Housing data set. MSE denotes mean square error and NSV denotes the number of support vectors.

        SKRR            KRR             SVR             RVM
MSE     0.05 ±          ±               ±               ± 6.8
NSV     5.3 ±           ±               ±               ± .49

From the experimental results (Tables 1, 2 and 3), we observe the following: SKRR, KRR and RVM obtain similar performance and are slightly superior to SVR. Both SKRR and RVM are rather sparse; the number of support (relevance) vectors of SKRR and RVM is much smaller than that of SVR.

Table 3. Generalization error obtained by the four algorithms on the Abalone data set. MSE denotes mean square error and NSV denotes the number of support vectors.

        SKRR            KRR             SVR             RVM
MSE     4.64 ±          ±               ±               ± 0.
NSV     8.90 ±          ±               ±               ± .45

5 Conclusion

Backward deletion feature selection provides a state-of-the-art tool that can delete redundant features while keeping good generalization performance at an acceptable computational cost. Based on BDFS, we propose SKRR, which obtains high sparseness. Further applications of BDFS include condensing two-stage RBF Networks; application to linear regression is in progress.

Acknowledgments

This work was supported by the National Natural Science Foundation of China and by the National Grand Fundamental Research 973 Program of China.

References

1. Vapnik, V.: The Nature of Statistical Learning Theory. Springer-Verlag, New York (1995)
2. Vapnik, V.: Statistical Learning Theory. Wiley-Interscience, New York (1998)
3. Tipping, M.E.: The Relevance Vector Machine. In: Solla, S., Leen, T., Müller, K.-R. (eds.): Advances in Neural Information Processing Systems. MIT Press, Cambridge, Mass. (2000)
4. Neal, R.: Bayesian Learning for Neural Networks. Springer-Verlag, New York (1996)
5. Ripley, B.D.: Pattern Recognition and Neural Networks. Cambridge University Press, Cambridge, U.K. (1996)
6. Hoerl, A. and Kennard, R.: Ridge Regression: Biased Estimation for Nonorthogonal Problems. Technometrics 12 (1970)
7. Saunders, C. and Gammerman, A.: Ridge Regression Learning Algorithm in Dual Variables. In: Shavlik, J. (ed.): Machine Learning: Proceedings of the 15th International Conference. Morgan Kaufmann (1998)
8. Stoer, J. and Bulirsch, R.: Introduction to Numerical Analysis. Second Edition. Springer-Verlag, New York (1993)
9. Bo, L.F., Wang, L., and Jiao, L.C.: Sparse Gaussian Processes Using Backward Elimination. Lecture Notes in Computer Science 3971 (2006)
10. Bo, L.F., Wang, L., and Jiao, L.C.: Feature Scaling for Kernel Fisher Discriminant Analysis Using Leave-One-Out Cross Validation. Neural Computation 18(4) (2006)
11. Allen, D.M.: The Relationship between Variable Selection and Prediction. Technometrics 16 (1974) 125-127
12. Cawley, G.C.: Efficient Leave-One-Out Cross-Validation of Kernel Fisher Discriminant Classifiers. Pattern Recognition 36 (2003)
13. Michie, D., Spiegelhalter, D.J., and Taylor, C.C.: Machine Learning, Neural and Statistical Classification. Prentice Hall, Englewood Cliffs, N.J. (1994). Data available at anonymous ftp: ftp.ncc.up.pt/pub/statlog/

Appendix

The matrix inversion formula for a symmetric matrix with block sub-matrices is given in the following lemma.

Lemma 2 [8]. Given invertible matrices $\mathbf{A}$ and $\mathbf{C}$, and a matrix $\mathbf{B}$, the following equality holds:

$$\begin{bmatrix} \mathbf{A} & \mathbf{B} \\ \mathbf{B}^{T} & \mathbf{C} \end{bmatrix}^{-1} = \begin{bmatrix} \mathbf{A}^{-1} + \mathbf{A}^{-1}\mathbf{B}\mathbf{E}^{-1}\mathbf{B}^{T}\mathbf{A}^{-1} & -\mathbf{A}^{-1}\mathbf{B}\mathbf{E}^{-1} \\ -\mathbf{E}^{-1}\mathbf{B}^{T}\mathbf{A}^{-1} & \mathbf{C}^{-1} + \mathbf{C}^{-1}\mathbf{B}^{T}\mathbf{D}^{-1}\mathbf{B}\mathbf{C}^{-1} \end{bmatrix}, \tag{A.1}$$

where $\mathbf{E} = \mathbf{C} - \mathbf{B}^{T}\mathbf{A}^{-1}\mathbf{B}$ and $\mathbf{D} = \mathbf{A} - \mathbf{B}\mathbf{C}^{-1}\mathbf{B}^{T}$.

Equation (11) can be derived in terms of Lemma 2. We assume here that the blocks $\mathbf{B}$ and $\mathbf{C}$ collect the $i$-th column and diagonal entry of $\mathbf{H}^{t}$, where $i$ is the feature to be deleted, so that $\mathbf{A} = \mathbf{H}^{t,-i}$. Thus, in our problem, $\mathbf{R}^{t,-i}$ corresponds to $\mathbf{A}^{-1}$, $\mathbf{R}^{t}_{ii}$

corresponds to $\mathbf{E}^{-1}$, and the remaining entries of $\mathbf{R}^{t}$ correspond to the other blocks of the right-hand side of equation (A.1). Observing the top-left and top-right blocks of the last part in equation (A.1), we have

$$\mathbf{A}^{-1} + \mathbf{A}^{-1}\mathbf{B}\mathbf{E}^{-1}\mathbf{B}^{T}\mathbf{A}^{-1} = \left[\mathbf{R}^{t}_{jk}\right]_{j,k \in P_t\setminus\{i\}} \tag{A.2}$$

and

$$-\mathbf{A}^{-1}\mathbf{B}\mathbf{E}^{-1} = \left[\mathbf{R}^{t}_{ji}\right]_{j \in P_t\setminus\{i\}}. \tag{A.3}$$

Hence, we have

$$\mathbf{R}^{t,-i}_{jk} = \left(\mathbf{A}^{-1}\right)_{jk} = \mathbf{R}^{t}_{jk} - \frac{\mathbf{R}^{t}_{ji}\,\mathbf{R}^{t}_{ik}}{\mathbf{R}^{t}_{ii}}, \quad j, k \in P_t \setminus \{i\}, \tag{A.4}$$

which is equation (11).
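As a quick numeric sanity check (ours, not part of the paper), the downdate formula (11) obtained from Lemma 2 can be compared against direct inversion of the reduced matrix:

```python
# Numeric check: the rank-one downdate (11) derived from Lemma 2 reproduces the inverse
# of H with the i-th row and column removed.
import numpy as np

rng = np.random.default_rng(1)
m, lam, i = 6, 0.5, 2
Z = rng.standard_normal((10, m))
H = Z.T @ Z + lam * np.eye(m)            # symmetric positive definite, as in Section 3.1
R = np.linalg.inv(H)

keep = np.arange(m) != i
R_down = (R - np.outer(R[:, i], R[i, :]) / R[i, i])[np.ix_(keep, keep)]   # equation (11)
R_direct = np.linalg.inv(H[np.ix_(keep, keep)])                           # invert H^{-i} directly

print(np.allclose(R_down, R_direct))     # expected output: True
```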
