Sparse Kernel Ridge Regression Using Backward Deletion
Ling Wang, Liefeng Bo, and Licheng Jiao
Institute of Intelligent Information Processing, Xidian University, Xi'an, China

Abstract. Based on the feature map principle, a Sparse Kernel Ridge Regression (SKRR) model is proposed. SKRR obtains sparseness through a backward deletion feature selection procedure that recursively removes the feature with the smallest leave-one-out score until a stop criterion is satisfied. Besides good generalization performance, the most compelling property of SKRR is that it is rather sparse; moreover, the kernel function need not be positive definite. Experiments on synthetic and benchmark data sets validate the feasibility and validity of SKRR.

1 Introduction

Regression is one of the fundamental problems in supervised learning. It can be thought of as estimating a real-valued function from a sample set of noisy observations. A very successful approach to regression is Support Vector Regression (SVR) [1-2], which attempts to simultaneously minimize the empirical risk and a confidence interval, leading to good generalization. Due to the ε-insensitive loss function, SVR obtains a sparse model (prediction for a new input only needs a subset of the training samples). Though very successful, SVR also has some disadvantages [3]: the solution is usually not very sparse, and the prediction speed for a new input is significantly slower than that of some other learning machines such as neural networks [4-5]. Furthermore, the kernel function must satisfy Mercer's condition. It is well known that different kernel functions induce different algorithms, achieving different performance; however, kernel functions in SVR must satisfy Mercer's positive definite condition, which limits the usable kernels. To deal with the problems mentioned above, the Relevance Vector Machine (RVM) [3] was proposed, which is very elegant and obtains a highly sparse solution. However, RVM needs to solve linear equations, whose cost is very expensive, and it is therefore not applicable to large scale problems. In this paper, we propose a new learning model, Sparse Kernel Ridge Regression (SKRR), to overcome the above problems.
In SKRR, samples are mapped into a feature space whose dimension is equal to the sample size, and then Ridge Regression (RR) [6] is implemented in that feature space. When the training process is completed,
a backward deletion feature selection procedure is applied to obtain a sparse model. Besides good generalization performance, the most compelling property of SKRR is its sparseness, which is comparable with that of RVM. Another advantage of SKRR is that the kernel function need not be positive definite. Experiments on synthetic and benchmark data sets assess the feasibility and validity of SKRR.

2 Kernel Ridge Regression

Let $\{(x_1, y_1), \ldots, (x_\ell, y_\ell)\}$ be an empirical sample set drawn from

$$y_i = f(x_i) + \varepsilon_i, \quad i = 1, \ldots, \ell, \qquad (1)$$

where each $y_i$ is corrupted by additive noise $\varepsilon_i$ whose distribution is usually unknown. Learning aims to infer the function $f(x)$ from the finite data set $\{(x_1, y_1), \ldots, (x_\ell, y_\ell)\}$. A classical method for this problem is Ridge Regression (RR), an extension of linear regression obtained by adding a quadratic penalty term:

$$\hat{w} = \arg\min_{w} \left\{ \sum_{i=1}^{\ell} \left( w^\top x_i + w_{n+1} - y_i \right)^2 + \lambda \left\| w \right\|^2 \right\}, \qquad (2)$$

where $\lambda$ is a fixed positive constant called the regularization coefficient. Equation (2) can be rewritten as

$$\hat{w} = \arg\min_{w} \left( w^\top \left( X^\top X + \lambda I \right) w - 2\, w^\top X^\top y \right), \qquad (3)$$

where

$$X = \begin{bmatrix} x_1^\top & 1 \\ \vdots & \vdots \\ x_\ell^\top & 1 \end{bmatrix}, \qquad w = \begin{bmatrix} w_1 \\ \vdots \\ w_{n+1} \end{bmatrix}.$$

Ridge regression [6] is a well-known approach to regression problems: it has generalization performance as good as SVR, the model does not require the kernel function to satisfy Mercer's condition, and there is an efficient leave-one-out cross-validation method for model selection. In order to make RR applicable to nonlinear problems, we generalize it by a feature map. Define the vector made up of the set of real-valued functions $\{ k(x, x_i) \}_{i=1}^{\ell}$:

$$z = \left[ k(x, x_1), \ldots, k(x, x_\ell) \right]^\top, \qquad (4)$$

where $k(x, x_i)$ is a kernel function that need not be positive definite. We call

$$F = \left\{ z \mid z = \left[ k(x, x_1), \ldots, k(x, x_\ell) \right]^\top,\ x \in \mathbb{R}^n \right\} \qquad (5)$$

the feature space. In particular, $\{(z_1, y_1), \ldots, (z_\ell, y_\ell)\}$ with $z_i = \left[ k(x_i, x_1), \ldots, k(x_i, x_\ell) \right]^\top$ are the empirical samples in the feature space. Substituting $z_i$ for $x_i$, we have Kernel Ridge Regression (KRR) [7] in the feature space $F$:
$$\hat{w} = \arg\min_{w} \left( w^\top \left( Z^\top Z + \lambda I \right) w - 2\, w^\top Z^\top y \right), \qquad (6)$$

where

$$Z = \begin{bmatrix} z_1^\top & 1 \\ \vdots & \vdots \\ z_\ell^\top & 1 \end{bmatrix}, \qquad w = \begin{bmatrix} w_1 \\ \vdots \\ w_{\ell+1} \end{bmatrix}.$$

The corresponding decision function is

$$f(x) = \sum_{i=1}^{\ell} w_i\, k(x, x_i) + w_{\ell+1}. \qquad (7)$$

From equations (6) and (7), we obtain

$$\hat{w} = \left( Z^\top Z + \lambda I \right)^{-1} Z^\top y. \qquad (8)$$

3 Sparse Kernel Ridge Regression

For a suitable $\lambda$, KRR has good generalization performance. Its key disadvantage is the lack of sparseness, which seems to prohibit its application in some fields. In this paper, we consider how to delete redundant features while simultaneously keeping good generalization performance at an acceptable computational cost.

3.1 Feature Selection by Backward Deletion

In order to obtain sparseness, a backward deletion procedure is implemented after the training process. The backward deletion procedure recursively removes the feature with the smallest leave-one-out score until the stop criterion is satisfied. Let $H = Z^\top Z + \lambda I$ and $b = Z^\top y$; then equation (6) can be rewritten as

$$\hat{w} = \arg\min_{w} \left( L(w) = w^\top H w - 2\, w^\top b \right). \qquad (9)$$

Suppose the $k$-th feature is deleted at the $t$-th iteration. Let $H^{(k)}$ represent the sub-matrix formed by omitting the $k$-th row and column of $H$, let $R^{(k)}$ represent the inverse of $H^{(k)}$, $w^{(k)}$ the corresponding weights, and $f^{(k)}$ the corresponding optimal value of $L$. We then have

$$f^{(k)} = -\sum_{i, j \in P - \{k\}} b_i\, R^{(k)}_{ij}\, b_j, \qquad (10)$$

where $P$ is the set of features (variables) remaining at the $t$-th iteration. In terms of a rank-1 update [8-9], $R^{(k)}$ and $w^{(k)}$ can be formulated as in equations (11) and (12) (see the Appendix for details), where $R$ represents the inverse of $H$:

$$R^{(k)}_{ij} = R_{ij} - \frac{R_{ik}\, R_{kj}}{R_{kk}}, \quad i, j \in P - \{k\}. \qquad (11)$$
Together with $w_i = \sum_{j \in P} R_{ij} b_j$, the weights after the deletion,

$$w^{(k)}_i = \sum_{j \in P - \{k\}} R^{(k)}_{ij}\, b_j, \quad i \in P - \{k\}, \qquad (12)$$

are simplified to

$$w^{(k)}_i = w_i - \frac{R_{ik}}{R_{kk}}\, w_k, \quad i \in P - \{k\}. \qquad (13)$$

Substituting equation (11) into (10), we obtain

$$f^{(k)} = -\sum_{i, j \in P - \{k\}} b_i R_{ij} b_j + \frac{1}{R_{kk}} \sum_{i \in P - \{k\}} b_i R_{ik} \sum_{j \in P - \{k\}} R_{kj} b_j. \qquad (14)$$

By virtue of $\sum_{j \in P} R_{kj} b_j = w_k$, equation (14) is translated into

$$\Delta f^{(k)} = f^{(k)} - f = \frac{w_k^2}{R_{kk}}. \qquad (15)$$

We call $\Delta f^{(k)}$ the leave-one-out score. At each iteration we remove the feature with the smallest leave-one-out score, so the feature to be deleted is obtained by

$$s = \arg\min_{k \in P - \{\ell+1\}} \left( \Delta f^{(k)} \right). \qquad (16)$$

Fig. 1. Distribution of the parameters $Z$, $w$, and $H$ after the $s$-th feature is deleted.

Figure 1 shows the distribution of the parameters after the $s$-th feature is deleted. Note that the $(\ell+1)$-th variable is the bias, which is retained throughout the feature selection process. At the $t$-th iteration, the total increase of the loss function $L$ is

$$\Delta f = -\sum_{i \in P} b_i w_i - L_{op}, \qquad (17)$$

where $L_{op}$ is the minimum of equation (9). We terminate the algorithm if
$$\Delta f \geq \varepsilon \left| L_{op} \right|, \qquad (18)$$

where $\varepsilon$ is a small positive number. According to the derivation above, Backward Deletion Feature Selection (BDFS) can be described as the following Algorithm 1:

Algorithm 1: BDFS
1. Set $P = \{1, 2, \ldots, \ell+1\}$, $R = H^{-1}$, $w = R b$;
2. For $t = 1$ to $\ell$, do:
3.   (a) $s = \arg\min_{k \in P - \{\ell+1\}} (w_k)^2 / R_{kk}$;
4.   (b) $R^{(s)}_{ij} = R_{ij} - R_{is} R_{sj} / R_{ss}$, $i, j \in P - \{s\}$;
5.   (c) $w^{(s)}_i = w_i - (R_{is}/R_{ss})\, w_s$, $i \in P - \{s\}$;
6.   (d) $P = P - \{s\}$, $R = R^{(s)}$, $w = w^{(s)}$;
7.   (e) If $b^\top w^{full} - \sum_{i \in P} b_i w_i \geq \varepsilon\, b^\top w^{full}$, stop, where $w^{full}$ denotes the weights of the full model computed in step 1;
8. End For
9. End Algorithm

3.2 Model Selection

There exist free parameters in SKRR, including the kernel parameter and the regularization parameter. In order to obtain good generalization, it is necessary to choose suitable values for them. Cross-validation is a model selection method often used to estimate the generalization performance of statistical classifiers, and 10-fold cross-validation is common in kernel-based learning algorithms such as SVMs. Leave-one-out cross-validation is the most extreme form of cross-validation. The leave-one-out cross-validation error is an attractive model selection criterion since it provides an almost unbiased estimator of generalization performance [2]. However, it is rarely adopted for kernel machines because it is computationally expensive. Fortunately, for SKRR there is an efficient implementation of leave-one-out cross-validation that only incurs a computational cost of $O(\ell^3)$ [10]. Let $E(\Gamma)$ be the leave-one-out cross-validation error, where $\Gamma$ denotes the free parameters.
Lemma 1 [11-12]. $E(\Gamma) = \frac{1}{\ell} \left\| B(\Gamma) \left( I - A(\Gamma) \right) y \right\|^2$, where $B(\Gamma)$ is a diagonal matrix whose $jj$-th entry is $\frac{1}{1 - \alpha_{jj}(\Gamma)}$, with $\alpha_{jj}(\Gamma)$ the $jj$-th entry of $A(\Gamma) = Z \left( Z^\top Z + \lambda I \right)^{-1} Z^\top$.

Let the singular value decomposition of $Z$ be

$$Z = U D V^\top, \qquad (19)$$

where $U$ and $V$ are orthogonal matrices and $D$ is a diagonal matrix. Substituting equation (19) into $A(\Gamma)$, it can be simplified as

$$A(\Gamma) = U D \left( D^\top D + \lambda I \right)^{-1} D^\top U^\top, \qquad (20)$$

where $D^\top D + \lambda I$ is a diagonal matrix. Hence, for different $\lambda$ we only need to perform the matrix decomposition once. For clarity, we give the detailed steps of the leave-one-out model selection algorithm, in which $\sigma$ represents the kernel parameter:

Algorithm 2: LOO Model Selection
For $\sigma_i$, $i = 1, 2, \ldots, q$:
  $Z = U D V^\top$;
  For $\lambda_j$, $j = 1, 2, \ldots, p$:
    $A = U D (D^\top D + \lambda_j I)^{-1} D^\top U^\top$;
    $B_{kk} = 1 / (1 - \alpha_{kk})$;
    $r(k) = B_{kk} (y_k - A_k y)$, $k = 1, \ldots, \ell$;
    $E_{ij} = \frac{1}{\ell} \sum_{k=1}^{\ell} r(k)^2$;
  End For
End For
$(\lambda^{op}, \sigma^{op}) = \arg\min_{j, i} (E_{ij})$.

Thus, according to the analysis in Sections 3.1 and 3.2, SKRR can be described as the following Algorithm 3:

Algorithm 3: SKRR
1. Let $\lambda$ be the noise variance, and choose the kernel parameters using the LOO model selection algorithm;
2. Train KRR using the selected parameters;
3. Implement BDFS for KRR;
4. Re-estimate $\lambda$ using the LOO model selection algorithm;
5. Re-train KRR on the simplified features.
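Algorithms 1-3 can be sketched end to end in a few dozen lines. The following is our minimal NumPy rendering, not the authors' code, under stated assumptions: a Gaussian kernel, the stop test of eq. (18) measured against $b^\top w$ of the full model, and a plain grid search in place of Algorithm 2's double loop; in step 4 of Algorithm 3 we simply re-use the selected $\lambda$ rather than re-estimating the noise variance. All function names are ours.

```python
# Hedged sketch of SKRR (ours, not the paper's code): KRR training via eq. (8),
# backward deletion via eqs. (11), (13), (15), and LOO model selection via
# Lemma 1 with the single-SVD simplification of eq. (20).
import numpy as np

def gram(X, C, sigma):
    # Gaussian kernel k(x, y) = exp(-||x - y||^2 / sigma^2)
    return np.exp(-((X[:, None, :] - C[None, :, :]) ** 2).sum(-1) / sigma ** 2)

def krr_fit(Z, y, lam):
    # Eq. (8): w = (Z'Z + lam*I)^{-1} Z'y
    return np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ y)

def loo_error(Z, y, lam):
    # Lemma 1 with eq. (20); one SVD of Z serves every lambda.
    U, d, _ = np.linalg.svd(Z, full_matrices=False)
    g = d ** 2 / (d ** 2 + lam)                  # diagonal of D(D'D + lam*I)^{-1}D'
    yhat = U @ (g * (U.T @ y))                   # A(Gamma) y
    alpha = (U ** 2 * g).sum(axis=1)             # diagonal entries alpha_jj of A
    return np.mean(((y - yhat) / (1.0 - alpha)) ** 2)

def bdfs(H, b, bias_idx, eps=1e-3):
    # Algorithm 1: delete the feature with the smallest LOO score w_k^2 / R_kk.
    R = np.linalg.inv(H)
    w = R @ b
    P = list(range(len(b)))
    full_bw = b @ w                              # b'w of the full model (= -L_op)
    while len(P) > 2:
        s = min((k for k in P if k != bias_idx), key=lambda k: w[k] ** 2 / R[k, k])
        w_new = w - R[:, s] * (w[s] / R[s, s])              # eq. (13)
        R_new = R - np.outer(R[:, s], R[s, :]) / R[s, s]    # eq. (11)
        P_new = [i for i in P if i != s]
        # Stop test in the spirit of eq. (18): cumulative loss increase vs. b'w.
        if full_bw - sum(b[i] * w_new[i] for i in P_new) > eps * full_bw:
            break
        P, R, w = P_new, R_new, w_new
    return P, w

def skrr(X, y, sigmas, lams, eps=1e-3):
    n = len(X)
    ones = np.ones((n, 1))
    sigma, lam = min(((s, l) for s in sigmas for l in lams),
                     key=lambda p: loo_error(np.hstack([gram(X, X, p[0]), ones]), y, p[1]))
    Z = np.hstack([gram(X, X, sigma), ones])
    P, _ = bdfs(Z.T @ Z + lam * np.eye(n + 1), Z.T @ y, bias_idx=n, eps=eps)
    keep = [i for i in P if i != n]              # surviving kernel features
    w = krr_fit(np.hstack([Z[:, keep], ones]), y, lam)      # retrain on kept features
    return keep, lambda Xn: gram(Xn, X[keep], sigma) @ w[:-1] + w[-1]

rng = np.random.default_rng(0)
X = np.linspace(-10, 10, 100)[:, None]           # no point falls exactly on 0
y = np.sin(X[:, 0]) / X[:, 0] + 0.1 * rng.standard_normal(100)
keep, predict = skrr(X, y, sigmas=[1.0, 2.0], lams=[1e-2, 1e-1])
Xt = np.linspace(-10, 10, 1000)[:, None]
mse = float(np.mean((predict(Xt) - np.sin(Xt[:, 0]) / Xt[:, 0]) ** 2))
```

On this toy run the retained kernel features play the role of the paper's support vectors; the exact count depends on $\varepsilon$ and the random seed.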
4 Simulation

In order to evaluate the performance of the proposed algorithm, we ran SKRR on three data sets: Sinc, Boston Housing, and Abalone, and compared its performance with that of KRR, SVR, and RVM. For the sake of comparison, the different algorithms use the same input sequence. The elements of the Gram matrix are constructed using a Gaussian kernel of the form $k(x, y) = \exp\left( -\left\| x - y \right\|^2 / \sigma^2 \right)$. For SVR and RVM, we utilize a 10-fold cross-validation procedure to choose the free parameters. For all data sets, the parameter $\varepsilon$ in BDFS is set to the same value; a large number of experiments show that this is a good selection.

4.1 Toy Experiment

The Sinc function $y = \sin(x)/x + \sigma N(0, 1)$ is a popular choice to illustrate support vector machine regression. We generate training samples from the Sinc function at 100 equally spaced $x$-values in $[-10, 10]$ with added Gaussian noise of standard deviation 0.1. Results are averaged over 100 random instantiations of the noise, with the error being measured over 1000 noise-free test samples in $[-10, 10]$. The decision functions and support vectors obtained by SKRR and by SVR with $\varepsilon = 0.1$ are shown in Figure 2. The number of support vectors and the mean square error are summarized in Table 1. For the Sinc data set, SKRR outperforms the other three algorithms. SKRR and RVM obtain a similar number of support vectors, significantly less than that of SVR.

Table 1. Generalization error obtained by the four algorithms on the Sinc data set. MSE denotes mean square error and NSV the number of support vectors (entries illegible in the source are marked "—").

          SKRR        KRR        SVR        RVM
MSE    0.85 ± —     — ± —      — ± —      — ± 0.47
NSV     7.0 ± —     — ± —      — ± —      — ± 0.38
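The toy setup can be reproduced in outline. The sketch below is ours: it generates the data exactly as described (100 equally spaced noisy training points, 1000 noise-free test points) and fits a plain KRR model via eq. (8); $\sigma = 1$ and $\lambda = 0.1$ are our guesses, not values reported by the paper.

```python
# Sinc toy experiment sketch (assumed hyperparameters: sigma = 1, lam = 0.1).
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-10, 10, 100)                    # none of these points is exactly 0
y = np.sin(x) / x + 0.1 * rng.standard_normal(100)
xt = np.linspace(-10, 10, 1000)
yt = np.sin(xt) / xt                             # noise-free test targets

K = np.exp(-(x[:, None] - x[None, :]) ** 2)      # Gaussian kernel, sigma = 1
Z = np.hstack([K, np.ones((100, 1))])
w = np.linalg.solve(Z.T @ Z + 0.1 * np.eye(101), Z.T @ y)    # eq. (8)
Kt = np.exp(-(xt[:, None] - x[None, :]) ** 2)
mse = float(np.mean((Kt @ w[:-1] + w[-1] - yt) ** 2))
```

Averaging this over 100 noise draws, as the paper does, only requires wrapping the lines above in a loop over seeds.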
Fig. 2. Support vectors and decision boundary obtained by SKRR and SVR. Solid red lines denote the decision boundary, dotted lines denote the Sinc function, and circles denote the support vectors.

4.2 Benchmark Comparison

The Boston Housing and Abalone data sets, which come from the STATLOG COLLECTION [13], are popular choices for evaluating the performance of algorithms. The Boston Housing data set includes 506 examples with 13 attributes each, and the Abalone data set includes 4177 examples with 7 attributes each. For the Boston Housing data set, we average our results over 100 random splits of the full data set into 481 training samples and 25 testing samples. For the Abalone data set, we average our results over 10 random splits of the full data set into 1000 training samples and 3177 testing samples. Before the experiments, we scale all training samples into $[-1, 1]$ and then adjust the testing samples using the same linear transformation. The results are summarized in Tables 2 and 3.

Table 2. Generalization error obtained by the four algorithms on the Boston Housing data set. MSE denotes mean square error and NSV the number of support vectors (entries illegible in the source are marked "—").

          SKRR        KRR        SVR        RVM
MSE     — ± —       — ± —      — ± —      — ± 6.8
NSV     5.3 ± —     — ± —      — ± —      — ± .49

From the experimental results (Tables 1, 2, and 3), we observe the following: SKRR, KRR, and RVM obtain similar performance and are slightly superior to SVR; both SKRR and RVM are rather sparse, and the number of support (relevance) vectors of SKRR and RVM is much less than that of SVR.

Table 3. Generalization error obtained by the four algorithms on the Abalone data set. MSE denotes mean square error and NSV the number of support vectors (entries illegible in the source are marked "—").

          SKRR        KRR        SVR        RVM
MSE    4.64 ± —     — ± —      — ± —      — ± 0.1
NSV     — ± —       — ± —      — ± —      — ± .45

5 Conclusion

Backward deletion feature selection provides a state-of-the-art tool that can delete redundant features while keeping good generalization performance at an acceptable computational cost. Based on BDFS, we propose SKRR, which obtains high
sparseness. Further applications of BDFS include condensing two-stage RBF networks. Application to linear regression is in progress.

Appendix

The matrix inversion formula for a symmetric matrix with block sub-matrices is given in the following lemma.

Lemma [8]: Given invertible matrices $A$ and $C$ and a matrix $B$, the following equality holds:

$$\begin{bmatrix} A & B \\ B^\top & C \end{bmatrix}^{-1} = \begin{bmatrix} A^{-1} + A^{-1} B E^{-1} B^\top A^{-1} & -D^{-1} B C^{-1} \\ -E^{-1} B^\top A^{-1} & C^{-1} + C^{-1} B^\top D^{-1} B C^{-1} \end{bmatrix}, \qquad (A.1)$$

where $D = A - B C^{-1} B^\top$ and $E = C - B^\top A^{-1} B$.

Equation (11) can be derived in terms of this lemma. We assume here that the block $C$ corresponds to the $k$-th feature to be deleted. Thus, in our problem, $R^{(k)}$ corresponds to $A^{-1}$, $R_{kk}$ corresponds to $E^{-1}$, and $R_{ik}$, $i \in P - \{k\}$, corresponds to $-A^{-1} B E^{-1}$. Observing the top left block of the last part of equation (A.1), we have

$$A^{-1} + A^{-1} B E^{-1}\, E\, E^{-1} B^\top A^{-1}, \qquad (A.2)$$

which corresponds to

$$R^{(k)}_{ij} + \frac{R_{ik}\, R_{kj}}{R_{kk}}, \quad i, j \in P - \{k\}. \qquad (A.3)$$

Hence, we have

$$R^{(k)}_{ij} = R_{ij} - \frac{R_{ik}\, R_{kj}}{R_{kk}}, \quad i, j \in P - \{k\}. \qquad (A.4)$$

Acknowledgments

This work was supported by the National Natural Science Foundation of China and by the National Grand Fundamental Research 973 Program of China under Grant No. 2001CB
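The block inversion identity (A.1) is easy to check numerically. The following sketch (ours, not from the paper) compares it against a direct inverse of a random symmetric positive definite matrix.

```python
# Numerical check of the block matrix inversion lemma (A.1).
import numpy as np

rng = np.random.default_rng(4)
M = rng.standard_normal((6, 6))
S = M @ M.T + 6 * np.eye(6)                  # random symmetric positive definite matrix
A, B, C = S[:4, :4], S[:4, 4:], S[4:, 4:]
Ai, Ci = np.linalg.inv(A), np.linalg.inv(C)
D = A - B @ Ci @ B.T                         # Schur complement of C
E = C - B.T @ Ai @ B                         # Schur complement of A
Di, Ei = np.linalg.inv(D), np.linalg.inv(E)
S_inv = np.block([
    [Ai + Ai @ B @ Ei @ B.T @ Ai, -Di @ B @ Ci],
    [-Ei @ B.T @ Ai, Ci + Ci @ B.T @ Di @ B @ Ci],
])
```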
References
1. Vapnik, V.: The Nature of Statistical Learning Theory. Springer-Verlag, New York (1995)
2. Vapnik, V.: Statistical Learning Theory. Wiley-Interscience, New York (1998)
3. Tipping, M.E.: The Relevance Vector Machine. In: Solla, S., Leen, T., and Müller, K.-R. (eds.): Advances in Neural Information Processing Systems 12. MIT Press, Cambridge, Mass. (2001)
4. Neal, R.: Bayesian Learning for Neural Networks. Springer-Verlag, New York (1996)
5. Ripley, B.D.: Pattern Recognition and Neural Networks. Cambridge University Press, Cambridge, U.K. (1996)
6. Hoerl, A. and Kennard, R.: Ridge Regression: Biased Estimation for Nonorthogonal Problems. Technometrics 12 (1970)
7. Saunders, C. and Gammerman, A.: Ridge Regression Learning Algorithm in Dual Variables. In: Shavlik, J. (ed.): Machine Learning: Proceedings of the 15th International Conference. Morgan Kaufmann (1998)
8. Stoer, J. and Bulirsch, R.: Introduction to Numerical Analysis. Second Edition. Springer-Verlag, New York (1993)
9. Bo, L.F., Wang, L., and Jiao, L.C.: Sparse Gaussian Processes Using Backward Elimination. Lecture Notes in Computer Science 3971 (2006)
10. Bo, L.F., Wang, L., and Jiao, L.C.: Feature Scaling for Kernel Fisher Discriminant Analysis Using Leave-One-Out Cross Validation. Neural Computation 18(4) (2006)
11. Allen, D.M.: The Relationship Between Variable Selection and Prediction. Technometrics 16 (1974)
12. Cawley, G.C.: Efficient Leave-One-Out Cross-Validation of Kernel Fisher Discriminant Classifiers. Pattern Recognition 36(11) (2003)
13. Michie, D., Spiegelhalter, D.J., and Taylor, C.C.: Machine Learning, Neural and Statistical Classification. Prentice Hall, Englewood Cliffs, N.J. (1994). Data available at anonymous ftp: ftp.ncc.up.pt/pub/statlog/
INF 43 3.. Repeon Anne Solberg (anne@f.uo.no Bayes rule for a classfcaon problem Suppose we have J, =,...J classes. s he class label for a pxel, and x s he observed feaure vecor. We can use Bayes rule
More informationA New Excitation Control for Multimachine Power Systems II: Robustness and Disturbance Attenuation Analysis
88 Inernaona Journa of Conro Hars Auomaon E Psaks and and Sysems Anono vo T Aexandrds no (speca edon) pp 88-95 June 5 A New Excaon Conro for umachne Power Sysems II: Robusness and Dsurbance Aenuaon Anayss
More informationDEEP UNFOLDING FOR MULTICHANNEL SOURCE SEPARATION SUPPLEMENTARY MATERIAL
DEEP UNFOLDING FOR MULTICHANNEL SOURCE SEPARATION SUPPLEMENTARY MATERIAL Sco Wsdom, John Hershey 2, Jonahan Le Roux 2, and Shnj Waanabe 2 Deparmen o Elecrcal Engneerng, Unversy o Washngon, Seale, WA, USA
More informationThe Analysis of the Thickness-predictive Model Based on the SVM Xiu-ming Zhao1,a,Yan Wang2,band Zhimin Bi3,c
h Naonal Conference on Elecrcal, Elecroncs and Compuer Engneerng (NCEECE The Analyss of he Thcknesspredcve Model Based on he SVM Xumng Zhao,a,Yan Wang,band Zhmn B,c School of Conrol Scence and Engneerng,
More informationRobust Output Tracking of Uncertain Large-Scale Input-Delay Systems via Decentralized Fuzzy Sliding Mode Control
Inernaona Journa of Conro Scence and Engneerng (6: 57-7 DOI:.593/.conro.6.4 Robus Oupu rackng of Unceran Large-Scae Inpu-Deay Sysems va Decenrazed Fuzzy Sdng Mode Conro Chang-Che ng Chang Deparmen of Eecrca
More informationEcon107 Applied Econometrics Topic 5: Specification: Choosing Independent Variables (Studenmund, Chapter 6)
Econ7 Appled Economercs Topc 5: Specfcaon: Choosng Independen Varables (Sudenmund, Chaper 6 Specfcaon errors ha we wll deal wh: wrong ndependen varable; wrong funconal form. Ths lecure deals wh wrong ndependen
More informationA Fully Distributed Reactive Power Optimization and Control Method for Active Distribution Networks
1 A Fuy Dsrbued Reacve Power Opmzaon Conro Mehod for Acve Dsrbuon Newors Weye Zheng, Wenchuan Wu, Senor Member, IEEE, Bomng Zhang, Feow, IEEE, Hongbn Sun, Senor Member, IEEE, Lu Ybng, Suden Member, IEEE
More informationDeriving the Dual. Prof. Bennett Math of Data Science 1/13/06
Dervng the Dua Prof. Bennett Math of Data Scence /3/06 Outne Ntty Grtty for SVM Revew Rdge Regresson LS-SVM=KRR Dua Dervaton Bas Issue Summary Ntty Grtty Need Dua of w, b, z w 2 2 mn st. ( x w ) = C z
More informationKayode Ayinde Department of Pure and Applied Mathematics, Ladoke Akintola University of Technology P. M. B. 4000, Ogbomoso, Oyo State, Nigeria
Journal of Mahemacs and Sascs 3 (4): 96-, 7 ISSN 549-3644 7 Scence Publcaons A Comparave Sudy of he Performances of he OLS and some GLS Esmaors when Sochasc egressors are boh Collnear and Correlaed wh
More informationSecure and Fast Digital Signatures using BCH Codes
0 IJCSNS Inernaona Journa of Compuer Scence and Nework Secury, VOL6 No0, Ocober 006 Secure and Fas Dga Sgnaures usng BCH Codes Omessaâd HAMDI *, Sam HARARI ** and Ammar BOUALLEGUE ***, (*),(***) SYSCOM
More informationAppendix to Online Clustering with Experts
A Appendx o Onlne Cluserng wh Expers Furher dscusson of expermens. Here we furher dscuss expermenal resuls repored n he paper. Ineresngly, we observe ha OCE (and n parcular Learn- ) racks he bes exper
More informationLecture 18: The Laplace Transform (See Sections and 14.7 in Boas)
Lecure 8: The Lalace Transform (See Secons 88- and 47 n Boas) Recall ha our bg-cure goal s he analyss of he dfferenal equaon, ax bx cx F, where we emloy varous exansons for he drvng funcon F deendng on
More informationUsing Fuzzy Pattern Recognition to Detect Unknown Malicious Executables Code
Usng Fuzzy Paern Recognon o Deec Unknown Malcous Execuables Code Boyun Zhang,, Janpng Yn, and Jngbo Hao School of Compuer Scence, Naonal Unversy of Defense Technology, Changsha 40073, Chna hnxzby@yahoo.com.cn
More informationTSS = SST + SSE An orthogonal partition of the total SS
ANOVA: Topc 4. Orhogonal conrass [ST&D p. 183] H 0 : µ 1 = µ =... = µ H 1 : The mean of a leas one reamen group s dfferen To es hs hypohess, a basc ANOVA allocaes he varaon among reamen means (SST) equally
More informationMechanics Physics 151
Mechancs Physcs 5 Lecure 9 Hamlonan Equaons of Moon (Chaper 8) Wha We Dd Las Tme Consruced Hamlonan formalsm H ( q, p, ) = q p L( q, q, ) H p = q H q = p H = L Equvalen o Lagrangan formalsm Smpler, bu
More informationMechanics Physics 151
Mechancs Physcs 5 Lecure 9 Hamlonan Equaons of Moon (Chaper 8) Wha We Dd Las Tme Consruced Hamlonan formalsm Hqp (,,) = qp Lqq (,,) H p = q H q = p H L = Equvalen o Lagrangan formalsm Smpler, bu wce as
More informationShould Exact Index Numbers have Standard Errors? Theory and Application to Asian Growth
Should Exac Index umbers have Sandard Errors? Theory and Applcaon o Asan Growh Rober C. Feensra Marshall B. Rensdorf ovember 003 Proof of Proposon APPEDIX () Frs, we wll derve he convenonal Sao-Vara prce
More informationFall 2009 Social Sciences 7418 University of Wisconsin-Madison. Problem Set 2 Answers (4) (6) di = D (10)
Publc Affars 974 Menze D. Chnn Fall 2009 Socal Scences 7418 Unversy of Wsconsn-Madson Problem Se 2 Answers Due n lecure on Thursday, November 12. " Box n" your answers o he algebrac quesons. 1. Consder
More informationLi An-Ping. Beijing , P.R.China
A New Type of Cpher: DICING_csb L An-Png Bejng 100085, P.R.Chna apl0001@sna.com Absrac: In hs paper, we wll propose a new ype of cpher named DICING_csb, whch s derved from our prevous sream cpher DICING.
More informationScattering at an Interface: Oblique Incidence
Course Insrucor Dr. Raymond C. Rumpf Offce: A 337 Phone: (915) 747 6958 E Mal: rcrumpf@uep.edu EE 4347 Appled Elecromagnecs Topc 3g Scaerng a an Inerface: Oblque Incdence Scaerng These Oblque noes may
More informationLinear Response Theory: The connection between QFT and experiments
Phys540.nb 39 3 Lnear Response Theory: The connecon beween QFT and expermens 3.1. Basc conceps and deas Q: ow do we measure he conducvy of a meal? A: we frs nroduce a weak elecrc feld E, and hen measure
More informationGMM parameter estimation. Xiaoye Lu CMPS290c Final Project
GMM paraeer esaon Xaoye Lu M290c Fnal rojec GMM nroducon Gaussan ure Model obnaon of several gaussan coponens Noaon: For each Gaussan dsrbuon:, s he ean and covarance ar. A GMM h ures(coponens): p ( 2π
More informationThis document is downloaded from DR-NTU, Nanyang Technological University Library, Singapore.
Ths documen s downloaded from DR-NTU, Nanyang Technologcal Unversy Lbrary, Sngapore. Tle A smplfed verb machng algorhm for word paron n vsual speech processng( Acceped verson ) Auhor(s) Foo, Say We; Yong,
More informationCHAPTER FOUR REPEATED MEASURES IN TOXICITY TESTING
CHAPTER FOUR REPEATED MEASURES IN TOXICITY TESTING 4. Inroducon The repeaed measures sudy s a very commonly used expermenal desgn n oxcy esng because no only allows one o nvesgae he effecs of he oxcans,
More informationMixtures Experiments with Mixing Errors
Mures Epermens h Mng Errors Aaa Ahuba & Aeander N. Donev Frs verson: June 009 Research Repor No. 0, 009, Probaby and Sascs Group Schoo o Mahemacs, he Unversy o Mancheser Mures Epermens h Mng Errors Aaa
More informationCS286.2 Lecture 14: Quantum de Finetti Theorems II
CS286.2 Lecure 14: Quanum de Fne Theorems II Scrbe: Mara Okounkova 1 Saemen of he heorem Recall he las saemen of he quanum de Fne heorem from he prevous lecure. Theorem 1 Quanum de Fne). Le ρ Dens C 2
More informationOnline Appendix for. Strategic safety stocks in supply chains with evolving forecasts
Onlne Appendx for Sraegc safey socs n supply chans wh evolvng forecass Tor Schoenmeyr Sephen C. Graves Opsolar, Inc. 332 Hunwood Avenue Hayward, CA 94544 A. P. Sloan School of Managemen Massachuses Insue
More informationNeural Networks-Based Time Series Prediction Using Long and Short Term Dependence in the Learning Process
Neural Neworks-Based Tme Seres Predcon Usng Long and Shor Term Dependence n he Learnng Process J. Puchea, D. Paño and B. Kuchen, Absrac In hs work a feedforward neural neworksbased nonlnear auoregresson
More information5th International Conference on Advanced Design and Manufacturing Engineering (ICADME 2015)
5h Inernaonal onference on Advanced Desgn and Manufacurng Engneerng (IADME 5 The Falure Rae Expermenal Sudy of Specal N Machne Tool hunshan He, a, *, La Pan,b and Bng Hu 3,c,,3 ollege of Mechancal and
More informationMARKOV CHAIN AND HIDDEN MARKOV MODEL
MARKOV CHAIN AND HIDDEN MARKOV MODEL JIAN ZHANG JIANZHAN@STAT.PURDUE.EDU Markov chan and hdden Markov mode are probaby the smpest modes whch can be used to mode sequenta data,.e. data sampes whch are not
More information