Machine Learning Linear Regression


1 Machine Learning: Linear Regression (Lesson 3). Computer Science & Engineering, University of Ioannina, Machine Learning 2017.

2 Linear Regression outline: basics of regression; least squares estimation; polynomial regression; basis-function regression models; regularized regression; statistical regression; Maximum Likelihood (ML) estimation; Maximum A-Posteriori (MAP) estimation; Bayesian regression.

3 Basics in Regression. Input: a set of N tuples D = {(x_i, t_i)}, i = 1, ..., N, where x_i in R^d is the i-th sample and t_i in R is the prediction (target), i.e. a value of an unknown function over the features of x_i. Regression is a supervised technique. Goal: create a function y(x; θ), in one dimension y: R -> R and in d dimensions y: R^d -> R, such that y(x_i; θ) ≈ t_i over D, where θ is the (unknown) set of parameters.

4 Graphical Example of Regression. [figure: training points (x_i, t_i) and a query input x* whose target t* is unknown]

5 Graphical Example of Regression. [figure: a curve y(x, θ) fitted through the training points]

6 Graphical Example of Regression. [figure: the fitted curve y(x, θ) used to predict the target t* at the query input x*]

7 Linear regression: the case of 1-dimensional data. Input dataset D = {(x_i, t_i)}, i = 1, ..., N, with x_i in R (one feature). [figure: target t versus input x]

8 Linear regression: the case of 1-dimensional data. Input dataset D = {(x_i, t_i)}, i = 1, ..., N, with x_i in R (one feature). Predictor: evaluate the line y(x, w) = w_0 + w_1 x. [figure: fitted line over the data]

9 Linear regression: the case of 1-dimensional data. Input dataset D = {(x_i, t_i)}, i = 1, ..., N. Learning: estimate the regression coefficients {w_0, w_1}, which are the weights of the linear equation y(x, w) = w_0 + w_1 x, so that y(x_i, w) ≈ t_i over D.

10 Least squares linear fit to data. The most popular estimation method is least squares. Determine the linear weights w that minimize the sum of squared loss (SSL): J(w) = Σ_i (t_i - w_0 - w_1 x_i)². Use standard differential calculus: differentiate the SSL with respect to w_0 and w_1, find the zeros of each partial derivative, and solve for w_0, w_1.

11 Derivatives with respect to the parameters: setting ∂J/∂w_0 = 0 and ∂J/∂w_1 = 0 gives the closed-form estimates ŵ_1 = cov(x, t) / var(x) and ŵ_0 = mean(t) - ŵ_1 mean(x).
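
As a quick illustration of this 1-D closed form, the following sketch (NumPy, with made-up toy data) computes ŵ_1 = cov(x, t)/var(x) and ŵ_0 = mean(t) - ŵ_1 mean(x).

```python
import numpy as np

# Toy 1-D data (illustrative only): t is roughly linear in x plus noise.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
t = 2.0 + 0.5 * x + rng.normal(0, 0.3, size=50)

# Closed-form least-squares estimates for y(x, w) = w0 + w1 * x.
w1 = np.cov(x, t, bias=True)[0, 1] / np.var(x)   # cov(x, t) / var(x)
w0 = t.mean() - w1 * x.mean()                    # mean(t) - w1 * mean(x)
print(w0, w1)   # close to (2.0, 0.5)
```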

12 More dimensions (d > 1): x in R^d. The input has d features, x = (x_1, ..., x_d), so there are d+1 linear weights describing the regression function y(x, w).

13 Linear regression model: y(x, w) = w_0 + w_1 x_1 + ... + w_d x_d. Alternative representation: y(x, w) = Σ_{j=0}^{d} w_j x̃_j = w^T x̃, where x̃ = (1, x_1, ..., x_d).

14 Sum of Squared Error (SSE). How can we quantify the error? J(w) = Σ_{i=1}^{N} (t_i - y(x_i, w))² = Σ_i (t_i - w^T x̃_i)².

15 Sum of Squared Error (SSE). Prediction y(x_i, w), observation t_i; the error or residual is e_i = t_i - y(x_i, w). J(w) = Σ_i (t_i - w^T x̃_i)². [figure: residuals between the observations and the fitted line]

16 How can we quantify the error? In matrix form, stacking the inputs x̃_i^T as rows of the N x (d+1) design matrix X and the targets into the vector t: J(w) = (t - Xw)^T (t - Xw) = ||t - Xw||².

17 Finding good parameters. We want to find the parameters that minimize the error. Think of a cost surface: the error residual J(w) for each value of w. ŵ = arg min_w J(w). [figure: cost surface over (w_0, w_1)]

18 SSE Minimization. Consider a simple problem: one feature (d = 1) and two data points (N = 2). Two unknowns w_0, w_1 and two equations t_1 = w_0 + w_1 x_1, t_2 = w_0 + w_1 x_2. We can solve this system directly (X is 2 x 2): ŵ = X^{-1} t. However, most of the time N > d+1 and there may be no linear function that hits all the data exactly. Instead, solve directly for the minimum of the SSE function.

19 Reordering, we take the general case: ∇_w J(w) = -2 X^T (t - Xw) = 0, so X^T X ŵ = X^T t and ŵ_LS = (X^T X)^{-1} X^T t, the Least Squares estimator. (X^T X)^{-1} X^T is called the pseudo-inverse of X. If X is square and its columns are independent, this is just the inverse X^{-1}.
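
A minimal sketch of this least-squares estimator, assuming the design matrix X already contains a column of ones; solving the normal equations is numerically preferable to forming the pseudo-inverse explicitly.

```python
import numpy as np

def least_squares(X, t):
    """Solve the normal equations X^T X w = X^T t for the weight vector w."""
    return np.linalg.solve(X.T @ X, X.T @ t)

# Example with synthetic data: N = 100 points, intercept plus two features.
rng = np.random.default_rng(1)
X = np.hstack([np.ones((100, 1)), rng.normal(size=(100, 2))])
t = X @ np.array([1.0, 2.0, -3.0]) + rng.normal(0, 0.1, size=100)
print(least_squares(X, t))   # approximately [1, 2, -3]
```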

20 Optimization methods for the SSE. Even when X^T X is invertible, it might be computationally expensive to invert if X is huge. Treat the problem as an optimization problem instead: how to find the estimator ŵ = arg min_w J(w)? J(w) is convex in w.

21 Gradient Descent Optimization (Steepest Descent). A function decreases most quickly in the direction of the negative gradient. Update: w^(new) = w^(old) - η ∇_w J(w^(old)), where η is the learning rate (or step-size).

22 Effect of the step-size (η). Large η: fast convergence but larger residual error; may cause oscillations. Small η: slow convergence but small residual error.

23 1. Gradient descent optimization scheme. Since J(w) is convex, move along the negative of the gradient. Initialize: w^(0). Update rule: w^(new) = w^(old) + η X^T (t - X w^(old)). Stop when some criterion is met, e.g. a fixed number of iterations, or when the gradient ∇_w J(w^(old)) becomes sufficiently small.
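
A sketch of this batch gradient-descent scheme; the step-size η, iteration count, and toy data are illustrative assumptions (η must be small enough relative to the largest eigenvalue of X^T X for the iteration to converge).

```python
import numpy as np

def gd_least_squares(X, t, eta=0.005, n_iters=2000):
    """Batch gradient descent on J(w) = ||t - Xw||^2 (sum over all examples)."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iters):
        w = w + eta * X.T @ (t - X @ w)   # step along the negative gradient
    return w

# Synthetic demo data.
rng = np.random.default_rng(2)
X = np.hstack([np.ones((100, 1)), rng.normal(size=(100, 2))])
t = X @ np.array([1.0, 2.0, -3.0]) + rng.normal(0, 0.1, size=100)
print(gd_least_squares(X, t))   # approaches the least-squares solution
```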

24 2. Stochastic Gradient descent. The previous scheme was a batch gradient descent: all training examples participate at every step, w^(new) = w^(old) + η X^T (t - X w^(old)). The stochastic gradient descent scheme repeatedly examines a single example at every step: for i = 1, ..., N: w^(new) = w^(old) + η (t_i - w^(old)T x̃_i) x̃_i. Advantage: it often gets close to the minimum much faster than batch mode, and it is preferred over batch when the training set is large.
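
A corresponding sketch of the stochastic update, where each step uses a single example (x_i, t_i); the per-example step-size and the number of passes over the data are assumptions for the illustration. On the same synthetic X, t as the batch example above it reaches a comparable estimate.

```python
import numpy as np

def sgd_least_squares(X, t, eta=0.01, n_epochs=50, seed=0):
    """Stochastic gradient descent: one training example per update."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(n_epochs):
        for i in rng.permutation(len(t)):           # visit examples in random order
            w = w + eta * (t[i] - X[i] @ w) * X[i]  # update from the i-th example only
    return w
```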

25 3. Newton-Raphson optimization scheme. Update rule: w^(new) = w^(old) - H^{-1} ∇_w J(w^(old)), where H is the Hessian matrix (second derivatives of J(w)): H = 2 X^T X and ∇_w J(w) = -2 X^T (t - Xw). Substituting into the update rule we obtain w^(new) = w^(old) + (X^T X)^{-1} X^T (t - X w^(old)) = (X^T X)^{-1} X^T t. This is the least-squares solution: it is the exact solution, reached in one step.

26 Effects of the SSE (L2 error) choice: sensitivity to outliers (a large cost for a single outlying datum) and a heavy penalty for large errors. [figure: quadratic loss versus residual]

27 Use the sum of absolute errors (SAE) (L1 error): J(w) = Σ_i |t_i - y(x_i, w)|. [figure: L2 fit on the original data]

28 Use the sum of absolute errors (SAE) (L1 error): J(w) = Σ_i |t_i - y(x_i, w)|. [figure: L2 and L1 fits on the original data]

29 Use the sum of absolute errors (SAE) (L1 error): J(w) = Σ_i |t_i - y(x_i, w)|. [figure: L2 and L1 fits on the original data and on data with an outlier] L1 is more robust to outliers. However, the solution is unstable and the optimization problem is harder (use pattern search schemes, e.g. SIMPLEX).

30 Non-linear Regression. Consider non-linear regression, e.g. higher-order polynomials. [figures: polynomial fits of increasing order on the same data]

31 Polynomial Regression: y(x, w) = Σ_{k=0}^{d} w_k x^k, and J(w) = Σ_i (t_i - Σ_{k=0}^{d} w_k x_i^k)².

32 Example: use a 3rd-degree polynomial. Single feature x, predict target t: D = {(x_i, t_i)}. Add features, D = {((x_i, x_i², x_i³), t_i)}, and perform linear regression in the new features. It is sometimes useful to think of this as a feature transform φ(x) = [1, x, x², x³]^T with y(x, w) = w^T φ(x).

33 Learning the polynomial regression coefficients. Using least squares: ŵ_LS = (Φ^T Φ)^{-1} Φ^T t, where Φ is the design matrix of transformed inputs. Using the gradient descent update rule: w^(new) = w^(old) + η Φ^T (t - Φ w^(old)).
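
A sketch of this feature-transform view: build the polynomial design matrix Φ with columns [1, x, x², x³] and reuse the ordinary least-squares solution (the toy data are an assumption for the example).

```python
import numpy as np

def poly_design_matrix(x, degree):
    """Columns [1, x, x^2, ..., x^degree] for a 1-D input vector x."""
    return np.vander(x, degree + 1, increasing=True)

rng = np.random.default_rng(3)
x = np.linspace(0, 1, 30)
t = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, size=x.shape)

Phi = poly_design_matrix(x, degree=3)
w_ls = np.linalg.solve(Phi.T @ Phi, Phi.T @ t)   # linear regression in the new features
print(w_ls)
```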

34 Linear Basis Function Models. In general, we can use any feature set we think is useful: other information about the problem (e.g. location, age, ...), polynomial functions (features [1, x, x², x³, ...]), or other functions (e.g. 1/x, sqrt(x), products x_j x_k, ...). The regression remains linear, i.e. linear in the parameters; the features can be made as complex as we want! y(x, w) = Σ_{j=1}^{m} w_j φ_j(x), where the φ_j: R^d -> R are the basis functions.

35 Examples of Basis Functions. Polynomial basis functions: these are global, so a small change in x affects all basis functions. [figure: polynomial basis functions]

36 Gaussian basis functions, φ_j(x) = exp(-(x - μ_j)² / (2s²)): these are local, so a small change in x only affects nearby basis functions. The parameters μ_j and s control location and scale (width). Related to kernel methods. [figure: Gaussian basis functions]
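
A sketch of a Gaussian basis expansion of this kind, with assumed centers μ_j on an evenly spaced grid and a shared width s.

```python
import numpy as np

def gaussian_basis(x, centers, s):
    """Design matrix with a bias column plus one Gaussian bump per center."""
    # phi_j(x) = exp(-(x - mu_j)^2 / (2 s^2)): local, controlled by mu_j and s.
    bumps = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * s ** 2))
    return np.hstack([np.ones((len(x), 1)), bumps])

x = np.linspace(0, 1, 30)
Phi = gaussian_basis(x, centers=np.linspace(0, 1, 9), s=0.1)
print(Phi.shape)   # (30, 10): bias + 9 local basis functions
```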

37 Sigmoid basis functions: φ_j(x) = σ((x - μ_j)/s), where σ(a) = 1/(1 + e^{-a}). These are also local: a small change in x only affects nearby basis functions. The parameters μ_j and s control location and scale (slope). [figure: sigmoid basis functions]

38 Additive models. The basis functions can capture various properties of the inputs (e.g. qualitative ones); such models are called additive models. For example, we can try to rate documents based on text descriptors.

39 We can view the additive models graphically in terms of simple units and weights. [figure: network diagram of basis functions feeding a weighted sum]

40 Overfitting. [figure: low-order polynomial fits to the training data]

41 Overfitting. [figure: order-3 and order-9 polynomial fits to the training data]

42 Overfitting and complexity. More complex models will always fit the training data better, but they may overfit the training data, learning complex relationships that are not really present. [figure: simple model versus complex model]

43 Training versus test error. Plot the mean squared error (MSE) as a function of model complexity (polynomial order). The training error decreases with the order: a more complex function fits the training data better. What about new (test) data? At low order the test error also decreases (underfitting); at higher order the test error increases (overfitting). [figure: training and test MSE versus polynomial order]

44 Dealing with Overfitting: use more data; use a tuning (validation) set; regularization; be a Bayesian.

45 1. Avoiding overfitting: Cross-validation. Cross-validation allows us to estimate the generalization error based on the training examples alone.

46 Leave-one-out cross-validation treats each training example in turn as a test example: CV = (1/N) Σ_{i=1}^{N} (t_i - y(x_i, ŵ^(-i)))², where ŵ^(-i) are the least squares estimates of the parameters computed without the i-th training example.
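
A direct, refit-per-fold sketch of this leave-one-out estimate. For plain least squares a cheaper closed form based on the hat matrix exists, but the brute-force version below mirrors the definition exactly.

```python
import numpy as np

def loo_cv_error(Phi, t):
    """Leave-one-out cross-validation error for least-squares regression."""
    n = len(t)
    errors = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i                      # drop the i-th example
        w = np.linalg.solve(Phi[keep].T @ Phi[keep], Phi[keep].T @ t[keep])
        errors[i] = (t[i] - Phi[i] @ w) ** 2          # error on the held-out example
    return errors.mean()
```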

47 Polynomial Regression example. [figure]

48 2. Regularization. Overfitting is reduced by imposing a constraint on the overall magnitude of the parameters. The objective function is modified to J(w) = E_D(w) + λ E_W(w), i.e. data term + regularization term, where λ is the regularization parameter and E_W(w) sets constraints on the linear weights.

49 L2-Regularization or Ridge Regression. The regularization term is quadratic and penalizes large weights: E_W(w) = Σ_{j=1}^{M} w_j². Objective function: J(w) = Σ_i (t_i - w^T φ(x_i))² + λ Σ_{j=1}^{M} w_j².

50 Regularization derivation: ∇_w J(w) = -2 Φ^T (t - Φw) + 2λ w = 0, so ŵ = (Φ^T Φ + λ I)^{-1} Φ^T t, where I is the (m x m) identity matrix. This is ridge regression, or weight decay. Since the squared-weights penalty is compatible with the squared error function, we get a nice closed-form solution for the optimal weights.
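
A sketch of this ridge closed form; the regularization parameter λ is a free hyperparameter, and with lam = 0 the function reduces to ordinary least squares.

```python
import numpy as np

def ridge_regression(Phi, t, lam):
    """Closed-form ridge solution (Phi^T Phi + lam * I)^{-1} Phi^T t."""
    m = Phi.shape[1]
    return np.linalg.solve(Phi.T @ Phi + lam * np.eye(m), Phi.T @ t)
```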

51 Several regularized regression models: more general regularizers of the form E_W(w) = Σ_{j=1}^{m} |w_j|^q. [figure: contours of the regularizer for different values of q]

52 L1 Regularization (LASSO). The regularization term is E_W(w) = Σ_{j=1}^{M} |w_j|. Objective function: J(w) = E_D(w) + λ Σ_{j=1}^{M} |w_j|. It has the ability to create sparse models that are more easily interpreted.

53 Constrained formulation of the L1 regularization: min_w Σ_i (t_i - w^T φ(x_i))² subject to Σ_j |w_j| ≤ η, η ≥ 0. This is the Least Absolute Shrinkage and Selection Operator (LASSO). Several optimization techniques exist for solving the problem.

54 An optimization scheme: sequentially added sign constraints. Solve min_w ||t - Xw||² subject to sign constraints on the elements of w, examining all possible combinations of the signs of the elements of w. Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society, Series B, Vol. 58, No. 1, pages 267-288.

55 L2 vs. L1 regularization. [figure: constraint regions (circle for L2, diamond for L1) and the error contours]

56 L2 vs. L1 regularization. L2-regularized regression: quadratic regularization has the advantage that the solution is in closed form (this does not hold for non-quadratic regularizers); L2 regularization shrinks coefficients towards (but not to) zero, and towards each other. L1-regularized regression: the lasso is a convex optimization problem solved by quadratic programming or other convex optimization methods; L1 regularization shrinks coefficients to zero at different rates, so different values of λ give models with different subsets of features.

57 Model Likelihood and Empirical Risk: two related but distinct ways to look at a model. Empirical risk: how much error does the model have on the training data? Model likelihood: what is the likelihood that a model generated the observed data?

58 Statistical view of linear regression. Assumption (generative model): observed output = constructive function + stochastic noise, t = y(x, w) + ε. Whatever we cannot capture with our chosen family of functions will be interpreted as noise.

59 Statistical view of linear regression: t = y(x, w) + ε. If we consider Gaussian noise ε ~ N(0, β^{-1}), with β the precision (inverse variance), then we introduce the stochastic model t ~ N(y(x, w), β^{-1}).

60 Model Likelihood. Training set D = {(x_i, t_i)}, i = 1, ..., N. Generative model: p(t_i | x_i, w, β) = N(t_i | y(x_i, w), β^{-1}). Likelihood function, assuming independently identically distributed (iid) data: p(D | w, β) = Π_{i=1}^{N} p(t_i | x_i, w, β) = Π_i (β/2π)^{1/2} exp(-(β/2)(t_i - y(x_i, w))²).

61 Log-likelihood: L(w, β) = ln p(D | w, β) = Σ_i ln p(t_i | x_i, w, β) = (N/2) ln β - (N/2) ln(2π) - (β/2) Σ_i (t_i - y(x_i, w))², assuming a linear regression model of m basis functions, y(x, w) = w^T φ(x).

62 Maximum Likelihood (ML) estimation. Partial derivative with respect to the linear weights w: ∂L/∂w = β Σ_i (t_i - w^T φ(x_i)) φ(x_i) = 0, which gives ŵ_ML = (Φ^T Φ)^{-1} Φ^T t = ŵ_LS, i.e. the ML estimate coincides with the least squares estimate.

63 Maximum Likelihood (ML) estimation. Partial derivative with respect to the inverse variance β: ∂L/∂β = N/(2β) - (1/2) Σ_i (t_i - ŵ_ML^T φ(x_i))² = 0, which gives 1/β̂_ML = (1/N) Σ_i (t_i - ŵ_ML^T φ(x_i))², the mean prediction squared error.
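
A sketch of these maximum-likelihood estimates: the weights coincide with least squares and the noise precision β is the reciprocal of the mean squared residual.

```python
import numpy as np

def ml_estimates(Phi, t):
    """Maximum-likelihood estimates of the weights w and the noise precision beta."""
    w_ml = np.linalg.solve(Phi.T @ Phi, Phi.T @ t)   # identical to least squares
    residuals = t - Phi @ w_ml
    beta_ml = 1.0 / np.mean(residuals ** 2)          # 1/beta = mean squared error
    return w_ml, beta_ml
```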

64 MAP estimation of Linear Regression. The MAP derivation of linear regression assumes a prior distribution over the linear weights w: p(w) = N(w | 0, α^{-1} I), where α is a hyperparameter over w and is the precision (inverse variance) of the distribution. Then the posterior distribution is obtained as p(w | t, X, α, β) ∝ p(t | X, w, β) p(w).

65 Optimize the Bayesian posterior p(w | t, X, α, β) ∝ p(t | X, w, β) p(w). Set the MAP log-likelihood function: L_MAP(w) = ln p(t | X, w, β) + ln p(w) = likelihood + prior.

66 Optimize the Bayesian posterior. The likelihood term is the ML log-likelihood: ln p(t | X, w, β) = (N/2) ln β - (N/2) ln(2π) - (β/2) Σ_i (t_i - w^T φ(x_i))².

67 Optimize the Bayesian posterior. The prior term is: ln p(w) = ln N(w | 0, α^{-1} I) = (M/2) ln α - (M/2) ln(2π) - (α/2) w^T w.

68 Optimize the Bayesian posterior. Ignoring terms that do not depend on w: L_MAP(w) = -(β/2) Σ_i (t_i - w^T φ(x_i))² - (α/2) w^T w. MAP estimation of the linear weights: setting ∂L_MAP/∂w = 0 gives ŵ_MAP = (Φ^T Φ + (α/β) I)^{-1} Φ^T t. Thus regularized (ridge) L2 regression reflects a zero-mean isotropic Gaussian prior on the weights, with λ = α/β.

69 A summary of the Linear Regression techniques. Deterministic approaches: least squares, minimizing J(w) = Σ_i (t_i - w^T φ(x_i))². Statistical approaches: Maximum Likelihood (ML), with t_i ~ N(w^T φ(x_i), β^{-1}).

70 A summary of the Linear Regression techniques. Deterministic approaches: least squares; L2 regularized (ridge), J(w) = Σ_i (t_i - w^T φ(x_i))² + λ Σ_j w_j². Statistical approaches: Maximum Likelihood (ML); MAP with a Gaussian prior p(w) = N(w | 0, α^{-1} I), maximizing ln p(t | X, w, β) + ln p(w).

71 A summary of the Linear Regression techniques. Deterministic approaches: least squares; L2 regularized (ridge), penalty λ Σ_j w_j²; L1 regularized (lasso), penalty λ Σ_j |w_j|. Statistical approaches: Maximum Likelihood (ML); MAP with a Gaussian prior; MAP with a Laplacian prior, p(w) ∝ C exp(-α Σ_j |w_j|).

72 Hierarchical Bayesian model: the weights receive a prior w ~ N(0, α^{-1} I), the targets follow t ~ N(Xw, β^{-1} I), and the hyperparameter α itself receives a hyperprior p(α | a, b). [figure: graphical model]

73 Alternative Regression models. Elastic net: a composite prior for the weights, penalizing by weighted L1 and L2 norms, p(w) ∝ C exp(-λ_1 Σ_j |w_j| - λ_2 Σ_j w_j²). Weighted Least Squares: assign to each case (x_i, t_i) a weight v_i ≥ 0 (the higher, the more important the case); then J(w) = Σ_i v_i (t_i - w^T x̃_i)² = (t - Xw)^T V (t - Xw) with V = diag(v_1, ..., v_N), and ŵ_WLS = (X^T V X)^{-1} X^T V t.
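
A sketch of the weighted least-squares estimator above, taking the per-case weights v_i ≥ 0 as a vector.

```python
import numpy as np

def weighted_least_squares(X, t, v):
    """Solve (X^T V X) w = X^T V t with V = diag(v), v_i >= 0."""
    XtV = X.T * v                     # scales each case (column of X^T) by its weight
    return np.linalg.solve(XtV @ X, XtV @ t)
```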

74 Bayesian Linear Regression. The previous MLE or MAP derivations of linear regression use point estimates for the weight vector w. Bayesian modeling estimates the posterior distribution of the weights after receiving all observations. This allows us to find the distribution of the target of the next incoming input, and thus make predictions.

75 Generative model: p(t_i | x_i, w, β) = N(t_i | w^T φ(x_i), β^{-1}). Let the observations be D = {(x_i, t_i)}, i = 1, ..., N. Then the joint distribution of the targets t = {t_1, ..., t_N} is p(t | X, w, β) = N(t | Φw, β^{-1} I). We treat the linear weights as Gaussian random variables with mean m_0 and covariance matrix S_0: p(w) = N(w | m_0, S_0).

76 Posterior distribution: the product of two Gaussians, p(w | t, X, β) ∝ p(t | X, w, β) p(w) = N(t | Φw, β^{-1} I) N(w | m_0, S_0). (C. M. Bishop, Pattern Recognition and Machine Learning, page 689.)

77 Substituting, we obtain the general form of the posterior distribution: p(w | t, X, β) = N(w | m_N, S_N), where S_N^{-1} = S_0^{-1} + β Φ^T Φ and m_N = S_N (S_0^{-1} m_0 + β Φ^T t). If m_0 = 0 and S_0 = α^{-1} I, then S_N^{-1} = α I + β Φ^T Φ and m_N = β S_N Φ^T t.

78 Predictive distribution. Given a new input x* we want to make a prediction of its target t*. This is equivalent to estimating the posterior predictive distribution, obtained by marginalizing over w: p(t* | x*, t, X) = ∫ p(t* | x*, w) p(w | t, X) dw, where p(t* | x*, w) = N(t* | w^T φ(x*), β^{-1}) and p(w | t, X) = N(w | m_N, S_N).

79 Predictive distribution (cont.). Using the properties of Gaussians (Bishop, eq. B.44) we obtain p(t* | x*, t, X) = N(t* | m_N^T φ(x*), σ_N²(x*)), where σ_N²(x*) = β^{-1} + φ(x*)^T S_N φ(x*).
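
A sketch of the Bayesian posterior and predictive equations above, assuming the zero-mean isotropic prior m_0 = 0, S_0 = α^{-1} I.

```python
import numpy as np

def bayesian_posterior(Phi, t, alpha, beta):
    """Posterior N(w | m_N, S_N) under a zero-mean isotropic Gaussian prior."""
    m = Phi.shape[1]
    S_N = np.linalg.inv(alpha * np.eye(m) + beta * Phi.T @ Phi)
    m_N = beta * S_N @ Phi.T @ t
    return m_N, S_N

def predictive(phi_star, m_N, S_N, beta):
    """Predictive mean and variance of t* for a new input with features phi_star."""
    mean = m_N @ phi_star
    var = 1.0 / beta + phi_star @ S_N @ phi_star
    return mean, var
```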
