Advanced Introduction to Machine Learning


1 Advanced Introduction to Machine Learning, 10-715, Fall 2014. Linear Regression and Sparsity. Eric Xing. Lecture, September 10, 2014. Reading: (see course page). Eric Xing @ CMU, 2014

2 Machine learning for apartment hunting. Now you've moved to Pittsburgh!! And you want to find the most reasonably priced apartment satisfying your needs: square footage, # of bedrooms, distance to campus, ... (Table: living area in ft² and # of bedrooms for several apartments, with rent in $; the last rows have unknown rents, marked "?", to be predicted.)

3 The learning problem. Features: living area, distance to campus, # bedrooms, ... Denote as x = [x1, x2, ..., xk]. Target: rent. Denoted as y. Training set: X is the matrix whose i-th row is the feature vector xi = [xi1, ..., xik] (living area, location, ...), and Y (or y) = [y1, ..., yn]ᵀ collects the corresponding rents.

4 Linear Regression. Assume that Y (target) is a linear function of X (features): e.g.: ŷ = θ0 + θ1 x1 + θ2 x2. Let's assume a vacuous "feature" x0 = 1 (this is the intercept term, why?), and define the feature vector to be x = [x0, x1, ..., xk]; then we have the following general representation of the linear function: ŷ(x) = θᵀx. Our goal is to pick the optimal θ. How! We seek the θ that minimizes the following cost function: J(θ) = (1/2) Σᵢ (ŷ(xᵢ) − yᵢ)².

5 The Least-Mean-Square (LMS) method. The cost function: J(θ) = (1/2) Σᵢ (xᵢᵀθ − yᵢ)². Consider a gradient descent algorithm: θⱼ(t+1) = θⱼ(t) − α ∂J(θ)/∂θⱼ.

6 The Least-Mean-Square (LMS) method. Now we have the following descent rule: θⱼ(t+1) = θⱼ(t) + α Σᵢ (yᵢ − xᵢᵀθ(t)) xᵢⱼ. For a single training point, we have: θⱼ(t+1) = θⱼ(t) + α (yᵢ − xᵢᵀθ(t)) xᵢⱼ. This is known as the LMS update rule, or the Widrow-Hoff learning rule. This is actually a "stochastic", "coordinate" descent algorithm. It can be used as an on-line algorithm.
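The single-point Widrow-Hoff update above can be sketched in a few lines of Python. This is an illustrative toy, not course code; the data, step size, and epoch count are made up for the example.

```python
# Stochastic LMS (Widrow-Hoff): theta_j <- theta_j + alpha * (y_i - theta . x_i) * x_ij
def lms_fit(xs, ys, alpha=0.05, epochs=500):
    theta = [0.0] * len(xs[0])
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            err = y - sum(t * xj for t, xj in zip(theta, x))  # y_i - theta^T x_i
            theta = [t + alpha * err * xj for t, xj in zip(theta, x)]
    return theta

# Toy data generated by y = 2*x, with an intercept feature x0 = 1
xs = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]]
ys = [2.0, 4.0, 6.0, 8.0]
theta = lms_fit(xs, ys)  # converges near [0, 2]
```

Each pass touches one point at a time, which is exactly what makes the rule usable on-line: a new observation can be folded in without revisiting the old ones.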

7 Geometry and Convergence of LMS. (Figure: LMS iterates at t = 1, 2, 3 zig-zagging toward the solution.) Claim: when the step size satisfies certain conditions, and when certain other technical conditions are satisfied, LMS will converge to an "optimal region".

8 Steepest Descent and LMS. Steepest descent: note that ∇θ J = [∂J/∂θ1, ..., ∂J/∂θk]ᵀ = −Σᵢ (yᵢ − xᵢᵀθ) xᵢ. This gives a batch gradient descent algorithm: θ(t+1) = θ(t) + α Σᵢ (yᵢ − xᵢᵀθ(t)) xᵢ.

9 The normal equations. Write the cost function in matrix form: J(θ) = (1/2)(Xθ − y)ᵀ(Xθ − y). To minimize J(θ), take the derivative and set it to zero (using trace identities for matrix derivatives): ∇θ J = XᵀXθ − Xᵀy = 0, which gives the normal equations XᵀXθ = Xᵀy, with solution θ* = (XᵀX)⁻¹Xᵀy.
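For a two-column X (intercept plus one feature), the normal equations are a 2×2 system and can be solved directly, e.g. by Cramer's rule. A minimal sketch on made-up data:

```python
# Solve the normal equations X^T X theta = X^T y for a 2-column X
# (intercept column plus one feature) using Cramer's rule.
def normal_equations_2d(xs, ys):
    s00 = sum(x[0] * x[0] for x in xs)          # entries of X^T X
    s01 = sum(x[0] * x[1] for x in xs)
    s11 = sum(x[1] * x[1] for x in xs)
    b0 = sum(x[0] * y for x, y in zip(xs, ys))  # entries of X^T y
    b1 = sum(x[1] * y for x, y in zip(xs, ys))
    det = s00 * s11 - s01 * s01                 # nonzero when X has full column rank
    return [(b0 * s11 - s01 * b1) / det, (s00 * b1 - s01 * b0) / det]

xs = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]]
ys = [2.0, 4.0, 6.0, 8.0]                       # exactly y = 2*x
theta = normal_equations_2d(xs, ys)             # -> [0.0, 2.0]
```

In practice one would use a numerically stable solver (QR, as the next slides note) rather than an explicit inverse, but the single-shot character of the method is the same.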

10 Comments on the normal equation. In most situations of practical interest, the number of data points n is larger than the dimensionality k of the input space and the matrix X is of full column rank. If this condition holds, then it is easy to verify that XᵀX is necessarily invertible. The assumption that XᵀX is invertible implies that it is positive definite, thus the critical point we have found is a minimum. What if X has less than full column rank? Regularization (later).

11 Direct and Iterative methods. Direct methods: we can achieve the solution in a single step by solving the normal equation. Using Gaussian elimination or QR decomposition, we converge in a finite number of steps. It can be infeasible when data are streaming in in real time, or are of very large amount. Iterative methods: stochastic or steepest gradient descent. Converging in a limiting sense. But more attractive in large practical problems. Caution is needed for deciding the learning rate α.

12 Convergence rate. Theorem: the steepest descent algorithm converges to the minimum of the cost characterized by the normal equations, provided the step size satisfies a condition involving the eigenvalues of XᵀX. A formal analysis of LMS needs more math muscle; in practice, one can use a small α, or gradually decrease it.

13 A Summary. LMS update rule: θⱼ(t+1) = θⱼ(t) + α (yᵢ − xᵢᵀθ(t)) xᵢⱼ. Pros: on-line, low per-step cost, fast convergence, and perhaps less prone to local optima. Cons: convergence to the optimum not always guaranteed. Steepest descent: θ(t+1) = θ(t) + α Σᵢ (yᵢ − xᵢᵀθ(t)) xᵢ. Pros: easy to implement, conceptually clean, guaranteed convergence. Cons: batch, often slow converging. Normal equations: θ* = (XᵀX)⁻¹Xᵀy. Pros: a single-shot algorithm! Easiest to implement. Cons: need to compute the pseudo-inverse (XᵀX)⁻¹, expensive, numerical issues (e.g., matrix is singular...), although there are ways to get around this.

14 Geometric Interpretation of LMS. The predictions on the training data are: ŷ = Xθ* = X(XᵀX)⁻¹Xᵀy. Note that Xᵀ(ŷ − y) = Xᵀ(X(XᵀX)⁻¹Xᵀ − I)y = 0, so ŷ is the orthogonal projection of y onto the space spanned by the columns of X!!
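The projection property is easy to check numerically: after a least-squares fit, the residual vector is orthogonal to every column of X. A minimal sketch on made-up noisy data:

```python
# Check the projection property: the residual y - X theta* is orthogonal
# to every column of X (2-column toy design, Cramer's rule solve).
def fit2(xs, ys):
    s00 = sum(x[0]*x[0] for x in xs); s01 = sum(x[0]*x[1] for x in xs)
    s11 = sum(x[1]*x[1] for x in xs)
    b0 = sum(x[0]*y for x, y in zip(xs, ys))
    b1 = sum(x[1]*y for x, y in zip(xs, ys))
    det = s00*s11 - s01*s01
    return [(b0*s11 - s01*b1)/det, (s00*b1 - s01*b0)/det]

xs = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]]
ys = [2.1, 3.9, 6.2, 7.8]                        # noisy line
t = fit2(xs, ys)
resid = [y - (t[0]*x[0] + t[1]*x[1]) for x, y in zip(xs, ys)]
dot0 = sum(r * x[0] for r, x in zip(resid, xs))  # X^T (y - X theta), column 0
dot1 = sum(r * x[1] for r, x in zip(resid, xs))  # column 1
# both dot products vanish up to rounding
```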

15 Probabilistic Interpretation of LMS. Let us assume that the target variable and the inputs are related by the equation: yᵢ = θᵀxᵢ + εᵢ, where ε is an error term of unmodeled effects or random noise. Now assume that ε follows a Gaussian N(0, σ²); then we have: p(yᵢ | xᵢ; θ) = (1/(√(2π)σ)) exp(−(yᵢ − θᵀxᵢ)²/(2σ²)). By the independence assumption: L(θ) = Πᵢ p(yᵢ | xᵢ; θ) = (1/(√(2π)σ))ⁿ exp(−Σᵢ (yᵢ − θᵀxᵢ)²/(2σ²)).

16 Probabilistic Interpretation of LMS, cont. Hence the log-likelihood is: ℓ(θ) = n log(1/(√(2π)σ)) − (1/σ²)·(1/2)Σᵢ(yᵢ − θᵀxᵢ)². Do you recognize the last term? Yes, it is J(θ). Thus, under the independence assumption, LMS is equivalent to MLE of θ!

17 Case study: predicting gene expression. The genetic picture: causal SNPs (in a sequence like CGCACGACAA...) determine a univariate phenotype, i.e., the expression intensity of a gene.

18 Association Mapping as Regression. (Table: individuals 1...N, each with a phenotype value (BMI) and a genotype string such as ..CC..TT..AA..; most SNPs are benign, one is the causal SNP.)

19 Association Mapping as Regression. Phenotype (BMI) yᵢ and genotype xᵢ: yᵢ = Σⱼ₌₁..J xᵢⱼ βⱼ. SNPs with large βⱼ are relevant.

20 Experimental setup. Asthma dataset: 543 individuals, genotyped at 34 SNPs. Diploid data was transformed to 0/1 for homozygotes, or 2 for heterozygotes. X = 543 × 34 matrix. Y = phenotype variable (continuous). A single phenotype was used for regression. Implementation details. Iterative methods: batch update and online update implemented. For both methods, step size α is chosen to be a small fixed value (10⁻⁶). This choice is based on the data used for the experiments. Both methods are only run to a maximum of 2000 epochs, or until the change in training MSE is less than 10⁻⁴.

21 Convergence Curves. For the batch method, the training MSE is initially large due to uninformed initialization. In the online update, the many updates within every epoch reduce the MSE to a much smaller value.

22 The Learned Coefficients. (Figure: the learned regression coefficients for each SNP.)

23 Multivariate Regression for Trait Association Analysis. Trait y, genotype X (rows like G A A C C A G A A G A ...), association strength β: y = Xβ, β = ?

24 Multivariate Regression for Trait Association Analysis. y = Xβ. Many non-zero associations: which SNPs are truly significant?

25 Sparsity. One common assumption to make: sparsity. Makes biological sense: each phenotype is likely to be associated with a small number of SNPs, rather than all the SNPs. Makes statistical sense: learning is now feasible in high dimensions with small sample size.

26 Sparsity: In a mathematical sense. Consider the least squares linear regression problem with a constraint on the number of non-zero coefficients: minimize Σᵢ (yᵢ − xᵢᵀβ)² subject to most entries of β being zero (an ℓ0 constraint). Sparsity means most of the betas are zero. But this is not convex!!! Many local optima, computationally intractable.

27 L1 Regularization (LASSO) (Tibshirani, 1996). A convex relaxation. Constrained form: minimize Σᵢ (yᵢ − xᵢᵀβ)² subject to ‖β‖₁ ≤ t. Lagrangian form: minimize Σᵢ (yᵢ − xᵢᵀβ)² + λ‖β‖₁. Still enforces sparsity!
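The Lagrangian form can be solved by cyclic coordinate descent with soft-thresholding. This solver is not discussed in the slides; it is a standard algorithm, sketched here on made-up toy data to show the sparsity effect of λ.

```python
# Coordinate descent for the Lagrangian lasso
#   min_beta  (1/2) * sum_i (y_i - x_i . beta)^2 + lam * sum_j |beta_j|
# via the soft-thresholding update (a standard solver; not from the slides).
def soft_threshold(rho, lam):
    if rho > lam:
        return rho - lam
    if rho < -lam:
        return rho + lam
    return 0.0

def lasso_cd(X, y, lam, iters=200):
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # correlation of feature j with the partial residual (feature j held out)
            rho = sum(X[i][j] * (y[i] - sum(X[i][k] * beta[k] for k in range(p) if k != j))
                      for i in range(n))
            z = sum(X[i][j] ** 2 for i in range(n))
            beta[j] = soft_threshold(rho, lam) / z
    return beta

X = [[1.0, 0.1], [2.0, -0.2], [3.0, 0.3], [4.0, -0.1]]
y = [2.0, 4.0, 6.0, 8.0]           # depends only on the first feature
b_small = lasso_cd(X, y, lam=0.1)   # near the least-squares fit, 2nd coeff zeroed
b_big = lasso_cd(X, y, lam=1000.0)  # penalty dominates: all-zero solution
```

Raising λ drives coefficients exactly to zero rather than merely shrinking them, which is the practical content of "still enforces sparsity".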

28 Theoretical Guarantees. Assumptions. Dependency condition: relevant covariates are not overly dependent. Incoherence condition: the large number of irrelevant covariates cannot be too correlated with the relevant covariates. Strong concentration bounds: sample quantities converge to their expected values quickly. If these assumptions are met, LASSO will asymptotically recover the correct subset of covariates that are relevant.

29 Consistent Structure Recovery [Zhao and Yu 2006]. (Figure.)

30 Lasso for Reducing False Positives. y = Xβ with a lasso penalty for sparsity: minimize Σᵢ (yᵢ − xᵢᵀβ)² + λ Σⱼ |βⱼ|. Many zero associations (sparse results), but what if there are multiple related traits?

31 Ridge Regression vs Lasso. Ridge regression: minimize J(β) subject to a constraint on the ℓ2 norm of β. Lasso: minimize J(β) subject to a constraint on the ℓ1 norm of β. (Figure: level sets of J(β) meeting the ℓ2 ball (a circle) versus the ℓ1 ball (a diamond); the corners of the diamond make the lasso solution land on the axes.) The lasso (ℓ1) penalty results in sparse solutions, i.e. vectors with more zero coordinates. Good for high-dimensional problems: don't have to store all coordinates!

32 Bayesian Interpretation. Treat the distribution parameters θ also as a random variable. The a posteriori distribution of θ after seeing the data is: p(θ|D) = p(D|θ)p(θ)/p(D) = p(D|θ)p(θ)/∫p(D|θ)p(θ)dθ. This is Bayes' rule: posterior = likelihood × prior / marginal likelihood. The prior p(θ) encodes our prior knowledge about the domain.

33 Regularized Least Squares and MAP. What if XᵀX is not invertible? Maximize log likelihood + log prior. I) Gaussian prior: β ~ N(0, τ²I) gives ridge regression: β̂ = argmin Σᵢ (yᵢ − xᵢᵀβ)² + λ‖β‖₂². Closed form: HW. The prior belief that β is Gaussian with zero mean biases the solution towards small β.
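The shrinkage effect is easy to see numerically. The closed form itself is left as homework in the slides; the sketch below just solves the penalized 2×2 normal equations (XᵀX + λI)β = Xᵀy on made-up data (for simplicity the intercept is penalized too, which a careful implementation would avoid).

```python
# Ridge regression for a 2-column X: solve (X^T X + lam*I) beta = X^T y.
# Illustrative sketch with made-up data; derivation of the formula is the HW.
def ridge_2d(xs, ys, lam):
    s00 = sum(x[0]*x[0] for x in xs) + lam
    s01 = sum(x[0]*x[1] for x in xs)
    s11 = sum(x[1]*x[1] for x in xs) + lam
    b0 = sum(x[0]*y for x, y in zip(xs, ys))
    b1 = sum(x[1]*y for x, y in zip(xs, ys))
    det = s00*s11 - s01*s01   # > 0 for lam > 0: regularization fixes singularity
    return [(b0*s11 - s01*b1)/det, (s00*b1 - s01*b0)/det]

xs = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]]
ys = [2.0, 4.0, 6.0, 8.0]
b_ols = ridge_2d(xs, ys, 1e-9)   # lam ~ 0: essentially ordinary least squares
b_reg = ridge_2d(xs, ys, 10.0)   # larger lam shrinks the coefficients toward 0
```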

34 Regularized Least Squares and MAP. What if XᵀX is not invertible? Maximize log likelihood + log prior. II) Laplace prior: β ~ Laplace(0, t) gives the lasso: β̂ = argmin Σᵢ (yᵢ − xᵢᵀβ)² + λ‖β‖₁. Closed form: HW. The prior belief that β is Laplace with zero mean biases the solution towards small β.

35 Take-home message. Gradient descent: on-line and batch. Normal equations. Geometric interpretation of LMS. Probabilistic interpretation of LMS, and equivalence of LMS and MLE under a certain assumption (what?). Sparsity: Approach: ridge vs. lasso regression. Interpretation: regularized regression versus Bayesian regression. Algorithm: convex optimization (we did not discuss this). LR does not mean fitting linear relations, but a linear combination of basis functions (that can be non-linear). Weighting points by importance versus by fitness.

36 After-class material.

37 Advanced Material: Beyond basic LR. LR with non-linear basis functions. Locally weighted linear regression. Regression trees and multilinear interpolation. We will discuss this next class after we set the stage right! (if we've got time)

38 LR with non-linear basis functions. LR does not mean we can only deal with linear relationships. We are free to design (non-linear) features under LR: y = θ0 + Σⱼ₌₁..m θⱼ φⱼ(x) = θᵀφ(x), where the φⱼ(x) are fixed basis functions (and we define φ0(x) = 1). Example: polynomial regression: φ(x) := [1, x, x², x³]. We will be concerned with estimating (distributions over) the weights θ and choosing the model order M.

39 Basis functions. There are many basis functions, e.g.: Polynomial: φⱼ(x) = x^(j−1). Radial basis functions: φⱼ(x) = exp(−(x − μⱼ)²/(2s²)). Sigmoidal: φⱼ(x) = σ((x − μⱼ)/s). Splines, Fourier, wavelets, etc.
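A basis expansion reduces non-linear fitting back to ordinary least squares on the expanded design matrix Φ. A minimal sketch with a polynomial basis and a tiny Gaussian-elimination solver (all data made up):

```python
# Polynomial basis expansion phi(x) = [1, x, x^2], then ordinary least
# squares on the expanded design matrix Phi (normal equations Phi^T Phi theta = Phi^T y).
def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))  # partial pivoting
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * v for a, v in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def polyfit(xs, ys, degree):
    Phi = [[x ** j for j in range(degree + 1)] for x in xs]
    p = degree + 1
    A = [[sum(r[i] * r[j] for r in Phi) for j in range(p)] for i in range(p)]  # Phi^T Phi
    b = [sum(r[i] * y for r, y in zip(Phi, ys)) for i in range(p)]             # Phi^T y
    return solve(A, b)

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [x * x for x in xs]            # data generated by y = x^2
theta = polyfit(xs, ys, degree=2)   # recovers [0, 0, 1] up to rounding
```

The same pattern works for RBF or sigmoidal bases: only the construction of Phi changes, the solve is identical.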

40 1D and 2D RBFs. (Figures: a 1D RBF basis, the fit after training, and a 2D RBF.)

41 Good and Bad RBFs. (Figures: a good 2D RBF; two bad 2D RBFs.)

42 Overfitting and underfitting. (Figures: fits of increasing model order, from y = θ0 + θ1x, to a quadratic, to y = Σⱼ₌₀..9 θⱼxʲ.)

43 Bias and variance. We define the bias of a model to be the expected generalization error even if we were to fit it to a very (say, infinitely) large training set. By fitting "spurious" patterns in the training set, we might again obtain a model with large generalization error. In this case, we say the model has large variance.

44 Locally weighted linear regression. The algorithm: instead of minimizing J(θ) = (1/2)Σᵢ(xᵢᵀθ − yᵢ)², now we fit θ to minimize J(θ) = (1/2)Σᵢ wᵢ(xᵢᵀθ − yᵢ)². Where do the wᵢ's come from? wᵢ = exp(−(xᵢ − x)²/(2τ²)), where x is the query point for which we'd like to know its corresponding y. Essentially we put higher weights on (errors on) training examples that are close to the query point than those that are further away from the query.
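The weighted fit above can be sketched directly: compute Gaussian weights around the query point, then solve the weighted normal equations. Toy univariate data and the bandwidth τ are made up for the example.

```python
import math

# Locally weighted linear regression: fit theta0 + theta1*x at a query
# point x_q with Gaussian weights w_i = exp(-(x_i - x_q)^2 / (2*tau^2)).
def lwr_predict(xs, ys, x_q, tau=1.0):
    w = [math.exp(-(x - x_q) ** 2 / (2 * tau * tau)) for x in xs]
    # weighted normal equations for [theta0, theta1]
    sw = sum(w)
    swx = sum(wi * x for wi, x in zip(w, xs))
    swxx = sum(wi * x * x for wi, x in zip(w, xs))
    swy = sum(wi * y for wi, y in zip(w, ys))
    swxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
    det = sw * swxx - swx * swx
    t0 = (swy * swxx - swx * swxy) / det
    t1 = (sw * swxy - swx * swy) / det
    return t0 + t1 * x_q

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0, 4.0, 6.0, 8.0, 10.0]   # exactly y = 2x, so any weighting recovers it
pred = lwr_predict(xs, ys, 2.5)   # -> 5.0 up to rounding
```

Note that the whole fit is redone per query, which is exactly why the next slide classifies this method as non-parametric.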

45 Parametric vs. non-parametric. Locally weighted linear regression is the second example we are running into of a non-parametric algorithm. (What is the first?) The unweighted linear regression algorithm that we saw earlier is known as a parametric learning algorithm, because it has a fixed, finite number of parameters (the θ), which are fit to the data. Once we've fit the θ and stored them away, we no longer need to keep the training data around to make future predictions. In contrast, to make predictions using locally weighted linear regression, we need to keep the entire training set around. The term "non-parametric" (roughly) refers to the fact that the amount of stuff we need to keep in order to represent the hypothesis grows linearly with the size of the training set.

46 Robust Regression. (Figures: the best fit from a quadratic regression; a fit that ignores an outlying point.) But this is probably better. How can we do this?

47 LOESS-based Robust Regression. Remember what we did in "locally weighted linear regression"? We "score" each point for its importance. Now we score each point according to its "fitness". (Courtesy of Andrew Moore.)

48 Robust regression. For k = 1 to R: Let (xₖ, yₖ) be the k-th datapoint. Let yₖᵉˢᵗ be the predicted value of yₖ. Let wₖ be a weight for data point k that is large if the data point fits well and small if it fits badly: wₖ = φ(yₖ − yₖᵉˢᵗ), e.g. a Gaussian score of the residual. Then redo the regression using weighted data points. Repeat the whole thing until it converges!
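The reweight-and-refit loop above can be sketched as follows. The Gaussian residual score, its scale s, and the iteration count are made-up choices for this sketch; the toy data contain one deliberate outlier.

```python
import math

# Iteratively reweighted fit: weight each point by a Gaussian score of its
# residual, then redo a weighted straight-line fit, and repeat.
def weighted_fit(xs, ys, w):
    sw = sum(w)
    swx = sum(wi*x for wi, x in zip(w, xs))
    swxx = sum(wi*x*x for wi, x in zip(w, xs))
    swy = sum(wi*y for wi, y in zip(w, ys))
    swxy = sum(wi*x*y for wi, x, y in zip(w, xs, ys))
    det = sw*swxx - swx*swx
    return [(swy*swxx - swx*swxy)/det, (sw*swxy - swx*swy)/det]

def robust_fit(xs, ys, s=5.0, iters=10):
    w = [1.0] * len(xs)                # first pass = ordinary least squares
    for _ in range(iters):
        t = weighted_fit(xs, ys, w)
        resid = [y - (t[0] + t[1]*x) for x, y in zip(xs, ys)]
        w = [math.exp(-r*r / (2*s*s)) for r in resid]  # small weight = bad fit
    return weighted_fit(xs, ys, w)

xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0]
ys = [2.0, 4.0, 6.0, 8.0, 10.0, 12.0, 14.0, 16.0, 50.0]  # last point is an outlier
t = robust_fit(xs, ys)  # slope pulled back near 2 despite the outlier
```

A plain least-squares fit on these data has a badly inflated slope; after a few reweighting rounds the outlier's weight collapses toward zero and the fit tracks the inliers.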

49 Robust regression, probabilistic interpretation. What regular regression does: assume yₖ was originally generated using the following recipe: yₖ = θᵀxₖ + N(0, σ²). The computational task is to find the maximum likelihood estimate of θ.

50 Robust regression, probabilistic interpretation. What LOESS robust regression does: assume yₖ was originally generated using the following recipe: with probability p: yₖ = θᵀxₖ + N(0, σ²); but otherwise yₖ ~ N(µ, σ²_huge). The computational task is to find the maximum likelihood estimates of θ, p, µ and σ_huge. The algorithm you saw (with iterative reweighting/refitting) does this computation for us. Later you will find that it is an instance of the famous E.M. algorithm.

51 Regression Tree. Decision tree for regression. (Table of attributes per person: Gender, Rich?, Num. Children, # travel per yr., Age.) Split on Gender?: Female, predicted age = 39; Male, predicted age = 36.

52 A conceptual picture. Assuming regular regression trees, can you sketch a graph of the fitted function over this diagram?

53 How about this one? Multilinear Interpolation. We want to create a continuous and piecewise linear fit to the data.

54 Take-home message. (Recap of slide 35: gradient descent, on-line and batch; normal equations; geometric and probabilistic interpretations of LMS and their equivalence to MLE; sparsity via ridge vs. lasso; regularized vs. Bayesian regression; LR as linear combinations of possibly non-linear basis functions; weighting points by importance versus by fitness.)

55 Appendix.

56 Parameter Learning from iid Data. Goal: estimate distribution parameters θ from a dataset of N independent, identically distributed (iid), fully observed training cases D = {x1, ..., xN}. Maximum likelihood estimation (MLE): 1. One of the most common estimators. 2. With the iid and full-observability assumption, write L(θ) as the likelihood of the data: L(θ) = P(x1, x2, ..., xN; θ) = Πᵢ P(xᵢ; θ). 3. Pick the setting of parameters most likely to have generated the data we saw: θ* = argmax_θ L(θ) = argmax_θ log L(θ).

57 Example 1: Bernoulli model. Data: we observed N iid coin tossings: D = {1, 0, 1, ..., 0}. Representation: binary r.v. x ∈ {0, 1}. Model: P(x) = θˣ(1 − θ)^(1−x), i.e. P(x=1) = θ and P(x=0) = 1 − θ. How to write the likelihood of a single observation xᵢ? P(xᵢ) = θ^(xᵢ)(1 − θ)^(1−xᵢ). The likelihood of the dataset D = {x1, ..., xN}: P(x1, ..., xN | θ) = Πᵢ θ^(xᵢ)(1 − θ)^(1−xᵢ) = θ^(#heads)(1 − θ)^(#tails).

58 Maximum Likelihood Estimation. Objective function: ℓ(θ; D) = log P(D|θ) = log θ^(nh)(1 − θ)^(nt) = nh log θ + nt log(1 − θ). We need to maximize this w.r.t. θ. Take derivatives w.r.t. θ: ∂ℓ/∂θ = nh/θ − nt/(1 − θ) = 0, so θ_MLE = nh/(nh + nt) = nh/N = (1/N)Σᵢxᵢ: frequency as sample mean. Sufficient statistics: the counts nh and nt, where nh = Σᵢxᵢ, are sufficient statistics of data D.

59 Overfitting. Recall that for the Bernoulli distribution, we have θ_ML = n_head/(n_head + n_tail). What if we tossed too few times, so that we saw zero heads? We have θ_ML = 0, and we will predict that the probability of seeing a head next is zero!!! The rescue: "smoothing": θ_ML' = (n_head + n′)/(n_head + n_tail + 2n′), where n′ is known as the pseudo- (imaginary) count. But can we make this more formal?
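The MLE and the pseudo-count fix above are one-liners; the toy coin data are made up for the example.

```python
# Bernoulli MLE: theta_hat = (#heads) / N, plus the pseudo-count "smoothing"
# rescue for the zero-heads case described above.
def mle(tosses):
    return sum(tosses) / len(tosses)

def smoothed(tosses, n_prime=1):
    # add n_prime imaginary heads and n_prime imaginary tails
    return (sum(tosses) + n_prime) / (len(tosses) + 2 * n_prime)

theta_mle = mle([0, 0, 0])          # zero heads observed -> estimate 0.0
theta_smooth = smoothed([0, 0, 0])  # -> 1/5 = 0.2, never exactly zero
```

The next slides make the pseudo-counts formal as the hyperparameters of a Beta prior.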

60 Bayesian Parameter Estimation. Treat the distribution parameters θ also as a random variable. The a posteriori distribution of θ after seeing the data is: p(θ|D) = p(D|θ)p(θ)/p(D) = p(D|θ)p(θ)/∫p(D|θ)p(θ)dθ. This is Bayes' rule: posterior = likelihood × prior / marginal likelihood. The prior p(θ) encodes our prior knowledge about the domain.

61 Frequentist Parameter Estimation. Two people with different priors p(θ) will end up with different estimates p(θ|D). Frequentists dislike this subjectivity. Frequentists think of the parameter as a fixed, unknown constant, not a random variable. Hence they have to come up with different "objective" estimators (ways of computing θ from data), instead of using Bayes' rule. These estimators have different properties, such as being unbiased, minimum variance, etc. The maximum likelihood estimator is one such estimator.

62 Discussion. θ or p(θ): that is the problem! Bayesians know it.

63 Bayesian estimation for Bernoulli. Beta distribution: P(θ; α, β) = [Γ(α+β)/(Γ(α)Γ(β))] θ^(α−1)(1−θ)^(β−1) = B(α, β) θ^(α−1)(1−θ)^(β−1). (When x is discrete, Γ(x+1) = x!.) Posterior distribution of θ: p(θ | x1, ..., xN) = p(x1, ..., xN | θ)p(θ)/p(x1, ..., xN) ∝ θ^(nh)(1−θ)^(nt) × θ^(α−1)(1−θ)^(β−1) = θ^(nh+α−1)(1−θ)^(nt+β−1). Notice the isomorphism of the posterior to the prior: such a prior is called a conjugate prior. α and β are hyperparameters (parameters of the prior) and correspond to the number of "virtual" heads/tails (pseudo counts).

64 Bayesian estimation for Bernoulli, cont'd. Posterior distribution of θ: p(θ | x1, ..., xN) ∝ θ^(nh+α−1)(1−θ)^(nt+β−1). Maximum a posteriori (MAP) estimation: θ_MAP = argmax_θ log P(θ | x1, ..., xN). Posterior mean estimation: θ_Bayes = ∫ θ p(θ|D) dθ = C ∫ θ·θ^(nh+α−1)(1−θ)^(nt+β−1) dθ = (nh + α)/(N + α + β). Prior strength: A = α + β. A can be interpreted as the size of an imaginary data set from which we obtain the pseudo-counts. Beta parameters can be understood as pseudo-counts.
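The conjugate update is pure arithmetic, so it is worth writing out. A minimal sketch (counts and hyperparameters are made-up examples):

```python
# Beta-Bernoulli conjugacy: prior Beta(a, b) plus counts (nh, nt) gives
# posterior Beta(nh + a, nt + b); compare posterior-mean and MAP estimates.
def posterior_mean(nh, nt, a, b):
    return (nh + a) / (nh + nt + a + b)

def posterior_map(nh, nt, a, b):
    # posterior mode; well-defined when nh + a > 1 and nt + b > 1
    return (nh + a - 1) / (nh + nt + a + b - 2)

nh, nt = 2, 8
mean_weak = posterior_mean(nh, nt, 1, 1)      # -> 0.25
map_uniform = posterior_map(nh, nt, 1, 1)     # uniform prior: MAP equals the MLE, 0.2
mean_strong = posterior_mean(nh, nt, 10, 10)  # -> 0.4, pulled toward the prior
```

With a uniform Beta(1, 1) prior the MAP estimate reduces to the MLE, while the posterior mean already behaves like the smoothed estimate from the overfitting slide.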

65 Effect of Prior Strength. Suppose we have a uniform prior (α = β = 1), and we observe nh = 2, nt = 8. Weak prior, A = 2. Posterior prediction: p(x = h | nh = 2, nt = 8) = (2 + 1)/(10 + 2) = 0.25. Strong prior, A = 20. Posterior prediction: p(x = h | nh = 2, nt = 8) = (2 + 10)/(10 + 20) = 0.40. However, if we have enough data, it washes away the prior: e.g., nh = 200, nt = 800. Then the estimates under the weak and strong priors are (200 + 1)/(1000 + 2) and (200 + 10)/(1000 + 20), respectively, both of which are close to 0.2.

66 Example 2: Gaussian density. Data: we observed N iid real samples: D = {−0.1, 10, 1, −5.2, ..., 3}. Model: P(x) = (2πσ²)^(−1/2) exp(−(x − µ)²/(2σ²)). Log likelihood: ℓ(θ; D) = log P(D|θ) = −(N/2) log(2πσ²) − (1/(2σ²)) Σᵢ (xᵢ − µ)². MLE: take derivatives and set to zero: ∂ℓ/∂µ = (1/σ²) Σᵢ (xᵢ − µ) = 0 gives µ_MLE = (1/N) Σᵢ xᵢ; ∂ℓ/∂σ² = −N/(2σ²) + (1/(2σ⁴)) Σᵢ (xᵢ − µ)² = 0 gives σ²_MLE = (1/N) Σᵢ (xᵢ − µ_MLE)².

67 MLE for a multivariate Gaussian. It can be shown that the MLE for µ and Σ is: µ_ML = (1/N) Σₙ xₙ, and Σ_ML = (1/N) Σₙ (xₙ − µ_ML)(xₙ − µ_ML)ᵀ = (1/N) S, where S is the scatter matrix S = Σₙ (xₙ − µ_ML)(xₙ − µ_ML)ᵀ. The sufficient statistics are Σₙ xₙ and Σₙ xₙxₙᵀ. Note that XᵀX = Σₙ xₙxₙᵀ may not be of full rank (e.g. if N < D), in which case Σ_ML is not invertible.

68 Bayesian estimation. Normal prior: P(µ) = (2πσ₀²)^(−1/2) exp(−(µ − µ₀)²/(2σ₀²)). Joint probability: P(x, µ) = (2πσ²)^(−N/2) exp(−(1/(2σ²)) Σᵢ (xᵢ − µ)²) × (2πσ₀²)^(−1/2) exp(−(µ − µ₀)²/(2σ₀²)). Posterior: P(µ | x) = (2πσ̃²)^(−1/2) exp(−(µ − µ̃)²/(2σ̃²)), where µ̃ = ((N/σ²)x̄ + µ₀/σ₀²)/(N/σ² + 1/σ₀²) and σ̃² = (N/σ² + 1/σ₀²)⁻¹, with x̄ the sample mean.

69 Bayesian estimation: unknown µ, known σ. The posterior mean is a convex combination of the prior mean and the MLE, with weights proportional to the relative noise levels. The precision of the posterior, 1/σ̃², is the precision of the prior, 1/σ₀², plus one contribution of data precision 1/σ² for each observed data point. Sequentially updating the mean: µ* = 0.8 (unknown), σ² = 0.1 (known). (Figure: effect of a single data point.) Uninformative (vague/flat) prior: σ₀² → ∞.
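The "precisions add, means mix" rule above is a two-line computation. A minimal sketch with made-up samples and prior settings:

```python
# Posterior for the mean of a Gaussian with known sigma^2: the posterior
# precision is the prior precision plus n data precisions, and the posterior
# mean is a precision-weighted mix of the prior mean and the sample mean.
def posterior_mu(xs, sigma2, mu0, sigma0_2):
    n = len(xs)
    xbar = sum(xs) / n
    prec = n / sigma2 + 1.0 / sigma0_2          # 1 / sigma_tilde^2
    mu = ((n / sigma2) * xbar + mu0 / sigma0_2) / prec
    return mu, 1.0 / prec

xs = [0.9, 1.1, 1.0, 0.8, 1.2]                  # made-up samples, xbar = 1.0
mu, var = posterior_mu(xs, sigma2=0.1, mu0=0.0, sigma0_2=1.0)
# mu lies between the prior mean 0 and the sample mean 1, close to 1
# because five low-noise observations carry far more precision than the prior
```

Letting sigma0_2 grow large recovers the flat-prior case, where the posterior mean collapses to the sample mean.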


More information

Ordinary Least Squares Regression. Simple Regression. Algebra and Assumptions.

Ordinary Least Squares Regression. Simple Regression. Algebra and Assumptions. Ordary Least Squares egresso. Smple egresso. Algebra ad Assumptos. I ths part of the course we are gog to study a techque for aalysg the lear relatoshp betwee two varables Y ad X. We have pars of observatos

More information

QR Factorization and Singular Value Decomposition COS 323

QR Factorization and Singular Value Decomposition COS 323 QR Factorzato ad Sgular Value Decomposto COS 33 Why Yet Aother Method? How do we solve least-squares wthout currg codto-squarg effect of ormal equatos (A T A A T b) whe A s sgular, fat, or otherwse poorly-specfed?

More information

ENGI 3423 Simple Linear Regression Page 12-01

ENGI 3423 Simple Linear Regression Page 12-01 ENGI 343 mple Lear Regresso Page - mple Lear Regresso ometmes a expermet s set up where the expermeter has cotrol over the values of oe or more varables X ad measures the resultg values of aother varable

More information

Lecture Notes Types of economic variables

Lecture Notes Types of economic variables Lecture Notes 3 1. Types of ecoomc varables () Cotuous varable takes o a cotuum the sample space, such as all pots o a le or all real umbers Example: GDP, Polluto cocetrato, etc. () Dscrete varables fte

More information

Chapter 2 Supplemental Text Material

Chapter 2 Supplemental Text Material -. Models for the Data ad the t-test Chapter upplemetal Text Materal The model preseted the text, equato (-3) s more properl called a meas model. ce the mea s a locato parameter, ths tpe of model s also

More information

Recall MLR 5 Homskedasticity error u has the same variance given any values of the explanatory variables Var(u x1,...,xk) = 2 or E(UU ) = 2 I

Recall MLR 5 Homskedasticity error u has the same variance given any values of the explanatory variables Var(u x1,...,xk) = 2 or E(UU ) = 2 I Chapter 8 Heterosedastcty Recall MLR 5 Homsedastcty error u has the same varace gve ay values of the eplaatory varables Varu,..., = or EUU = I Suppose other GM assumptos hold but have heterosedastcty.

More information

Chapter Two. An Introduction to Regression ( )

Chapter Two. An Introduction to Regression ( ) ubject: A Itroducto to Regresso Frst tage Chapter Two A Itroducto to Regresso (018-019) 1 pg. ubject: A Itroducto to Regresso Frst tage A Itroducto to Regresso Regresso aalss s a statstcal tool for the

More information

STATISTICAL PROPERTIES OF LEAST SQUARES ESTIMATORS. x, where. = y - ˆ " 1

STATISTICAL PROPERTIES OF LEAST SQUARES ESTIMATORS. x, where. = y - ˆ  1 STATISTICAL PROPERTIES OF LEAST SQUARES ESTIMATORS Recall Assumpto E(Y x) η 0 + η x (lear codtoal mea fucto) Data (x, y ), (x 2, y 2 ),, (x, y ) Least squares estmator ˆ E (Y x) ˆ " 0 + ˆ " x, where ˆ

More information

ENGI 4421 Joint Probability Distributions Page Joint Probability Distributions [Navidi sections 2.5 and 2.6; Devore sections

ENGI 4421 Joint Probability Distributions Page Joint Probability Distributions [Navidi sections 2.5 and 2.6; Devore sections ENGI 441 Jot Probablty Dstrbutos Page 7-01 Jot Probablty Dstrbutos [Navd sectos.5 ad.6; Devore sectos 5.1-5.] The jot probablty mass fucto of two dscrete radom quattes, s, P ad p x y x y The margal probablty

More information

Special Instructions / Useful Data

Special Instructions / Useful Data JAM 6 Set of all real umbers P A..d. B, p Posso Specal Istructos / Useful Data x,, :,,, x x Probablty of a evet A Idepedetly ad detcally dstrbuted Bomal dstrbuto wth parameters ad p Posso dstrbuto wth

More information

Simulation Output Analysis

Simulation Output Analysis Smulato Output Aalyss Summary Examples Parameter Estmato Sample Mea ad Varace Pot ad Iterval Estmato ermatg ad o-ermatg Smulato Mea Square Errors Example: Sgle Server Queueg System x(t) S 4 S 4 S 3 S 5

More information

Simple Linear Regression

Simple Linear Regression Correlato ad Smple Lear Regresso Berl Che Departmet of Computer Scece & Iformato Egeerg Natoal Tawa Normal Uversty Referece:. W. Navd. Statstcs for Egeerg ad Scetsts. Chapter 7 (7.-7.3) & Teachg Materal

More information

Chapter 14 Logistic Regression Models

Chapter 14 Logistic Regression Models Chapter 4 Logstc Regresso Models I the lear regresso model X β + ε, there are two types of varables explaatory varables X, X,, X k ad study varable y These varables ca be measured o a cotuous scale as

More information

ECON 482 / WH Hong The Simple Regression Model 1. Definition of the Simple Regression Model

ECON 482 / WH Hong The Simple Regression Model 1. Definition of the Simple Regression Model ECON 48 / WH Hog The Smple Regresso Model. Defto of the Smple Regresso Model Smple Regresso Model Expla varable y terms of varable x y = β + β x+ u y : depedet varable, explaed varable, respose varable,

More information

X X X E[ ] E X E X. is the ()m n where the ( i,)th. j element is the mean of the ( i,)th., then

X X X E[ ] E X E X. is the ()m n where the ( i,)th. j element is the mean of the ( i,)th., then Secto 5 Vectors of Radom Varables Whe workg wth several radom varables,,..., to arrage them vector form x, t s ofte coveet We ca the make use of matrx algebra to help us orgaze ad mapulate large umbers

More information

Simple Linear Regression and Correlation.

Simple Linear Regression and Correlation. Smple Lear Regresso ad Correlato. Correspods to Chapter 0 Tamhae ad Dulop Sldes prepared b Elzabeth Newto (MIT) wth some sldes b Jacquele Telford (Johs Hopks Uverst) Smple lear regresso aalss estmates

More information

Chapter 8. Inferences about More Than Two Population Central Values

Chapter 8. Inferences about More Than Two Population Central Values Chapter 8. Ifereces about More Tha Two Populato Cetral Values Case tudy: Effect of Tmg of the Treatmet of Port-We tas wth Lasers ) To vestgate whether treatmet at a youg age would yeld better results tha

More information

Section 2 Notes. Elizabeth Stone and Charles Wang. January 15, Expectation and Conditional Expectation of a Random Variable.

Section 2 Notes. Elizabeth Stone and Charles Wang. January 15, Expectation and Conditional Expectation of a Random Variable. Secto Notes Elzabeth Stoe ad Charles Wag Jauar 5, 9 Jot, Margal, ad Codtoal Probablt Useful Rules/Propertes. P ( x) P P ( x; ) or R f (x; ) d. P ( xj ) P (x; ) P ( ) 3. P ( x; ) P ( xj ) P ( ) 4. Baes

More information

1 Solution to Problem 6.40

1 Solution to Problem 6.40 1 Soluto to Problem 6.40 (a We wll wrte T τ (X 1,...,X where the X s are..d. wth PDF f(x µ, σ 1 ( x µ σ g, σ where the locato parameter µ s ay real umber ad the scale parameter σ s > 0. Lettg Z X µ σ we

More information

Regresso What s a Model? 1. Ofte Descrbe Relatoshp betwee Varables 2. Types - Determstc Models (o radomess) - Probablstc Models (wth radomess) EPI 809/Sprg 2008 9 Determstc Models 1. Hypothesze

More information

Lecture Note to Rice Chapter 8

Lecture Note to Rice Chapter 8 ECON 430 HG revsed Nov 06 Lecture Note to Rce Chapter 8 Radom matrces Let Y, =,,, m, =,,, be radom varables (r.v. s). The matrx Y Y Y Y Y Y Y Y Y Y = m m m s called a radom matrx ( wth a ot m-dmesoal dstrbuto,

More information

CSE 5526: Introduction to Neural Networks Linear Regression

CSE 5526: Introduction to Neural Networks Linear Regression CSE 556: Itroducto to Neural Netorks Lear Regresso Part II 1 Problem statemet Part II Problem statemet Part II 3 Lear regresso th oe varable Gve a set of N pars of data , appromate d by a lear fucto

More information

Point Estimation: definition of estimators

Point Estimation: definition of estimators Pot Estmato: defto of estmators Pot estmator: ay fucto W (X,..., X ) of a data sample. The exercse of pot estmato s to use partcular fuctos of the data order to estmate certa ukow populato parameters.

More information

Summary of the lecture in Biostatistics

Summary of the lecture in Biostatistics Summary of the lecture Bostatstcs Probablty Desty Fucto For a cotuos radom varable, a probablty desty fucto s a fucto such that: 0 dx a b) b a dx A probablty desty fucto provdes a smple descrpto of the

More information

Midterm Exam 1, section 1 (Solution) Thursday, February hour, 15 minutes

Midterm Exam 1, section 1 (Solution) Thursday, February hour, 15 minutes coometrcs, CON Sa Fracsco State Uversty Mchael Bar Sprg 5 Mdterm am, secto Soluto Thursday, February 6 hour, 5 mutes Name: Istructos. Ths s closed book, closed otes eam.. No calculators of ay kd are allowed..

More information

Bayesian Classification. CS690L Data Mining: Classification(2) Bayesian Theorem: Basics. Bayesian Theorem. Training dataset. Naïve Bayes Classifier

Bayesian Classification. CS690L Data Mining: Classification(2) Bayesian Theorem: Basics. Bayesian Theorem. Training dataset. Naïve Bayes Classifier Baa Classfcato CS6L Data Mg: Classfcato() Referece: J. Ha ad M. Kamber, Data Mg: Cocepts ad Techques robablstc learg: Calculate explct probabltes for hypothess, amog the most practcal approaches to certa

More information

6.867 Machine Learning

6.867 Machine Learning 6.867 Mache Learg Problem set Due Frday, September 9, rectato Please address all questos ad commets about ths problem set to 6.867-staff@a.mt.edu. You do ot eed to use MATLAB for ths problem set though

More information

An Introduction to. Support Vector Machine

An Introduction to. Support Vector Machine A Itroducto to Support Vector Mache Support Vector Mache (SVM) A classfer derved from statstcal learg theory by Vapk, et al. 99 SVM became famous whe, usg mages as put, t gave accuracy comparable to eural-etwork

More information

Generalized Linear Regression with Regularization

Generalized Linear Regression with Regularization Geeralze Lear Regresso wth Regularzato Zoya Bylsk March 3, 05 BASIC REGRESSION PROBLEM Note: I the followg otes I wll make explct what s a vector a what s a scalar usg vec t or otato, to avo cofuso betwee

More information

X ε ) = 0, or equivalently, lim

X ε ) = 0, or equivalently, lim Revew for the prevous lecture Cocepts: order statstcs Theorems: Dstrbutos of order statstcs Examples: How to get the dstrbuto of order statstcs Chapter 5 Propertes of a Radom Sample Secto 55 Covergece

More information

Dimensionality reduction Feature selection

Dimensionality reduction Feature selection CS 750 Mache Learg Lecture 3 Dmesoalty reducto Feature selecto Mlos Hauskrecht mlos@cs.ptt.edu 539 Seott Square CS 750 Mache Learg Dmesoalty reducto. Motvato. Classfcato problem eample: We have a put data

More information

Big Data Analytics. Data Fitting and Sampling. Acknowledgement: Notes by Profs. R. Szeliski, S. Seitz, S. Lazebnik, K. Chaturvedi, and S.

Big Data Analytics. Data Fitting and Sampling. Acknowledgement: Notes by Profs. R. Szeliski, S. Seitz, S. Lazebnik, K. Chaturvedi, and S. Bg Data Aaltcs Data Fttg ad Samplg Ackowledgemet: Notes b Profs. R. Szelsk, S. Setz, S. Lazebk, K. Chaturved, ad S. Shah Fttg: Cocepts ad recpes A bag of techques If we kow whch pots belog to the le, how

More information

TESTS BASED ON MAXIMUM LIKELIHOOD

TESTS BASED ON MAXIMUM LIKELIHOOD ESE 5 Toy E. Smth. The Basc Example. TESTS BASED ON MAXIMUM LIKELIHOOD To llustrate the propertes of maxmum lkelhood estmates ad tests, we cosder the smplest possble case of estmatg the mea of the ormal

More information

CLASS NOTES. for. PBAF 528: Quantitative Methods II SPRING Instructor: Jean Swanson. Daniel J. Evans School of Public Affairs

CLASS NOTES. for. PBAF 528: Quantitative Methods II SPRING Instructor: Jean Swanson. Daniel J. Evans School of Public Affairs CLASS NOTES for PBAF 58: Quattatve Methods II SPRING 005 Istructor: Jea Swaso Dael J. Evas School of Publc Affars Uversty of Washgto Ackowledgemet: The structor wshes to thak Rachel Klet, Assstat Professor,

More information

Sampling Theory MODULE V LECTURE - 14 RATIO AND PRODUCT METHODS OF ESTIMATION

Sampling Theory MODULE V LECTURE - 14 RATIO AND PRODUCT METHODS OF ESTIMATION Samplg Theor MODULE V LECTUE - 4 ATIO AND PODUCT METHODS OF ESTIMATION D. SHALABH DEPATMENT OF MATHEMATICS AND STATISTICS INDIAN INSTITUTE OF TECHNOLOG KANPU A mportat objectve a statstcal estmato procedure

More information

ECON 5360 Class Notes GMM

ECON 5360 Class Notes GMM ECON 560 Class Notes GMM Geeralzed Method of Momets (GMM) I beg by outlg the classcal method of momets techque (Fsher, 95) ad the proceed to geeralzed method of momets (Hase, 98).. radtoal Method of Momets

More information

UNIVERSITY OF OSLO DEPARTMENT OF ECONOMICS

UNIVERSITY OF OSLO DEPARTMENT OF ECONOMICS UNIVERSITY OF OSLO DEPARTMENT OF ECONOMICS Postpoed exam: ECON430 Statstcs Date of exam: Jauary 0, 0 Tme for exam: 09:00 a.m. :00 oo The problem set covers 5 pages Resources allowed: All wrtte ad prted

More information

Multiple Linear Regression Analysis

Multiple Linear Regression Analysis LINEA EGESSION ANALYSIS MODULE III Lecture - 4 Multple Lear egresso Aalyss Dr. Shalabh Departmet of Mathematcs ad Statstcs Ida Isttute of Techology Kapur Cofdece terval estmato The cofdece tervals multple

More information

Qualifying Exam Statistical Theory Problem Solutions August 2005

Qualifying Exam Statistical Theory Problem Solutions August 2005 Qualfyg Exam Statstcal Theory Problem Solutos August 5. Let X, X,..., X be d uform U(,),

More information

Statistics. Correlational. Dr. Ayman Eldeib. Simple Linear Regression and Correlation. SBE 304: Linear Regression & Correlation 1/3/2018

Statistics. Correlational. Dr. Ayman Eldeib. Simple Linear Regression and Correlation. SBE 304: Linear Regression & Correlation 1/3/2018 /3/08 Sstems & Bomedcal Egeerg Departmet SBE 304: Bo-Statstcs Smple Lear Regresso ad Correlato Dr. Ama Eldeb Fall 07 Descrptve Orgasg, summarsg & descrbg data Statstcs Correlatoal Relatoshps Iferetal Geeralsg

More information

CS 2750 Machine Learning Lecture 5. Density estimation. Density estimation

CS 2750 Machine Learning Lecture 5. Density estimation. Density estimation CS 750 Mache Learg Lecture 5 esty estmato Mlos Hausrecht mlos@tt.edu 539 Seott Square esty estmato esty estmato: s a usuervsed learg roblem Goal: Lear a model that rereset the relatos amog attrbutes the

More information

{ }{ ( )} (, ) = ( ) ( ) ( ) Chapter 14 Exercises in Sampling Theory. Exercise 1 (Simple random sampling): Solution:

{ }{ ( )} (, ) = ( ) ( ) ( ) Chapter 14 Exercises in Sampling Theory. Exercise 1 (Simple random sampling): Solution: Chapter 4 Exercses Samplg Theory Exercse (Smple radom samplg: Let there be two correlated radom varables X ad A sample of sze s draw from a populato by smple radom samplg wthout replacemet The observed

More information

Example: Multiple linear regression. Least squares regression. Repetition: Simple linear regression. Tron Anders Moger

Example: Multiple linear regression. Least squares regression. Repetition: Simple linear regression. Tron Anders Moger Example: Multple lear regresso 5000,00 4000,00 Tro Aders Moger 0.0.007 brthweght 3000,00 000,00 000,00 0,00 50,00 00,00 50,00 00,00 50,00 weght pouds Repetto: Smple lear regresso We defe a model Y = β0

More information

KLT Tracker. Alignment. 1. Detect Harris corners in the first frame. 2. For each Harris corner compute motion between consecutive frames

KLT Tracker. Alignment. 1. Detect Harris corners in the first frame. 2. For each Harris corner compute motion between consecutive frames KLT Tracker Tracker. Detect Harrs corers the frst frame 2. For each Harrs corer compute moto betwee cosecutve frames (Algmet). 3. Lk moto vectors successve frames to get a track 4. Itroduce ew Harrs pots

More information

Parameter, Statistic and Random Samples

Parameter, Statistic and Random Samples Parameter, Statstc ad Radom Samples A parameter s a umber that descrbes the populato. It s a fxed umber, but practce we do ot kow ts value. A statstc s a fucto of the sample data,.e., t s a quatty whose

More information

Third handout: On the Gini Index

Third handout: On the Gini Index Thrd hadout: O the dex Corrado, a tala statstca, proposed (, 9, 96) to measure absolute equalt va the mea dfferece whch s defed as ( / ) where refers to the total umber of dvduals socet. Assume that. The

More information

Part 4b Asymptotic Results for MRR2 using PRESS. Recall that the PRESS statistic is a special type of cross validation procedure (see Allen (1971))

Part 4b Asymptotic Results for MRR2 using PRESS. Recall that the PRESS statistic is a special type of cross validation procedure (see Allen (1971)) art 4b Asymptotc Results for MRR usg RESS Recall that the RESS statstc s a specal type of cross valdato procedure (see Alle (97)) partcular to the regresso problem ad volves fdg Y $,, the estmate at the

More information

3. Basic Concepts: Consequences and Properties

3. Basic Concepts: Consequences and Properties : 3. Basc Cocepts: Cosequeces ad Propertes Markku Jutt Overvew More advaced cosequeces ad propertes of the basc cocepts troduced the prevous lecture are derved. Source The materal s maly based o Sectos.6.8

More information