Linear Regression & Least Squares!
- Drusilla Wilkerson
- 6 years ago
Transcription
1 Linear Regression & Least Squares. Ali Borji, UWM CS 790. Slide credit: Aykut Erdem
2 This week: the linear regression problem: continuous outputs, a simple model. Introduce key concepts: loss functions, generalization, optimization, model complexity, regularization. Slides adapted from Richard Zemel, Erin Halpern, Ziv Bar-Joseph, Aarti Singh, Barnabas Poczos, J.P. Lewis, Erik Sudderth
3 Classification. Input X: real valued, vectors over reals; discrete values (0, 1, 2, ...); other structures (e.g., strings, graphs, etc.). Output Y: discrete (0, 1, 2, ...). Examples: X = document, Y = topic (Sports, Science, News); X = cell image, Y = diagnosis (anemic cell, healthy cell).
4 Regression. Input X: real valued, vectors over reals; discrete values (0, 1, 2, ...); other structures (e.g., strings, graphs, etc.). Output Y: real valued, vectors over reals. Example: stock market prediction (X = Feb 01, Y = ?).
5 Choosing a restaurant. In everyday life we need to make decisions by taking into account lots of factors. The question is what weight we put on each of these factors (how important are they with respect to the others). Assume we would like to build a recommender system based on an individual's preferences. If we have many observations, we may be able to recover the weights. Factors: reviews (out of 5 stars), distance, cuisine (out of 10), ...
6 Some other examples: weight + height -> cholesterol level; age + gender -> time spent in front of the TV; past choices of a user -> "Netflix score"; profile of a job (user, machine, time) -> memory usage of a submitted process.
7 Example: Polynomial Curve Fitting. The green curve is the true function (which is not a polynomial) and is not known. The data points are uniform in x but have noise in t: t(x) = f(x) + ε. (Figure from Bishop.) Aim: fit a curve to these points. Key questions: How do we parametrize the model (the curve)? What loss (objective) function should we use to judge fit? How do we optimize fit to unseen test data (generalization)?
8 1-D regression
9 One-dimensional regression. Find a line that represents the "best" linear relationship: b = a·x
10 One-dimensional regression. Problem: the data does not go through a line: e_i = b_i − a·x_i
11 One-dimensional regression. Problem: the data does not go through a line: e_i = b_i − a·x_i. Find the line that minimizes the sum: Σ_i (b_i − a·x_i)²
12 One-dimensional regression. Problem: the data does not go through a line. Find the line that minimizes the sum Σ_i (b_i − a·x_i)². We are looking for â that minimizes e(a) = Σ_i (b_i − a·x_i)².
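Setting de/da = 0 gives the closed-form minimizer â = Σ_i x_i·b_i / Σ_i x_i². A minimal numerical sketch of this (the data values below are made up for illustration):

```python
import numpy as np

# Made-up 1-D data: b is roughly proportional to x, plus noise.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
b = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Setting de/da = 0 for e(a) = sum_i (b_i - a*x_i)^2 gives the
# closed-form minimizer a_hat = sum_i x_i*b_i / sum_i x_i^2.
a_hat = np.sum(x * b) / np.sum(x * x)
print(a_hat)  # should land close to the true slope of about 2
```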
13 Matrix notation. Using the notations x = (x_1, ..., x_n)^T and b = (b_1, ..., b_n)^T:
14 We can rewrite the error function using linear algebra as: e(a) = (b − a·x)^T (b − a·x) = ‖b − a·x‖²
15 Example: Boston House Prices. Estimate the median house price in a neighborhood based on neighborhood statistics. Look at the first (of 13) attributes: per capita crime rate. Use this to predict house prices in other neighborhoods. https://archive.ics.uci.edu/ml/datasets/housing
16 Represent the data. Data described as pairs D = ((x^(1), t^(1)), (x^(2), t^(2)), ..., (x^(N), t^(N))). x is the input feature (per capita crime rate); t is the target output (median house price). Here t is continuous, so this is a regression problem. Could take the first 300 examples as the training set, the remaining 206 as the test set. Use the training examples to construct a hypothesis, or function approximator, that maps x to predicted y. Evaluate the hypothesis on the test set.
17 Noise. A simple model typically does not exactly fit the data; lack of fit can be considered noise. Sources of noise: imprecision in data attributes (input noise); errors in data targets (mislabeling); additional attributes, not taken into account by the data attributes, that affect the target values (latent variables); the model may be too simple to account for the data targets.
18 Least-Squares Regression. The standard loss/cost/objective function measures the squared error in the prediction of t(x) from x: J(w) = Σ_{n=1}^{N} [t^(n) − (w₀ + w₁ x^(n))]². The loss for the red hypothesis is the sum of the squared vertical errors. (Figure from Bishop.)
19 Optimizing the Objective. One straightforward method: initialize w randomly, then repeatedly update based on gradient descent in J: w ← w − λ ∂J/∂w. Here λ is the learning rate. For a single training case, this gives the LMS update rule: w ← w + 2λ (t^(n) − y(x^(n))) x^(n). Note: as the error approaches zero, so does the update.
20 Effect of step-size λ. Large λ: fast convergence but larger residual error; also possible oscillations. Small λ: slow convergence but small residual error.
21 Optimizing Across the Training Set. Two ways to generalize this for all examples in the training set: 1. Stochastic/online updates: update the parameters for each training case in turn, according to its own gradients. 2. Batch updates: sum or average the updates across every example n, then change the parameter values: w ← w + (2λ/N) Σ_{n=1}^{N} (t^(n) − y(x^(n))) x^(n). Underlying assumption: samples are independent and identically distributed (i.i.d.).
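Both update schemes can be sketched as follows, fitting t ≈ w₀ + w₁x. The toy data, learning rate, and iteration counts are illustrative choices, not values from the slides; both runs should approach the same least-squares fit.

```python
import numpy as np

# Toy data for t ~ w0 + w1*x (values made up for illustration).
x = np.array([0.0, 1.0, 2.0, 3.0])
t = np.array([1.0, 3.1, 4.9, 7.2])
lam = 0.02  # learning rate (lambda)
N = len(x)

# 1. Stochastic/online updates: one step per training case.
w0, w1 = 0.0, 0.0
for epoch in range(2000):
    for xn, tn in zip(x, t):
        err = tn - (w0 + w1 * xn)  # the update shrinks as the error -> 0
        w0 += 2 * lam * err
        w1 += 2 * lam * err * xn

# 2. Batch updates: average the gradient over the whole training set.
v0, v1 = 0.0, 0.0
for epoch in range(5000):
    err = t - (v0 + v1 * x)
    v0 += (2 * lam / N) * np.sum(err)
    v1 += (2 * lam / N) * np.sum(err * x)

print((w0, w1), (v0, v1))  # both near the least-squares solution
```

With a constant learning rate the stochastic run cycles in a small neighborhood of the least-squares solution, while the batch run converges to it.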
22 Non-iterative Least-Squares Regression. An alternative optimization approach is non-iterative: take derivatives, set them to zero, and solve for the parameters. From dJ(w)/dw₀ = −2 Σ_{n=1}^{N} [t^(n) − (w₀ + w₁ x^(n))] = 0 we get w₀ = Σ_n (t^(n) − w₁ x^(n)) / N = t̄ − w₁ x̄, and w₁ = Σ_n (t^(n) − t̄)(x^(n) − x̄) / Σ_n (x^(n) − x̄)².
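These closed-form expressions can be checked against a library fit; the data below is invented for illustration, and `np.polyfit` is used only as an independent reference that minimizes the same squared error.

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])   # made-up inputs
t = np.array([1.2, 2.9, 5.1, 7.0, 9.1])   # made-up targets

# Closed-form least-squares solution from setting dJ/dw = 0:
w1 = np.sum((t - t.mean()) * (x - x.mean())) / np.sum((x - x.mean()) ** 2)
w0 = t.mean() - w1 * x.mean()

# np.polyfit minimizes the same squared error, so it should agree.
coeffs = np.polyfit(x, t, deg=1)  # returns [slope, intercept]
print(w0, w1, coeffs)
```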
23-25 Multi-dimensional linear regression. Using a model with m parameters: y(x) = a₁ f₁(x) + a₂ f₂(x) + ... + a_m f_m(x) = Σ_{j=1}^{m} a_j f_j(x)
26-27 Multi-dimensional linear regression. Using a model with m parameters and n measurements: e_i = b_i − Σ_{j=1}^{m} a_j f_j(x_i), i = 1, 2, ..., n, so that e(a) = Σ_{i=1}^{n} (b_i − Σ_{j=1}^{m} a_j f_j(x_i))² = ‖b − A a‖²
28-31 ‖b − A a‖² in matrix form: A is the n × m matrix with entries A_{ij} = f_j(x_i), where row i corresponds to measurement i and column j to parameter j; A a stacks the model predictions (Σ_j a_j f_j(x_1), ..., Σ_j a_j f_j(x_n)).
32 Example: Boston House Prices, revisited. One method of extending the model is to consider other input dimensions: y(x) = w₀ + w₁ x₁ + w₂ x₂. In the Boston housing example, we can look at the number-of-rooms input feature. We can use gradient descent to solve for each coefficient, or use linear algebra to solve a system of equations.
33 Linear Regression. Imagine now we want to predict the median house price from these multi-dimensional observations. Each house is a data point n, with observations indexed by j: x^(n) = (x₁^(n), ..., x_d^(n)). A simple predictor is the analogue of the linear classifier, producing a real-valued y for input x with parameters w (effectively fixing x₀ = 1): y = w₀ + Σ_{j=1}^{d} w_j x_j = w^T x.
34 Multi-dimensional linear regression. e(a) = ‖b − A a‖² = (b − A a)^T (b − A a) = b^T b − 2 a^T A^T b + a^T A^T A a. A minimum occurs when 1. the first derivative is zero, and 2. the second derivative is positive. Multidimensional case: the 1st derivative of a function f(a) is the gradient ∇f(a) (a row vector); the 2nd derivative, the Hessian, is a matrix that we will denote H_f(a). Here ∇e(a) = 2 a^T A^T A − 2 b^T A, and H_e(a) = 2 A^T A.
35-40 Minimizing e(a). â_min minimizes e(a) if: e(a) is flat at â_min, i.e., ∇e(â_min) = 0; and e(a) does not go down around â_min, i.e., H_e(â_min) is positive semi-definite.
41 Recap: Positive semi-definite. A is positive semi-definite ⇔ x^T A x ≥ 0 for all x. (Illustrations in 1-D and 2-D.)
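The fact used on the next slides, that A^T A is always positive semi-definite, follows from x^T (A^T A) x = ‖Ax‖² ≥ 0 and can be verified numerically. The random matrix below is just for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))  # arbitrary matrix
G = A.T @ A

# x^T G x = ||A x||^2 >= 0 for every x, so G is positive semi-definite;
# equivalently, all eigenvalues of the symmetric matrix G are >= 0.
eigvals = np.linalg.eigvalsh(G)
x = rng.standard_normal(3)
print(eigvals, x @ G @ x)
```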
42-44 Minimizing e(a) = ‖b − A a‖². â minimizes e(a) if: A^T A â = A^T b (the normal equation), and 2 A^T A is positive semi-definite (always true).
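A sketch of solving the normal equation on a made-up overdetermined system, checked against NumPy's least-squares routine (which minimizes the same objective but is numerically stabler than forming A^T A explicitly):

```python
import numpy as np

# Made-up overdetermined system: 5 measurements, 2 parameters.
A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]])
b = np.array([1.1, 2.0, 2.8, 4.2, 5.0])

# Solve the normal equation A^T A a = A^T b directly...
a_normal = np.linalg.solve(A.T @ A, A.T @ b)

# ...and compare with a dedicated least-squares solver.
a_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print(a_normal, a_lstsq)
```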
45-50 Geometric interpretation. b is a vector in R^n. The columns of A define a vector space range(A). A a is an arbitrary vector in range(A). A â is the orthogonal projection of b onto range(A): A^T (b − A â) = 0, i.e., A^T A â = A^T b.
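The projection picture can be checked numerically: the least-squares residual is orthogonal to every column of A. Data below is made up for illustration.

```python
import numpy as np

A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])  # made up
b = np.array([0.9, 2.1, 2.9, 4.2])

a_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
residual = b - A @ a_hat

# The residual is orthogonal to range(A): A^T (b - A a_hat) = 0,
# so A a_hat is the orthogonal projection of b onto range(A).
print(A.T @ residual)
```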
51-54 The normal equation: A^T A â = A^T b. Existence: A^T A â = A^T b always has a solution. Uniqueness: the solution is unique if the columns of A are linearly independent.
55 Linear models. It is mathematically easy to fit linear models to data; we can learn a lot about model-fitting in this relatively simple case. There are many ways to make linear models more powerful while retaining their nice mathematical properties: By using non-linear, non-adaptive basis functions, we can get generalized linear models that learn non-linear mappings from input to output but are linear in their parameters; only the linear part of the model learns. By using kernel methods we can handle expansions of the raw data that use a huge number of non-linear, non-adaptive basis functions. By using large-margin kernel methods we can avoid overfitting even when we use huge numbers of basis functions. But linear methods will not solve most AI problems; they have fundamental limitations.
56 Some types of basis functions in 1-D: sigmoids, φ_j(x) = σ((x − μ_j)/s) with σ(a) = 1/(1 + exp(−a)); Gaussians, φ_j(x) = exp{−(x − μ_j)²/(2s²)}; and polynomials.
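These basis functions are simple to implement; a sketch that builds a small design matrix from Gaussian bases (the grid, centers, and width below are illustrative choices, not values from the slides):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def sigmoid_basis(x, mu, s):
    # phi_j(x) = sigma((x - mu_j) / s)
    return sigmoid((x - mu) / s)

def gaussian_basis(x, mu, s):
    # phi_j(x) = exp(-(x - mu_j)^2 / (2 s^2))
    return np.exp(-((x - mu) ** 2) / (2 * s ** 2))

x = np.linspace(0.0, 1.0, 5)
mus = np.array([0.25, 0.5, 0.75])  # illustrative centers

# Design matrix: one column per basis function, one row per input.
Phi = np.column_stack([gaussian_basis(x, mu, 0.2) for mu in mus])
print(Phi.shape)
```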
57 Two types of linear model that are equivalent with respect to learning: y(x, w) = w₀ + w₁x₁ + w₂x₂ + ... = w^T x, and y(x, w) = w₀ + w₁φ₁(x) + w₂φ₂(x) + ... = w^T Φ(x). The first model has the same number of adaptive coefficients as the dimensionality of the data + 1. The second model has the same number of adaptive coefficients as the number of basis functions + 1. Once we have replaced the data by the outputs of the basis functions, fitting the second model is exactly the same problem as fitting the first model (unless we use the kernel trick). So we'll just focus on the first model.
58 General linear regression problem. Using our new notation, basis function linear regression can be written as y(x) = Σ_{j=0}^{k} w_j φ_j(x), where φ_j(x) can be either x_j for multivariate regression or one of the nonlinear bases we defined earlier. Once again we can use least squares to find the optimal solution.
59 LMS for the general linear regression problem. Our goal is to minimize the loss function J(w) = Σ_i (y_i − Σ_j w_j φ_j(x_i))². Moving to vector notation: J(w) = Σ_i (y_i − w^T φ(x_i))², where w is a vector of dimension k+1, φ(x_i) is a vector of dimension k+1, and y_i is a scalar. We take the derivative w.r.t. w: ∂J/∂w = −2 Σ_i (y_i − w^T φ(x_i)) φ(x_i)^T. Equating to 0 we get: Σ_i y_i φ(x_i)^T = w^T Σ_i φ(x_i) φ(x_i)^T.
60 LMS for the general linear regression problem. Define the design matrix Φ with rows (φ₀(x_i), φ₁(x_i), ..., φ_m(x_i)) for i = 1, ..., n. Then solving for w we get: w = (Φ^T Φ)^{-1} Φ^T y.
61 LMS for the general linear regression problem. Solving for w we get w = (Φ^T Φ)^{-1} Φ^T y, where w is a (k+1)-entry vector, y is an n-entry vector, and Φ is an n × (k+1) matrix. The matrix (Φ^T Φ)^{-1} Φ^T is also known as the pseudo-inverse.
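The pseudo-inverse solution can be sketched with a small polynomial basis; the data is made up, and `np.linalg.pinv` is used as an independent check (it computes the same pseudo-inverse when Φ has full column rank):

```python
import numpy as np

# Made-up data and a polynomial basis phi_j(x) = x^j, j = 0..2.
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y = np.array([1.0, 1.3, 2.1, 3.2, 5.1])
Phi = np.column_stack([x ** j for j in range(3)])  # n x (k+1)

# w = (Phi^T Phi)^{-1} Phi^T y  (the pseudo-inverse solution)
w = np.linalg.solve(Phi.T @ Phi, Phi.T @ y)

# np.linalg.pinv computes the same pseudo-inverse here.
w_pinv = np.linalg.pinv(Phi) @ y
print(w, w_pinv)
```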
62 Fitting a polynomial. Now we use one of these basis functions: an M-th order polynomial, y(x, w) = w₀ + w₁x + ... + w_M x^M. (Figure from Bishop.) We can use the same approaches to optimize the values of the weights on each coefficient: analytic and iterative.
63 0th order polynomial
64 1st order polynomial
65 3rd order polynomial
66 9th order polynomial
67 Root-Mean-Square (RMS) Error. E(w) = ½ Σ_{n=1}^{N} {y(x_n, w) − t_n}², and E_RMS = √(2E(w*)/N). The division by N allows us to compare different sizes of data sets on an equal footing, and the square root ensures that E_RMS is measured on the same scale (and in the same units) as the target variable t. (Plots of polynomial fits for various M, from Bishop.)
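E_RMS for polynomial fits of increasing order can be computed as follows. The data is synthetic, in the spirit of Bishop's noisy-sine example; the exact error values depend on the noise draw, but the training error can only decrease as M grows.

```python
import numpy as np

# Synthetic data: noisy samples of sin(2*pi*x), for illustration.
rng = np.random.default_rng(1)
x_train = np.linspace(0.0, 1.0, 10)
t_train = np.sin(2 * np.pi * x_train) + 0.2 * rng.standard_normal(10)

def rms_error(w, x, t):
    # E_RMS = sqrt(2 E(w) / N), i.e., the root of the mean squared residual.
    y = np.polyval(w, x)
    return np.sqrt(np.mean((y - t) ** 2))

errors = {}
for M in (0, 1, 3, 9):
    w = np.polyfit(x_train, t_train, deg=M)
    errors[M] = rms_error(w, x_train, t_train)
print(errors)
```

With 10 points, the M = 9 polynomial interpolates the training data, so its training E_RMS is essentially zero even though it generalizes poorly.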
68 Root-Mean-Square (RMS) Error. E(w) = ½ Σ_{n=1}^{N} (t_n − φ(x_n)^T w)² = ½ ‖t − Φw‖². (Plot: training and test E_RMS versus polynomial order M.) The overfitting problem.
69 Table of the coefficients w* for polynomials of various order (M = 0, 1, 6, 9). Observe how the typical magnitude of the coefficients increases dramatically as the order of the polynomial increases. The overfitting problem.
70 Increasing the size of the training data. (Plots: M = 9 fits with N = 15 and N = 100.) For a given model complexity, the over-fitting problem becomes less severe as the size of the data set increases. Another way to say this is that the larger the data set, the more complex (in other words, more flexible) the model that we can afford to fit to the data.
71 1-D regression illustrates key concepts. Data fits: is a linear model best (model selection)? The simplest models do not capture all the important structure. A more complex model may overfit the training data (fit not only the signal but also the noise in the data), especially if there is not enough data to constrain the model. One method of assessing fit: test generalization, the model's ability to predict the held-out data. Optimization is essential: stochastic and batch iterative approaches; analytic when available.
72 Regularized Least Squares. A technique to control the overfitting phenomenon: add a penalty term to the error function in order to discourage the coefficients from reaching large values. Ridge regression: Ẽ(w) = ½ Σ_{n=1}^{N} {y(x_n, w) − t_n}² + (λ/2) ‖w‖², where ‖w‖² = w^T w = w₀² + w₁² + ... + w_M², which is minimized by w = (λI + Φ^T Φ)^{-1} Φ^T t. The coefficient λ governs the relative importance of the regularization term compared with the sum-of-squares error term.
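The ridge solution has the same closed form as ordinary least squares with λI added before inverting. A sketch on synthetic noisy-sine data with a degree-9 polynomial basis (the data and the value of λ are illustrative); the penalty shrinks the coefficient vector toward zero.

```python
import numpy as np

# Synthetic data and a degree-9 polynomial basis, for illustration.
rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 10)
t = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(10)
Phi = np.column_stack([x ** j for j in range(10)])

def ridge(lam):
    # w* = (lambda*I + Phi^T Phi)^{-1} Phi^T t
    k = Phi.shape[1]
    return np.linalg.solve(lam * np.eye(k) + Phi.T @ Phi, Phi.T @ t)

w_unreg = ridge(0.0)   # ordinary least squares: huge coefficients
w_reg = ridge(1e-3)    # ridge: coefficients shrunk toward zero

print(np.linalg.norm(w_unreg), np.linalg.norm(w_reg))
```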
73 The effect of regularization. (Plots: M = 9 fits with ln λ = −18 and ln λ = 0.)
74 The effect of regularization. (Plot: training and test E_RMS versus ln λ; table of coefficients for different values of ln λ.) The corresponding coefficients from the fitted polynomials, showing that regularization has the desired effect of reducing the magnitude of the coefficients.
75 A more general regularizer: ½ Σ_{n=1}^{N} {t_n − w^T φ(x_n)}² + (λ/2) Σ_{j=1}^{M} |w_j|^q. (Contour plots for q = 0.5, 1, 2, 4.)
More informationOutline and Reading. Dynamic Programming. Dynamic Programming revealed. Computing Fibonacci. The General Dynamic Programming Technique
Outlne and Readng Dynamc Programmng The General Technque ( 5.3.2) -1 Knapsac Problem ( 5.3.3) Matrx Chan-Product ( 5.3.1) Dynamc Programmng verson 1.4 1 Dynamc Programmng verson 1.4 2 Dynamc Programmng
More informationCME 302: NUMERICAL LINEAR ALGEBRA FALL 2005/06 LECTURE 13
CME 30: NUMERICAL LINEAR ALGEBRA FALL 005/06 LECTURE 13 GENE H GOLUB 1 Iteratve Methods Very large problems (naturally sparse, from applcatons): teratve methods Structured matrces (even sometmes dense,
More informationExploiting Structure in Probability Distributions Irit Gat-Viks
Explotng Structure n rolty Dstrutons Irt Gt-Vks Bsed on presentton nd lecture notes of Nr Fredmn, Herew Unversty Generl References: D. Koller nd N. Fredmn, prolstc grphcl models erl, rolstc Resonng n Intellgent
More informationPhysicsAndMathsTutor.com
PhscsAndMathsTutor.com phscsandmathstutor.com June 005 5. The random varable X has probablt functon k, = 1,, 3, P( X = ) = k ( + 1), = 4, 5, where k s a constant. (a) Fnd the value of k. (b) Fnd the eact
More information8. INVERSE Z-TRANSFORM
8. INVERSE Z-TRANSFORM The proce by whch Z-trnform of tme ere, nmely X(), returned to the tme domn clled the nvere Z-trnform. The nvere Z-trnform defned by: Computer tudy Z X M-fle trn.m ued to fnd nvere
More informationPattern Classification
Pattern Classfcaton All materals n these sldes ere taken from Pattern Classfcaton (nd ed) by R. O. Duda, P. E. Hart and D. G. Stork, John Wley & Sons, 000 th the permsson of the authors and the publsher
More informationDecision Analysis (part 2 of 2) Review Linear Regression
Harvard-MIT Dvson of Health Scences and Technology HST.951J: Medcal Decson Support, Fall 2005 Instructors: Professor Lucla Ohno-Machado and Professor Staal Vnterbo 6.873/HST.951 Medcal Decson Support Fall
More informationQuadrilateral et Hexahedral Pseudo-conform Finite Elements
Qurlterl et Heerl seuo-conform Fnte Elements E. DUBACH R. LUCE J.M. THOMAS Lbortore e Mtémtques Applquées UMR 5 u Frnce GDR MoMs Métoes Numérques pour les Flues. rs écembre 6 Wt s te problem? Loss of conergence
More informationStatistics 423 Midterm Examination Winter 2009
Sttstcs 43 Mdterm Exmnton Wnter 009 Nme: e-ml: 1. Plese prnt your nme nd e-ml ddress n the bove spces.. Do not turn ths pge untl nstructed to do so. 3. Ths s closed book exmnton. You my hve your hnd clcultor
More informationis the calculated value of the dependent variable at point i. The best parameters have values that minimize the squares of the errors
Multple Lnear and Polynomal Regresson wth Statstcal Analyss Gven a set of data of measured (or observed) values of a dependent varable: y versus n ndependent varables x 1, x, x n, multple lnear regresson
More informationPHYS 2421 Fields and Waves
PHYS 242 Felds nd Wves Instucto: Joge A. López Offce: PSCI 29 A, Phone: 747-7528 Textook: Unvesty Physcs e, Young nd Feedmn 23. Electc potentl enegy 23.2 Electc potentl 23.3 Clcultng electc potentl 23.4
More informationLecture 21: Numerical methods for pricing American type derivatives
Lecture 21: Numercal methods for prcng Amercan type dervatves Xaoguang Wang STAT 598W Aprl 10th, 2014 (STAT 598W) Lecture 21 1 / 26 Outlne 1 Fnte Dfference Method Explct Method Penalty Method (STAT 598W)
More informationCALIBRATION OF SMALL AREA ESTIMATES IN BUSINESS SURVEYS
CALIBRATION OF SMALL AREA ESTIMATES IN BUSINESS SURVES Rodolphe Prm, Ntle Shlomo Southmpton Sttstcl Scences Reserch Insttute Unverst of Southmpton Unted Kngdom SAE, August 20 The BLUE-ETS Project s fnnced
More informationReactor Control Division BARC Mumbai India
A Study of Frctonl Schrödnger Equton-composed v Jumre frctonl dervtve Joydp Bnerjee 1, Uttm Ghosh, Susmt Srkr b nd Shntnu Ds 3 Uttr Bunch Kjl Hr Prmry school, Ful, Nd, West Bengl, Ind eml- joydp1955bnerjee@gml.com
More informationIntro to Visual Recognition
CS 2770: Computer Vson Intro to Vsual Recognton Prof. Adrana Kovashka Unversty of Pttsburgh February 13, 2018 Plan for today What s recognton? a.k.a. classfcaton, categorzaton Support vector machnes Separable
More informationQuantum Mechanics for Scientists and Engineers. David Miller
Quantum Mechancs for Scentsts and Engneers Davd Mller Types of lnear operators Types of lnear operators Blnear expanson of operators Blnear expanson of lnear operators We know that we can expand functons
More informationStratified Extreme Ranked Set Sample With Application To Ratio Estimators
Journl of Modern Appled Sttstcl Metods Volume 3 Issue Artcle 5--004 Strtfed Extreme Rned Set Smple Wt Applcton To Rto Estmtors Hn M. Smw Sultn Qboos Unversty, smw@squ.edu.om t J. Sed Sultn Qboos Unversty
More informationKristin P. Bennett. Rensselaer Polytechnic Institute
Support Vector Machnes and Other Kernel Methods Krstn P. Bennett Mathematcal Scences Department Rensselaer Polytechnc Insttute Support Vector Machnes (SVM) A methodology for nference based on Statstcal
More informationChapter 2 Transformations and Expectations. , and define f
Revew for the prevous lecture Defnton: support set of a ranom varable, the monotone functon; Theorem: How to obtan a cf, pf (or pmf) of functons of a ranom varable; Eamples: several eamples Chapter Transformatons
More informationUSING IMAGE STATISTICS FOR AUTOMATED QUALITY ASSESSMENT OF URBAN GEOSPATIAL DATA
USIG IMAGE SAISICS FOR AUOMAED QUALIY ASSESSME OF URBA GEOSPAIAL DAA WGoemn, LMrtnez-Fonte, RBellens, SGutm Dept elecommuncton nd Informton Processng, Ghent Unversty StPetersneuwstrt, B-9000 Gent, Belgum
More informationSupport Vector Machines. Vibhav Gogate The University of Texas at dallas
Support Vector Machnes Vbhav Gogate he Unversty of exas at dallas What We have Learned So Far? 1. Decson rees. Naïve Bayes 3. Lnear Regresson 4. Logstc Regresson 5. Perceptron 6. Neural networks 7. K-Nearest
More informationβ0 + β1xi and want to estimate the unknown
SLR Models Estmaton Those OLS Estmates Estmators (e ante) v. estmates (e post) The Smple Lnear Regresson (SLR) Condtons -4 An Asde: The Populaton Regresson Functon B and B are Lnear Estmators (condtonal
More informationTransform Coding. C.M. Liu Perceptual Signal Processing Lab College of Computer Science National Chiao-Tung University
Trnsform Codng C.M. Lu Perceptul Sgnl Processng Lb College of Computer Scence Ntonl Cho-Tung Unversty http://www.cse.nctu.edu.tw/~cmlu/courses/compresson/ Offce: EC538 (03)573877 cmlu@cs.nctu.edu.tw Motvtng
More informationBinomial Distribution: Tossing a coin m times. p = probability of having head from a trial. y = # of having heads from n trials (y = 0, 1,..., m).
[7] Count Data Models () Some Dscrete Probablty Densty Functons Bnomal Dstrbuton: ossng a con m tmes p probablty of havng head from a tral y # of havng heads from n trals (y 0,,, m) m m! fb( y n) p ( p)
More informationIntroduction to Numerical Integration Part II
Introducton to umercl Integrton Prt II CS 75/Mth 75 Brn T. Smth, UM, CS Dept. Sprng, 998 4/9/998 qud_ Intro to Gussn Qudrture s eore, the generl tretment chnges the ntegrton prolem to ndng the ntegrl w
More information10-701/ Machine Learning, Fall 2005 Homework 3
10-701/15-781 Machne Learnng, Fall 2005 Homework 3 Out: 10/20/05 Due: begnnng of the class 11/01/05 Instructons Contact questons-10701@autonlaborg for queston Problem 1 Regresson and Cross-valdaton [40
More information15-381: Artificial Intelligence. Regression and cross validation
15-381: Artfcal Intellgence Regresson and cross valdaton Where e are Inputs Densty Estmator Probablty Inputs Classfer Predct category Inputs Regressor Predct real no. Today Lnear regresson Gven an nput
More information