Regression and the LMS Algorithm
1 CSE 5526: Introduction to Neural Networks. Regression and the LMS Algorithm.
2 Problem statement.
3 Linear regression with one variable. Given a set of N pairs of data {x_i, d_i}, approximate d by a linear function of the regressor x, i.e.

d = y + ε = φ(wx + b) + ε = wx + b + ε

where the activation function φ is a linear function, corresponding to a linear neuron, y is the output of the neuron, and ε is called the regression (expectational) error.
4 Linear regression (cont.). The problem of regression with one variable is how to choose w and b to minimize the regression error. The least squares method aims to minimize the square error E:

E = (1/2) Σ_{i=1}^N ε_i² = (1/2) Σ_{i=1}^N (d_i − y_i)²
5 Linear regression (cont.). To minimize the two-variable square function, set

∂E/∂w = 0, ∂E/∂b = 0
6 Linear regression (cont.).

∂E/∂b = −Σ_{i=1}^N (d_i − w x_i − b) = 0
∂E/∂w = −Σ_{i=1}^N (d_i − w x_i − b) x_i = 0
7 Analytic solution approaches.
- Solve one equation for b in terms of w; substitute into the other equation and solve for w; substitute the solution for w back into the equation for b.
- Set up the system of equations in matrix notation; solve the matrix equation.
- Rewrite the problem in matrix form; compute the matrix gradient; solve for w.
8 Linear regression (cont.). Hence

w = Σ_{i=1}^N (x_i − x̄)(d_i − d̄) / Σ_{i=1}^N (x_i − x̄)²
b = d̄ − w x̄

where an overbar (e.g. x̄) indicates the mean. Derive it yourself!
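The closed-form estimates above can be sketched in Python. This is a minimal illustration, not from the slides; the data and variable names are made up:

```python
import numpy as np

# Hypothetical noise-free data generated from d = 2x + 1
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
d = 2.0 * x + 1.0

# Closed-form least-squares estimates:
#   w = sum((x_i - x_bar)(d_i - d_bar)) / sum((x_i - x_bar)^2)
#   b = d_bar - w * x_bar
x_bar, d_bar = x.mean(), d.mean()
w = np.sum((x - x_bar) * (d - d_bar)) / np.sum((x - x_bar) ** 2)
b = d_bar - w * x_bar
print(w, b)  # 2.0 1.0
```

With noise-free data the true slope and intercept are recovered exactly.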
9 Linear regression in matrix notation. Let X = [x_1, …, x_N]^T and d = [d_1, …, d_N]^T. Then the model predictions are y = Xw, and the mean square error can be written

E = (1/2) (d − Xw)^T (d − Xw)

To find the optimal w, set the gradient of the error with respect to w equal to 0 and solve for w:

∇_w E = −X^T (d − Xw) = 0

See The Matrix Cookbook (Petersen & Pedersen).
10 Linear regression in matrix notation (cont.).

E = (1/2)(d − Xw)^T (d − Xw)
  = (1/2)(d^T d − d^T Xw − w^T X^T d + w^T X^T Xw)
  = (1/2)(d^T d − 2 w^T X^T d + w^T X^T Xw)

∇_w E = −X^T d + X^T Xw = 0
⇒ w = (X^T X)^{−1} X^T d

Much cleaner!
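A sketch of the matrix solution w = (X^T X)^{−1} X^T d on invented data; it uses np.linalg.solve rather than an explicit inverse, a standard numerical precaution rather than something the slides prescribe:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100
x = rng.uniform(-1, 1, N)
d = 3.0 * x - 0.5 + 0.01 * rng.standard_normal(N)  # made-up data, slope 3, intercept -0.5

# Design matrix with a column of ones so the bias rides along as w_0 (x_0 = 1)
X = np.column_stack([np.ones(N), x])

# Normal equations: solve (X^T X) w = X^T d
w = np.linalg.solve(X.T @ X, X.T @ d)
print(w)  # approximately [-0.5, 3.0]
```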
11 Finding optimal parameters via search. Often there is no closed-form solution for ∇E = 0. We can still use the gradient in a numerical solution. We'll still use the same example to permit comparison. For simplicity's sake, set b = 0:

E(w) = (1/2) Σ_{i=1}^N (d_i − w x_i)²

E is called a cost function.
12 Cost function. [Figure: the cost function E(w), with minimum E_min at w = w*.] Question: how can we update w from w(0) to minimize E?
13 Gradient and directional derivatives. Consider a two-variable function f(x, y). Its gradient at the point (x, y)^T is defined as

∇f(x, y) = [∂f/∂x, ∂f/∂y]^T = (∂f/∂x) u_x + (∂f/∂y) u_y

where u_x and u_y are unit vectors in the x and y directions, and

∂f/∂x = lim_{h→0} [f(x + h, y) − f(x, y)] / h
∂f/∂y = lim_{h→0} [f(x, y + h) − f(x, y)] / h
14 Gradient and directional derivatives (cont.). For a given direction u = a u_x + b u_y, with a² + b² = 1, the directional derivative at (x, y)^T along the unit vector u is

D_u f(x, y) = lim_{h→0} [f(x + ha, y + hb) − f(x, y)] / h
            = a ∂f/∂x + b ∂f/∂y
            = ∇f(x, y)^T u

Which direction has the greatest slope? The gradient, because of the dot product!
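The identity D_u f = ∇f^T u, and the claim that the slope is largest along the gradient, can be checked numerically. The function f and all names below are invented for illustration:

```python
import numpy as np

# A made-up smooth function and its analytic gradient
f = lambda x, y: x**2 + 3.0 * x * y
grad = lambda x, y: np.array([2.0 * x + 3.0 * y, 3.0 * x])

x0, y0, h = 1.0, 2.0, 1e-6
g = grad(x0, y0)  # [8, 3] at (1, 2)

# Directional derivative along (a, b), estimated by a finite difference
def D_u(a, b):
    return (f(x0 + h * a, y0 + h * b) - f(x0, y0)) / h

# Check D_u f = grad . u for several unit vectors
for theta in np.linspace(0.0, 2 * np.pi, 8, endpoint=False):
    u = np.array([np.cos(theta), np.sin(theta)])
    assert abs(D_u(*u) - g @ u) < 1e-4

# Since g . u = |g| cos(angle), the slope is largest when u points along g
u_grad = g / np.linalg.norm(g)
print(D_u(*u_grad), np.linalg.norm(g))  # both approximately sqrt(73)
```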
15 Gradient and directional derivatives (cont.). Example: [figure].
16 Gradient and directional derivatives (cont.). Example: [figure].
17 Gradient and directional derivatives (cont.). The level curves of a function f(x, y) are curves such that f(x, y) = k. Thus, the directional derivative along a level curve is

D_u f(x, y) = ∇f(x, y)^T u = 0

and the gradient vector is perpendicular to the level curve.
18 Gradient and directional derivatives (cont.). The gradient of a cost function is a vector with the dimension of w that points in the direction of maximum E increase, and with a magnitude equal to the slope of the tangent of the cost function along that direction. Can the slope be negative?
19 Gradient illustration. [Figure: E(w) with minimum E_min at w = w*; at w(0), the gradient is dE/dw = lim_{Δw→0} [E(w(0) + Δw) − E(w(0))] / Δw.]
20 Gradient descent. Minimize the cost function via gradient (steepest) descent, a case of hill-climbing:

w(n + 1) = w(n) − η ∇E(w(n))

n: iteration number; η: learning rate. See the previous figure.
21 Gradient descent (cont.). For the mean-square-error cost function and linear neurons,

E(n) = (1/2) e²(n) = (1/2) [d(n) − w(n) x(n)]²

∂E/∂w = e(n) ∂e(n)/∂w = −e(n) x(n)
22 Gradient descent (cont.). Hence

w(n + 1) = w(n) + η e(n) x(n) = w(n) + η [d(n) − w(n) x(n)] x(n)

This is the least-mean-square (LMS) algorithm, or the Widrow-Hoff rule.
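A minimal online LMS loop, assuming a made-up one-variable stream d = 2x + noise; the learning rate and iteration count are arbitrary choices for this toy problem, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(1)
eta = 0.1   # learning rate, chosen by hand
w = 0.0     # initial weight; bias omitted (b = 0), as in the slides' example

# One Widrow-Hoff update per incoming sample
for n in range(2000):
    x = rng.uniform(-1, 1)
    d = 2.0 * x + 0.01 * rng.standard_normal()
    e = d - w * x          # e(n) = d(n) - w(n) x(n)
    w = w + eta * e * x    # w(n+1) = w(n) + eta e(n) x(n)
print(w)  # approximately 2.0
```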
23 Stochastic gradient descent. If the cost function is of the form

E = Σ_{i=1}^N E_i

then one gradient descent step requires computing

Δw = −η Σ_{i=1}^N ∇E_i

which means computing E_i (or its gradient) for every data point. Many steps may be required to reach an optimum.
24 Stochastic gradient descent (cont.). It is generally much more computationally efficient to use

Δw = −η Σ_{i=n}^{n+b−1} ∇E_i

for small values of b. This update rule may converge in many fewer passes through the data (epochs).
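A sketch of the minibatch variant on invented data. Here the per-batch gradient is averaged rather than summed, a common variant (my choice, not the slides') that keeps the scale of η independent of b:

```python
import numpy as np

rng = np.random.default_rng(2)
N, batch = 1000, 32
x = rng.uniform(-1, 1, N)
d = 2.0 * x + 0.01 * rng.standard_normal(N)  # made-up data, true slope 2

eta, w = 0.1, 0.0
for epoch in range(20):
    order = rng.permutation(N)            # reshuffle once per pass (epoch)
    for start in range(0, N, batch):
        idx = order[start:start + batch]
        e = d[idx] - w * x[idx]
        w = w + eta * np.mean(e * x[idx])  # minibatch step, gradient averaged over b samples
print(w)  # approximately 2.0
```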
25 Stochastic gradient descent: example. [Figure.]
26 Stochastic gradient descent: error functions. [Figure.]
27 Stochastic gradient descent: gradients. [Figure.]
28 Stochastic gradient descent: animation. [Figure.]
29 Gradient descent: animation. [Figure.]
30 Multi-variable LMS. The analysis for the one-variable case extends to the multivariable case:

E = (1/2) [d − w^T x]²

∇E = [∂E/∂w_0, ∂E/∂w_1, …, ∂E/∂w_m]^T

where w_0 = b (bias) and x_0 = 1, as done for perceptron learning.
31 Multi-variable LMS (cont.). The LMS algorithm:

w(n + 1) = w(n) − η ∇E(n) = w(n) + η e(n) x(n) = w(n) + η [d(n) − w^T(n) x(n)] x(n)
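The vector LMS update can be sketched as follows, with an invented target weight vector and x_0 = 1 carrying the bias, as on the slide:

```python
import numpy as np

rng = np.random.default_rng(3)
w_true = np.array([0.5, 2.0, -1.0])  # [bias, w1, w2], invented target

eta = 0.05
w = np.zeros(3)
for n in range(5000):
    x = np.concatenate(([1.0], rng.uniform(-1, 1, 2)))  # x_0 = 1 for the bias
    d = w_true @ x + 0.01 * rng.standard_normal()
    e = d - w @ x          # scalar error e(n)
    w = w + eta * e * x    # vector LMS update
print(w)  # approximately [0.5, 2.0, -1.0]
```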
32 LMS algorithm remarks.
- The LMS rule is exactly the same equation as the perceptron learning rule.
- Perceptron learning is for nonlinear (McCulloch-Pitts) neurons, whereas LMS learning is for linear neurons; i.e., perceptron learning is for classification and LMS is for function approximation.
- LMS should be less sensitive to noise in the input data than perceptrons. On the other hand, LMS learning converges slowly.
- Newton's method changes weights in the direction of the minimum of E and leads to fast convergence, but it is not online and is computationally expensive.
33 Stability of adaptation. When η is too small, learning converges slowly. When η is too large, learning doesn't converge.
34 Learning rate annealing. Basic idea: start with a large rate but gradually decrease it. Stochastic approximation:

η(n) = c / n

where c is a positive parameter.
35 Learning rate annealing (cont.). Search-then-converge:

η(n) = η_0 / (1 + n/τ)

η_0 and τ are positive parameters. When n is small compared to τ, the learning rate is approximately constant. When n is large compared to τ, the schedule roughly follows stochastic approximation.
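The search-then-converge schedule is easy to tabulate; η_0 and τ below are arbitrary illustrative values:

```python
# Search-then-converge schedule: eta(n) = eta0 / (1 + n / tau)
eta0, tau = 0.5, 100.0

def eta(n):
    return eta0 / (1.0 + n / tau)

# Early on (n << tau) the rate is nearly constant ...
print(eta(0), eta(10))  # 0.5 and 0.5/1.1

# ... while for n >> tau it behaves like stochastic approximation, eta ~ c/n
# with c = eta0 * tau
print(eta(10_000), (eta0 * tau) / 10_000)  # both approximately 0.005
```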
36 Rate annealing illustration. [Figure.]
37 Nonlinear neurons. To extend the LMS algorithm to nonlinear neurons, consider a differentiable activation function φ. At iteration n,

E = (1/2) [d − φ(Σ_j w_j x_j)]²
38 Nonlinear neurons (cont.). By the chain rule of differentiation,

∂E/∂w_j = (∂E/∂e)(∂e/∂v)(∂v/∂w_j) = −e φ′(v) x_j

where v = Σ_j w_j x_j and e = d − φ(v).
39 Nonlinear neurons (cont.). Gradient descent gives

w_j(n + 1) = w_j(n) + η e(n) φ′(v(n)) x_j(n) = w_j(n) + η δ(n) x_j(n)

The above is called the delta (δ) rule. If we choose a logistic sigmoid for φ,

φ(v) = 1 / (1 + exp(−av))

then

φ′(v) = a φ(v) [1 − φ(v)] (see textbook)
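A sketch of the delta rule with the logistic sigmoid: it first checks φ′(v) = aφ(v)(1 − φ(v)) numerically, then takes one update step on invented data (all names and values are made up for illustration):

```python
import numpy as np

a = 1.0
phi = lambda v: 1.0 / (1.0 + np.exp(-a * v))  # logistic sigmoid

# Verify phi'(v) = a * phi(v) * (1 - phi(v)) by central differences
for v in (-2.0, 0.0, 1.5):
    numeric = (phi(v + 1e-6) - phi(v - 1e-6)) / 2e-6
    assert abs(numeric - a * phi(v) * (1.0 - phi(v))) < 1e-8

# One delta-rule step: w_j += eta * delta * x_j, with delta = e * phi'(v)
eta = 0.5
w = np.zeros(3)
x = np.array([1.0, 0.3, -0.7])  # x_0 = 1 is the bias input
d = 0.8                         # made-up target in (0, 1)
v = w @ x
e = d - phi(v)
delta = e * a * phi(v) * (1.0 - phi(v))
w = w + eta * delta * x
print(w)
```

At v = 0 the sigmoid's slope is at its maximum (a/4), so this first step is as large as the delta rule ever makes it for a given error.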
40 Role of the activation function φ. [Figure: φ(v) and φ′(v) versus v.] The role of φ′: the weight update is most sensitive when v is near zero.