15-381: Artificial Intelligence. Regression and neural networks (NN)


1 15-381: Artificial Intelligence. Regression and neural networks (NN)

2 Mimicking the brain. In the early days of AI there was a lot of interest in developing models that can mimic human thinking. While no one knew exactly how the brain works (and, even though there has been a lot of progress since, there is still little known), some of the basic computational units were known. A key component of these units is the neuron.

3 The neuron. A cell in the brain. Highly connected to other neurons. Thought to perform computations by integrating signals from other neurons. Outputs of these computations may be transmitted to one or more neurons.

4 What can we do with NN? Classification - we already mentioned many useful applications. Regression - input: real valued variables; output: one or more real values. Examples: predict the price of Google's stock from Microsoft's stock price; predict distance to an obstacle from various sensors.

5 Linear regression. Given an input x we would like to compute an output y. In linear regression we assume that x and y are related by the following equation: y = wx + ε, where w is a parameter and ε represents measurement or other noise.

6 Multivariate regression: Least squares. We can re-write multivariate regression as y = Xw. The solution is: w = (X'X)^{-1} X'y. This is an instance of a larger set of computational solutions which are usually referred to as generalized least squares. X'X is a k by k matrix; X'y is a vector with k entries.

7 Multivariate regression: Least squares. We can re-write our model as y = Xw. The solution turns out to be: w = (X'X)^{-1} X'y. We need to invert a k by k matrix; this takes O(k^3). Depending on k this can be rather slow.
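As an illustration (not part of the original slides), here is a minimal numpy sketch of the closed-form solution above; X is an n-by-k matrix whose rows are the inputs and y is the length-n vector of outputs:

import numpy as np

def least_squares(X, y):
    """Closed-form multivariate regression: w = (X'X)^{-1} X'y.
    Solving the k-by-k system costs O(k^3) in k."""
    # np.linalg.solve is used instead of an explicit inverse for numerical stability
    return np.linalg.solve(X.T @ X, X.T @ y)

# toy usage: 100 examples, 3 input components
X = np.random.randn(100, 3)
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * np.random.randn(100)
w = least_squares(X, y)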

8 Where we are. Linear regression: solved! But: the solution may be slow, and it does not address general regression problems of the form y = f(x).

9 Back to NN: Perceptron. The basic processing unit of a neural net: y = f(Σ_{i=0..k} w_i x_i).
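A toy sketch of this unit in code (illustrative only; f can be any activation function, and with the identity it reduces to a linear unit):

import numpy as np

def perceptron_output(w, x, f=lambda s: s):
    """y = f(sum_i w_i * x_i); the default identity f gives a linear unit."""
    return f(np.dot(w, x))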

10 Linear regression. Let's start by setting f to the identity, so y = Σ_{i=0..k} w_i x_i. We are back to linear regression. Unlike our normal linear regression solution, for perceptrons we will use a different strategy. Why? We will discuss this later; for now let's focus on the solution.

11 Gradient descent. z = f(w) = (y - wx)^2. Slope: ∂z/∂w ≈ Δz/Δw. Going in the opposite direction to the slope will lead to a smaller z. But not too much, otherwise we would go beyond the optimal w.

12 Gradient descent. Going in the opposite direction to the slope will lead to a smaller z. But not too much, otherwise we would go beyond the optimal w. We thus update the weights by setting: w ← w - λ ∂z/∂w, where λ is a small constant which is intended to prevent us from passing the optimal w.
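The update rule above, written out as a sketch for a single weight (illustration only; lam stands for λ):

def gradient_step(w, grad_z, lam=0.01):
    """Move w a small step against the slope dz/dw; lam keeps us from overshooting."""
    return w - lam * grad_z

# example: minimize z = (y - w*x)^2 for a single data point
x, y, w = 2.0, 3.0, 0.0
for _ in range(100):
    grad_z = -2.0 * (y - w * x) * x   # dz/dw
    w = gradient_step(w, grad_z)
# w converges towards y/x = 1.5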

13 Example. When choosing the right λ we get a monotonically decreasing error as we perform more updates.

14 Gradient descent for linear regression. We compute the gradient of the error w.r.t. each w_i. If we have n measurements then E(w) = Σ_j (y_j - Σ_i w_i x_{i,j})^2 and ∂E/∂w_i = -2 Σ_j (y_j - Σ_i w_i x_{i,j}) x_{i,j}. Here x_{i,j} is the i-th value of the j-th input vector.

15 Gradient descent for linear regression. If we have n measurements then ∂E/∂w_i = -2 Σ_j (y_j - Σ_i w_i x_{i,j}) x_{i,j}. Set δ_j = y_j - Σ_i w_i x_{i,j}. Then our update rule can be written as w_i ← w_i + 2λ Σ_j δ_j x_{i,j}.

16 Gradient descent algorithm for linear regression. 1. Choose λ. 2. Start with a guess for w. 3. Compute δ_j = y_j - Σ_i w_i x_{i,j} for all j. 4. For all i set w_i ← w_i + 2λ Σ_j δ_j x_{i,j}. 5. If no improvement in Σ_j (y_j - Σ_i w_i x_{i,j})^2, stop. Otherwise go to step 3.
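A compact numpy sketch of steps 1-5 (an illustration, not the course's reference code); the rows of X are the input vectors x_j and y holds the measurements y_j:

import numpy as np

def gd_linear_regression(X, y, lam=1e-3, max_iters=1000, tol=1e-8):
    """Batch gradient descent for linear regression (steps 1-5 above)."""
    n, k = X.shape
    w = np.zeros(k)                      # step 2: initial guess for w
    prev_err = np.inf
    for _ in range(max_iters):
        delta = y - X @ w                # step 3: delta_j for all j
        w = w + 2 * lam * X.T @ delta    # step 4: w_i <- w_i + 2*lam*sum_j delta_j x_ij
        err = np.sum((y - X @ w) ** 2)   # step 5: squared error
        if prev_err - err < tol:         # no improvement -> stop
            break
        prev_err = err
    return w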

17 Example. (Figure only; the plot does not survive in this transcription.)

18 Gradient descent vs. matrix inversion. Advantages of the matrix version: no iterations; no need to specify parameters; closed form solution in a predictable time. Advantages of gradient descent: applicable regardless of the number of parameters; general, applies to other forms of regression.

19 Perceptrons for classification. So far we discussed regression. However, perceptrons can also be used for classification. For example, output 1 if y > 0 and -1 otherwise. Problem?

20 Perceptrons for classification. So far we discussed regression. However, perceptrons can also be used for classification. Outputs are either 0 or 1. We predict 1 if y > 1/2 and 0 otherwise. Problem? (Figure: the best least squares fit vs. the best classifier.)

21 The sigmoid function. To classify using a perceptron we replace the linear function with the sigmoid function: h(x) = 1 / (1 + e^{-x}). Using the sigmoid we would minimize Σ_j (y_j - h(Σ_i w_i x_{i,j}))^2, where y_j is either 0 or 1 depending on the class.
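The sigmoid and the squared-error objective above, as a small sketch (illustrative only):

import numpy as np

def sigmoid(x):
    """h(x) = 1 / (1 + e^{-x}); squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def squared_error(w, X, y):
    """sum_j (y_j - h(w . x_j))^2, with each y_j in {0, 1}."""
    return np.sum((y - sigmoid(X @ w)) ** 2)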

22 Gradient descent with sigmoid. Once we defined our target function, we can minimize it using gradient descent. This involves some math, and relies on the following derivation*: h'(x) = h(x)(1 - h(x)). So, ∂/∂w_i Σ_j (y_j - h(Σ_i w_i x_{i,j}))^2 = -2 Σ_j (y_j - h(Σ_i w_i x_{i,j})) h(Σ_i w_i x_{i,j}) (1 - h(Σ_i w_i x_{i,j})) x_{i,j}. *I have included a derivation of this at the end of the lecture notes.

23 Gradient descent with sigmoid. Writing z_j = Σ_i w_i x_{i,j}, the gradient is ∂E/∂w_i = -2 Σ_j (y_j - h(z_j)) h(z_j) (1 - h(z_j)) x_{i,j}. Set δ_j = (y_j - h(z_j)) h(z_j) (1 - h(z_j)). So our update rule is: w_i ← w_i + 2λ Σ_j δ_j x_{i,j}.

24 Revised algorithm for sigmoid regression. 1. Choose λ. 2. Start with a guess for w. 3. Compute δ_j = (y_j - h(z_j)) h(z_j) (1 - h(z_j)) for all j. 4. For all i set w_i ← w_i + 2λ Σ_j δ_j x_{i,j}. 5. If no improvement in Σ_j (y_j - h(Σ_i w_i x_{i,j}))^2, stop. Otherwise go to step 3.
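The revised algorithm differs from the linear case only in how δ_j is computed; a minimal sketch under the same assumptions as the earlier code:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gd_sigmoid_regression(X, y, lam=1e-2, max_iters=5000, tol=1e-8):
    """Gradient descent on the squared error of a single sigmoid unit (steps 1-5 above)."""
    n, k = X.shape
    w = np.zeros(k)                                  # step 2
    prev_err = np.inf
    for _ in range(max_iters):
        h = sigmoid(X @ w)
        delta = (y - h) * h * (1.0 - h)              # step 3: delta_j
        w = w + 2 * lam * X.T @ delta                # step 4
        err = np.sum((y - sigmoid(X @ w)) ** 2)      # step 5
        if prev_err - err < tol:
            break
        prev_err = err
    return w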

25 Multilayer neural networks. So far we discussed networks with one layer. But these networks can be extended to combine several layers, increasing the set of functions that can be represented using a NN. The intermediate layer is often called the hidden layer. (Network diagram: inputs x_1, x_2 feed hidden units v_1(x), v_2(x) through weights w_{k,i}; the hidden units feed the output y through weights w_1, w_2.)

26 Learning the parameters for multilayer networks. Gradient descent works by connecting the output to the inputs. But how do we use it for a multilayer network? We need to account for both the output weights and the hidden layer weights.

27 Learning the parameters for multilayer networks. It's easy to compute the update rule for the output weights w_1 and w_2: w_i ← w_i + 2λ Σ_j δ_j v_i(x_j), where δ_j = y_j - Σ_i w_i v_i(x_j).

28 Learning the parameters for multilayer networks. It's easy to compute the update rule for the output weights w_1 and w_2: w_i ← w_i + 2λ Σ_j δ_j v_i(x_j), where δ_j = y_j - Σ_i w_i v_i(x_j). But what is the error associated with each of the hidden layer states?

29 Backpropagation. A method for distributing the error among hidden layer states. Using the error for each of these states we can employ gradient descent to update them. Set δ_{j,i} = δ_j w_i (output error times weight).

30 Backpropagation. A method for distributing the error among hidden layer states. Using the error for each of these states we can employ gradient descent to update them. Set δ_{j,i} = δ_j w_i. Our update rule changes to: w_{k,i} ← w_{k,i} + 2λ Σ_j δ_{j,i} v_i(x_j) (1 - v_i(x_j)) x_{k,j}.

31 Backpropagation. The correct error term for each hidden state can be determined by taking the partial derivative of the global error function w.r.t. each of the weight parameters of the hidden layer*: w_{k,i} ← w_{k,i} - λ ∂Err/∂w_{k,i} = w_{k,i} + 2λ Σ_j δ_{j,i} v_i(x_j) (1 - v_i(x_j)) x_{k,j}. *See the R&N book for details.

32 Revised algorithm for a multilayered neural network. 1. Choose λ. 2. Start with a guess for the weights w_i and w_{k,i}. 3. Compute the values v_i(x_j) for all hidden layer states and inputs. 4. Compute δ_j = y_j - Σ_i w_i v_i(x_j) for all j. 5. Compute δ_{j,i} = δ_j w_i. 6. For all i set w_i ← w_i + 2λ Σ_j δ_j v_i(x_j). 7. For all k and i set w_{k,i} ← w_{k,i} + 2λ Σ_j δ_{j,i} v_i(x_j) (1 - v_i(x_j)) x_{k,j}. 8. If no improvement in the error, stop. Otherwise go to step 3.
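Steps 1-8 for a network with one hidden sigmoid layer and a linear output, sketched in numpy (an illustration under these assumptions, not the course's reference implementation):

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_two_layer(X, y, n_hidden=2, lam=1e-2, max_iters=5000, tol=1e-8):
    """One hidden sigmoid layer, linear output; batch gradient descent with backpropagation."""
    n, k = X.shape
    rng = np.random.default_rng(0)
    W_hid = 0.1 * rng.standard_normal((k, n_hidden))  # step 2: hidden weights w_{k,i}
    w_out = 0.1 * rng.standard_normal(n_hidden)       # step 2: output weights w_i
    prev_err = np.inf
    for _ in range(max_iters):
        V = sigmoid(X @ W_hid)                        # step 3: hidden values v_i(x_j)
        delta = y - V @ w_out                         # step 4: output errors delta_j
        delta_hid = np.outer(delta, w_out)            # step 5: delta_{j,i} = delta_j * w_i
        w_out = w_out + 2 * lam * V.T @ delta         # step 6: output weight update
        # step 7: hidden weight update, using the sigmoid derivative v (1 - v)
        W_hid = W_hid + 2 * lam * X.T @ (delta_hid * V * (1.0 - V))
        err = np.sum((y - sigmoid(X @ W_hid) @ w_out) ** 2)  # step 8: error
        if prev_err - err < tol:
            break
        prev_err = err
    return W_hid, w_out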

33 Examples. Figure: a feedforward ANN designed and tested for prediction of tactical air combat maneuvers.

34 What you should know. Linear regression - solving a linear regression problem. Gradient descent. Perceptrons - sigmoid functions for classification. Multilayered neural networks - backpropagation.

35 Deriving h'(x). Recall that h(x) is the sigmoid function, so h(x) = 1 / (1 + e^{-x}). The derivation of h'(x) is below.
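The derivation itself did not survive the transcription; for completeness, the standard steps are (in LaTeX):

h'(x) = \frac{d}{dx}\left(1+e^{-x}\right)^{-1}
      = \frac{e^{-x}}{\left(1+e^{-x}\right)^{2}}
      = \frac{1}{1+e^{-x}} \cdot \frac{e^{-x}}{1+e^{-x}}
      = h(x)\,\bigl(1 - h(x)\bigr)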
