ME 537: Learning-Based Control. Week 1, Lecture 2: Neural Network Basics.


1 ME 537: Learning-Based Control
Week 1, Lecture 2: Neural Network Basics
Announcements:
- HW 1 due on 10/8
- Data sets for HW 1 are online
- Project selection 10/11
Suggested reading: NN survey paper (Zhang); Chapters 1, 2 and Sections 4.1 to 4.5 in Passino
Oregon State University

Learning From Data: $y = f(x) = ax + b$ (linear regression, parametric)

2 Learning From Data: $y = f(x) = ax + b$??? $y = f(x) = ax^2 + bx + c$ (polynomial regression, parametric)

Learning From Data: $y = f(x) = a_n x^n + a_{n-1} x^{n-1} + \dots + a_2 x^2 + a_1 x + a_0$ (polynomial regression, parametric; a fit is sketched below)
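
To make the parametric-fit idea concrete, here is a minimal numpy sketch; the synthetic data and the degree choice are assumptions for illustration, not from the lecture:

```python
import numpy as np

# Synthetic data: noisy samples of an unknown function (invented for illustration).
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 50)
y = 0.5 * x**2 - 0.3 * x + 0.1 + 0.05 * rng.standard_normal(x.shape)

# Parametric fit: choose the model family (the degree) up front,
# then solve for the coefficients a_n, ..., a_0 by least squares.
coeffs = np.polyfit(x, y, deg=2)   # returns [a_2, a_1, a_0]
y_hat = np.polyval(coeffs, x)      # evaluate the fitted polynomial
print(coeffs)
```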

3 Learning From Data: $y = f(x) = {?}{?}$

Neural Networks for Nonlinear Control
Motivation: control a system with nonlinear dynamics (robot, satellite, air vehicle).
Do we know what the good control strategies are?
- Yes: teach a neural network those strategies (drive a car and record good driver actions for each state; fly a helicopter and record good pilot actions for each state).
- No: have a neural network discover those strategies (let the car drive around and provide feedback on performance).

4 Neural Networks (outline):
- Why neural networks?
- Units or neurons
- Neural network architectures
- Activation functions
- Single layer feed forward networks
- Multi layer feed forward networks
- Error backpropagation
- Implementation issues

Why Neural Networks?
- Neural network: a massively parallel distributed processor made up of simple processing units. It stores knowledge.
- An artificial neural network is similar to the brain in that knowledge is acquired by the network from its environment through a learning process, and interneuron connection strengths (synaptic weights) are used to store the acquired knowledge.
- An artificial neural network is different from the brain in a thousand ways!
- Think of a neural network as a statistical tool.

5 Benefits of Neural Networks
- Performs an input/output mapping and can be trained from examples (nonlinear regression++; the functional form of the mapping need not be known)
- Is adaptive to changing environments (tracks nonstationarity)
- Provides a probabilistic response (confidence in the solution)
- Results in fault tolerant computing (graceful degradation)

Input / Output Mapping
- Supervised learning: learning with a set of labeled examples; each example has an input and a desired output.
- Training: present the input; compute the output; compare the network output to the desired output; update the network weights to minimize the error (see the sketch below).
- When the weights are stable, the network has learned an input/output mapping.
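
The training procedure above can be written as a short loop. This is a minimal sketch in which `model`, `update`, and `examples` are placeholder names, not course code:

```python
# Supervised training loop: present, compute, compare, update.
def train(model, update, examples, epochs=100):
    for _ in range(epochs):
        for x, t in examples:        # present input with its desired output
            y = model(x)             # compute network output
            e = t - y                # compare to desired output
            update(model, x, e)      # adjust weights to reduce the error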

6 Types of Learning
- Learning rules: Hebbian, memory based, competitive, gradient descent
- Learning paradigms: supervised, critic (reinforcement learning), unsupervised

Hebbian Learning
- If two neurons are activated at the same time, strengthen the weight between them (Hebb, 1949). A minimal update rule is sketched below.
- Properties: highly local, time dependent, interactive.
- Appeal: evidence for biological plausibility.
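
A minimal Hebbian update sketch, assuming the standard product-of-activities form of the rule (variable names are invented for illustration):

```python
import numpy as np

# When pre-synaptic activity x and post-synaptic activity y coincide,
# the connecting weight is strengthened in proportion to their product.
def hebbian_step(w, x, y, eta=0.01):
    return w + eta * x * y   # delta w proportional to x * y

w = np.zeros(3)
w = hebbian_step(w, x=np.array([1.0, 0.0, 1.0]), y=1.0)
print(w)
```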

7 Memory Based Learning
- Explicitly store experiences (patterns) in memory.
- When a new pattern is observed, find the stored patterns in the neighborhood of the test pattern.
- Example: the nearest neighbor algorithm. For each new unseen pattern, find the closest (or closest K) patterns in memory and assign the new pattern to the class most frequently represented in the neighborhood (sketched below).
- Slow recall (search through all stored patterns).

Competitive Learning
- Only neurons winning some competition are updated.
- Basic elements: all neurons start the same; there is a limit on the total strength of each neuron; there is a mechanism for neurons to compete. The winner is called the winner-takes-all neuron.
- Example: neurons represent concentrations of data. For each pattern, the winning neuron is modified to be closer to that particular pattern; neurons form clumps to represent the different data clusters.
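
A minimal nearest-neighbor sketch of the procedure above; the function name and the tiny data set are invented for illustration:

```python
import numpy as np

def nearest_neighbor_classify(x, stored_patterns, stored_labels, k=1):
    """Classify x by the majority label among its k closest stored patterns.
    Recall is slow because every stored pattern must be searched."""
    dists = np.linalg.norm(stored_patterns - x, axis=1)  # distance to each memory
    nearest = np.argsort(dists)[:k]                      # indices of k closest
    labels, counts = np.unique(stored_labels[nearest], return_counts=True)
    return labels[np.argmax(counts)]                     # most frequent class

# Tiny usage example (data invented for illustration):
patterns = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
labels = np.array([0, 0, 1, 1])
print(nearest_neighbor_classify(np.array([0.2, 0.1]), patterns, labels, k=3))
```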

8 Gradient Descent
- Update weights to minimize error.
- Take steps proportional to the negative of the derivative.
- More later.

Model of a Neuron
- Each input is a product of some signal (output) and a weight.
- All incoming inputs are summed.
- The sum goes through an activation function.
- The output is sent out to the network (a one-unit sketch follows).
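
A minimal sketch of the neuron model just described; the sigmoid default and the sample numbers are assumptions for illustration:

```python
import numpy as np

def neuron(x, w, w0, f=lambda a: 1.0 / (1.0 + np.exp(-a))):
    """One unit: weighted sum of inputs plus bias, passed through activation f."""
    a = np.dot(w, x) + w0   # sum of (signal * weight) terms, plus bias
    return f(a)             # output sent on to the rest of the network

print(neuron(np.array([1.0, 2.0]), np.array([0.5, -0.25]), 0.1))
```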

9 Activation Functions
[Figure: plots of common activation functions]

Neural Network Architectures
[Figure: example network topologies]

10 Neural Network Architectures
[Figure: further network topologies]

Single Layer Feed Forward Networks
- Input $x$ is an $m$-element input vector
- Target $t$ is the desired output (can be a vector)
- Output $y$ is the response to $x$
- Error $e$ is the difference between the desired and network outputs: $e = t - y$
[Diagram: inputs $x_1, x_2, \dots, x_m$ with weights $w_1, w_2, \dots, w_m$ and bias $w_0$ feeding a single unit that produces $y$]

11 Single Layer Feed Forward Networks
- Linear discrimination: $y = \sum_{k=1}^{m} w_k x_k + w_0$
- Logistic discrimination: $y = f\big(\sum_{k=1}^{m} w_k x_k + w_0\big)$
[Diagrams: the same single-unit network, once with a summation output (linear) and once with a sigmoid output $f(\cdot)$ (logistic)]

12 Learning From Data: Representation
[Diagrams: $y = f(x) = ax + b$ drawn as a one-unit network with inputs $x$ and $1$ and weights $a$ and $b$; $y = f(x) = ax^2 + bx + c$ drawn with a hidden unit $h(x) = x^2$; a general two-layer network with sigmoid hidden units $h(x) = 1/(1+e^{-x})$ and weights $w_0, \dots, w_6$]

13 Learning From Data: Representation
[Diagram: the same sigmoid network representing $y = f(x) = {?}{?}$]

Single Layer Feed Forward Networks
- $N$ patterns $(x^n, t^n)$
- Mean square error: $E = \frac{1}{2} \sum_{n=1}^{N} (t^n - y^n)^2$

14 Single Layer Feed Forward Networks
- Error on pattern $n$: $e^n = t^n - y^n$
- Mean square error: $E = \frac{1}{2} \sum_{n=1}^{N} (t^n - y^n)^2$

15 Single Layer Feed Forward Networks
- Error on pattern $n$: $e^n = t^n - y^n$
- Mean square error: $E = \frac{1}{2} \sum_{n=1}^{N} (t^n - y^n)^2$
- Least mean square algorithm: $\frac{\partial E}{\partial w} = \sum_n e^n \frac{\partial e^n}{\partial w}$
- Gradient descent: $\Delta w = -\eta \frac{\partial E}{\partial w}$ (a worked numpy sketch follows)
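
A minimal LMS-style gradient descent sketch for a single linear unit; the data, learning rate, and epoch count are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 3))            # N patterns, m inputs
t = X @ np.array([1.0, -2.0, 0.5]) + 0.3     # targets from a "true" linear map

w = np.zeros(3)
w0 = 0.0
eta = 0.05                                    # learning rate
for epoch in range(200):
    y = X @ w + w0                            # outputs for all patterns
    e = t - y                                 # errors e_n = t_n - y_n
    E = 0.5 * np.sum(e**2)                    # mean square error criterion
    # delta w = -eta dE/dw = eta sum_n e_n x_n,
    # averaged over patterns here for a stable step size.
    w += eta * (X.T @ e) / len(t)
    w0 += eta * e.mean()
print(w, w0, E)
```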

16 Gradient Descent: move in the direction of the negative derivative
[Figure: $E(w)$ plotted against $w_1$, with the slope $dE(w)/dw_1$ marked at the current $w_1$]
- If $dE(w)/dw_1 > 0$, then $w_1 \leftarrow w_1 - \eta \, dE(w)/dw_1$, i.e., the rule decreases $w_1$.

17 Gradient Descent: move in the direction of the negative derivative
[Figure: the same plot with a negative slope at the current $w_1$]
- If $dE(w)/dw_1 < 0$, then $w_1 \leftarrow w_1 - \eta \, dE(w)/dw_1$, i.e., the rule increases $w_1$.

18 Single Layer Feed Forward Networks
- Linear activation function:
$\frac{\partial E}{\partial w_{i,j}} = \sum_n e_i^n \frac{\partial e_i^n}{\partial w_{i,j}} = -\sum_n e_i^n \frac{\partial y_i^n}{\partial w_{i,j}} = -\sum_n e_i^n x_j^n$
[Diagram: the single-layer network with summation output]

19 Single Layer Feed Forward Networks
- Linear activation function:
$\frac{\partial E}{\partial w_{i,j}} = \sum_n e_i^n \frac{\partial e_i^n}{\partial w_{i,j}} = -\sum_n e_i^n \frac{\partial y_i^n}{\partial w_{i,j}} = -\sum_n e_i^n x_j^n$
- Weight update: $\Delta w_{i,j} = -\eta \frac{\partial E}{\partial w_{i,j}} = \eta \, e_i^n x_j^n$

ME 537: Learning-Based Control
Week 2, Lecture 1: Neural Network Basics II
Announcements:
- HW 1 due on 10/8
- Data sets for HW 1 are online
- Project selection 10/11
Suggested reading: Sections 9.2 & 11.1 in Passino
Oregon State University

20 Single Layer Feed Forward Networks
- Sigmoid activation function: $f(a) = \frac{1}{1 + e^{-a}}$
- Derivative of the sigmoid: $f'(a) = f(a)(1 - f(a))$
- Gradient descent:
$\frac{\partial E}{\partial w_{i,j}} = \sum_n e_i^n \frac{\partial e_i^n}{\partial w_{i,j}} = -\sum_n e_i^n \frac{\partial}{\partial w_{i,j}} f\big(\textstyle\sum_j w_{i,j} x_j^n\big)$
$\qquad = -\sum_n e_i^n \, f\big(\textstyle\sum_j w_{i,j} x_j^n\big)\big(1 - f\big(\textstyle\sum_j w_{i,j} x_j^n\big)\big) x_j^n = -\sum_n e_i^n y_i^n (1 - y_i^n) x_j^n$

21 Single Layer Feed Forward Networks
- Sigmoid activation function: $f(a) = \frac{1}{1+e^{-a}}$; derivative: $f'(a) = f(a)(1 - f(a))$
- Gradient descent (as above): $\frac{\partial E}{\partial w_{i,j}} = -\sum_n e_i^n y_i^n (1 - y_i^n) x_j^n$
- Weight update: $\Delta w_{i,j} = -\eta \frac{\partial E}{\partial w_{i,j}} = \eta \, e_i^n y_i^n (1 - y_i^n) x_j^n$ (sketched below)
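
A minimal sketch of one gradient step for a sigmoid unit, using $f'(a) = f(a)(1-f(a))$; the function names are invented and the bias term is omitted for brevity:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def sigmoid_unit_update(w, x, t, eta=0.1):
    """One gradient-descent step for a sigmoid unit; returns updated weights."""
    y = sigmoid(np.dot(w, x))
    e = t - y
    # delta w_j = eta * e * y * (1 - y) * x_j, from the derivation above
    return w + eta * e * y * (1.0 - y) * x
```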

22 Illustration of Gradient Descent
[Figure: the error surface $E(w)$ over the weight space $(w_0, w_1)$]

23 Illustration of Gradient Descent
[Figure: the same error surface; the direction of steepest descent is the direction of the negative gradient, taking the original point in weight space to a new point in weight space]

24 Multi Layer Feed Forward Networks
[Diagram: inputs $x_i$, hidden units $h_j$, outputs $y_k$, with input-hidden weights $v_{i,j}$ and hidden-output weights $w_{j,k}$]
$y_k = f\big(\textstyle\sum_j w_{j,k} h_j + w_{0,k}\big) = f\Big(\textstyle\sum_j w_{j,k} \, f\big(\sum_i v_{i,j} x_i + v_{0,j}\big) + w_{0,k}\Big), \qquad f(a) = \frac{1}{1+e^{-a}}$

25 Weight Updates
- Derivative of the error with respect to weight $w_{j,k}$:
$\frac{\partial E}{\partial w_{j,k}} = e_k \frac{\partial e_k}{\partial w_{j,k}} = -e_k \frac{\partial f_k}{\partial w_{j,k}} = -e_k f\big(\textstyle\sum_j w_{j,k} h_j\big)\big(1 - f\big(\textstyle\sum_j w_{j,k} h_j\big)\big) h_j = -e_k y_k (1 - y_k) h_j$
- Updating hidden-output layer weights: $\Delta w_{j,k} = -\eta \frac{\partial E}{\partial w_{j,k}} = \eta \, \delta_k h_j$
- Hidden-output layer deltas: $\delta_k = e_k y_k (1 - y_k)$

26 Multi Layer Feed Forward Networks
[Diagram: the same network]
$h_j = f\big(\textstyle\sum_i v_{i,j} x_i + v_{0,j}\big)$
What are the errors for the hidden layer? We don't know the targets. Now what?

27 Error Backpropagation
- Updating input-hidden layer weights: $\Delta v_{i,j} = \eta \, \delta_j x_i$
- Delta: $\delta_j = e_j f\big(\textstyle\sum_i v_{i,j} x_i\big)\big(1 - f\big(\sum_i v_{i,j} x_i\big)\big) = e_j h_j (1 - h_j) = \big(\textstyle\sum_k w_{j,k} \delta_k\big) h_j (1 - h_j)$
- Errors for the hidden layer: backpropagated deltas from the output layer.

28 Backpropagation Summary
- For sigmoidal activation functions, update any weight connecting input $i$ to output $j$: $\Delta w_{i,j} = \eta \, \delta_j x_i$
- Deltas are given by: for the output layer, $\delta_j = e_j y_j (1 - y_j)$

29 Backpropagation Summary
- For sigmoidal activation functions, update any weight connecting input $i$ to output $j$: $\Delta w_{i,j} = \eta \, \delta_j x_i$
- Deltas: for the output layer, $\delta_j = e_j y_j (1 - y_j)$; for a hidden layer, $\delta_j = \big(\sum_k w_{j,k} \delta_k\big) h_j (1 - h_j)$

Backpropagation Algorithm
For each epoch:
- Present pattern $x$ to the network
- Propagate the signal forward: compute hidden unit values, then output values
- Find the error
- Compute output layer deltas
- Compute hidden layer deltas
- Compute the gradient for each weight
- Update each weight
- Present the next pattern
Repeat this process until the MSE is satisfactory. (A compact sketch follows.)
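
A compact sketch of one backpropagation epoch for a two-layer sigmoid network; the array shapes and function name are assumptions for this sketch, not course code:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def backprop_epoch(X, T, V, W, eta=0.1):
    """X: (N, n_in) inputs, T: (N, n_out) targets,
    V: (n_in+1, n_hid) input-hidden weights, W: (n_hid+1, n_out) hidden-output."""
    for x, t in zip(X, T):
        xb = np.append(x, 1.0)                 # append bias input
        h = sigmoid(xb @ V)                    # hidden unit values
        hb = np.append(h, 1.0)
        y = sigmoid(hb @ W)                    # output values
        e = t - y                              # error
        delta_out = e * y * (1.0 - y)          # output layer deltas
        delta_hid = (W[:-1] @ delta_out) * h * (1.0 - h)  # backpropagated deltas
        W += eta * np.outer(hb, delta_out)     # update hidden-output weights
        V += eta * np.outer(xb, delta_hid)     # update input-hidden weights
    return V, W
```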

30 Radial Basis Function Networks
Key RBF differences:
- Local activation
- Linear output layer recommended
- Euclidean norm activation
- All hidden units are different functions
- One hidden layer
[Diagram: inputs $x_i$ feeding radial units $R_j$, connected by weights $w_{j,k}$ to outputs $y_k$]

Radial Basis Function Networks
- $c_j$ is the center of the $j$th radial basis function: $c_j = \{c_{j,1}, \dots, c_{j,i}, \dots, c_{j,N}\}$
- $\sigma_j$ is the radius of the $j$th radial basis function

31 Radial Basis Function Networks
$R_j(x) = \exp\!\left(-\frac{\|x - c_j\|^2}{2\sigma_j^2}\right)$
- $c_j$ is the center of the $j$th radial basis function: $c_j = \{c_{j,1}, \dots, c_{j,i}, \dots, c_{j,N}\}$
- $\sigma_j$ is the radius of the $j$th radial basis function
$y_k = f\big(\textstyle\sum_j w_{j,k} R_j + w_{0,k}\big) = f\!\left(\textstyle\sum_j w_{j,k} \exp\!\left(-\frac{\|x - c_j\|^2}{2\sigma_j^2}\right) + w_{0,k}\right)$

32 RBF Center Updates
- For quadratic distance: $\|x - c_j\|^2 = (x_1 - c_{j,1})^2 + \dots + (x_k - c_{j,k})^2 + \dots + (x_N - c_{j,N})^2$
- Center derivative:
$\frac{\partial R_j}{\partial c_{j,k}} = \frac{\partial}{\partial c_{j,k}} \exp\!\left(-\frac{\|x - c_j\|^2}{2\sigma_j^2}\right) = \exp\!\left(-\frac{\|x - c_j\|^2}{2\sigma_j^2}\right) \frac{(-2)(-1)}{2\sigma_j^2}(x_k - c_{j,k}) = R_j \frac{(x_k - c_{j,k})}{\sigma_j^2}$
- For a single output $y$ and a linear output layer:
$\frac{\partial E}{\partial c_{j,k}} = \frac{\partial E}{\partial e} \frac{\partial e}{\partial y} \frac{\partial y}{\partial R_j} \frac{\partial R_j}{\partial c_{j,k}} = e(-1) w_j \frac{\partial R_j}{\partial c_{j,k}} = -e \, w_j R_j \frac{(x_k - c_{j,k})}{\sigma_j^2}$
- Updating centers: $\Delta c_{j,k} = -\eta \frac{\partial E}{\partial c_{j,k}} = \eta \, e \, w_j R_j \frac{(x_k - c_{j,k})}{\sigma_j^2}$ (a sketch follows)
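
A minimal sketch of the RBF forward pass and one gradient step on the centers, for a single linear output; the function names and shapes (centers as a (J, n) array, sigmas and w as length-J vectors) are assumptions for illustration:

```python
import numpy as np

def rbf_forward(x, centers, sigmas, w, w0):
    """Linear-output RBF network: y = sum_j w_j R_j(x) + w_0."""
    R = np.exp(-np.sum((x - centers)**2, axis=1) / (2.0 * sigmas**2))
    return R @ w + w0, R

def rbf_center_step(x, t, centers, sigmas, w, w0, eta=0.05):
    """One gradient step on the centers, following
    delta c_{j,k} = eta * e * w_j * R_j * (x_k - c_{j,k}) / sigma_j^2."""
    y, R = rbf_forward(x, centers, sigmas, w, w0)
    e = t - y
    centers += eta * e * (w * R / sigmas**2)[:, None] * (x - centers)
    return centers
```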

33 Implementation Issues
- Training, testing and validation
- Network architecture and training
- Initial weights & parameter selection
- Local minima
- Momentum term for weights
- Network complexity
- Convergence
- Generalization
- Model complexity
- Universal approximator theorem

Training, Testing and Validation
- Training: using known samples to set parameters
- Testing: verifying that the learned mapping applies to unseen samples
- Validation: testing on held-out samples to tune parameters
- Generalization: ability to extend learning to new samples
- Example: 1000 data points. Use 600 for training (set parameters), 200 for validation (check performance, adjust parameters), and 200 for testing (generalization performance); a split is sketched below.
- Cross-validation: train and validate on data partitions. 4-fold cross validation means splitting the data into four parts and, for each combination, training on three quarters and validating on one fourth. All training data are used for training (unlike the 200 points left unused above), and the validation results remain valid (four validation sets instead of one above).
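
A minimal sketch of the 600/200/200 split described above; the data are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.standard_normal((1000, 5))
idx = rng.permutation(len(data))   # shuffle before splitting
train_set = data[idx[:600]]        # set parameters
val_set = data[idx[600:800]]       # check performance, adjust parameters
test_set = data[idx[800:]]         # report generalization performance
```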

34 Network Architecture and Training
- Architecture: feed forward network (2-layer FFN); neuron selection; activation functions
- Learning algorithm: gradient descent
- How many hidden units?
- How long should training last?

Initial Weights
- Random
- Seeded with special concepts
- Clustering (for RBF networks)

35 Local Minima
- The error surface can have multiple local minima.
- Gradient descent goes to the closest local minimum.
- Solution: random restarts from multiple places in weight space.

Momentum Term
- The weight update changes too fast with: $\Delta w_{i,j} = \eta \, \delta_j x_i$
- Let each update be closer to the last update, giving the gradient "momentum" (sketched below): $\Delta w_{i,j}^n = \beta \, \Delta w_{i,j}^{n-1} + \eta \, \delta_j x_i$
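
A minimal momentum-update sketch; the names `grad` (standing in for $\delta_j x_i$), `eta`, and `beta` are assumptions for illustration:

```python
# Momentum smooths the weight updates: each step keeps a fraction `beta`
# of the previous step, per dw_n = beta * dw_{n-1} + eta * delta * x.
def momentum_step(grad, prev_dw, eta=0.1, beta=0.9):
    dw = beta * prev_dw + eta * grad
    return dw
```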

36 Convergence
- Preset time: train for 2000 epochs.
- Preset error criterion: train until the MSE reaches a preset value.
- Relative error criterion: train until the MSE changes by less than 0.1% per epoch.
- Use some left-out patterns to validate training: when the validation error bottoms out, stop training (a sketch follows).

[Figure: generalization (test set) error and training set error versus training time]
- The training set error is reduced continuously.
- The test set error (generalization error) increases after a point: the network starts to learn the noise in the training data.
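
A minimal early-stopping sketch of the validation-based criterion above; `train_epoch` and `validation_error` are placeholder callables, and `patience` and `max_epochs` are assumed knobs:

```python
# Stop training when the held-out validation error bottoms out.
def train_with_early_stopping(train_epoch, validation_error,
                              patience=10, max_epochs=2000):
    best, waited = float("inf"), 0
    for _ in range(max_epochs):
        train_epoch()                 # one pass over the training patterns
        err = validation_error()      # error on the left-out patterns
        if err < best:
            best, waited = err, 0     # validation error still falling
        else:
            waited += 1
            if waited >= patience:    # error has bottomed out: stop training
                break
    return best
```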

37 Model Complexity: Universal Function Approximation
How good an approximator is a multi layer feed forward network?

Universal Approximation Theorem: under some assumptions, for any given constant $\varepsilon$ and continuous function $f(x_1, \dots, x_m)$, there exists a three-layer MLP with the property that
$|f(x_1, \dots, x_m) - H(x_1, \dots, x_m)| < \varepsilon$
where $H(x_1, \dots, x_m) = \sum_i v_i \, h\big(\sum_j w_{i,j} x_j + b_i\big)$ and $h(\cdot)$ is a nonlinear activation function.
