Feedforward neural network. IFT Réseaux neuronaux


1 Feedforward neural network IFT Réseaux neuronaux

2 ARTIFICIAL NEURON Topics: connection weights, bias, activation function. Neuron (input) pre-activation: a(x) = b + Σ_i w_i x_i = b + w^T x. Neuron (output) activation: h(x) = g(a(x)) = g(b + Σ_i w_i x_i), where the w_i are the connection weights, b is the neuron bias, and g(·) is called the activation function.
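To make the pre-activation and activation concrete, here is a minimal sketch in Python/NumPy (not part of the original slides); the function name `neuron_output` and the values of x, w and b are purely illustrative.

```python
import numpy as np

def neuron_output(x, w, b, g):
    """Single artificial neuron: pre-activation a(x) = b + w^T x, output h(x) = g(a(x))."""
    a = b + np.dot(w, x)   # input (pre-)activation
    return g(a)            # output activation

# Illustrative values (not from the slides)
x = np.array([0.5, -1.0, 2.0])
w = np.array([0.1, 0.4, -0.3])   # connection weights
b = 0.2                          # neuron bias
print(neuron_output(x, w, b, np.tanh))
```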

3 ARTIFICIAL NEURON Topics: connection weights, bias, activation function. [Figure: output of a single neuron over a 2D input; the output range is determined by g(·), and the bias b only changes the position of the riff. From Pascal Vincent's slides]

4 ARTIFICIAL NEURON Topics: linear activation function. Performs no input squashing. Not very interesting... g(a) = a.

5 ARTIFICIAL NEURON Topics: sigmoid activation function. Squashes the neuron's input between 0 and 1. Always positive. Bounded. Strictly increasing. g(a) = sigm(a) = 1 / (1 + exp(-a)).

6 ARTIFICIAL NEURON Topics: hyperbolic tangent (tanh) activation function. Squashes the neuron's input between -1 and 1. Can be positive or negative. Bounded. Strictly increasing. g(a) = tanh(a) = (exp(a) - exp(-a)) / (exp(a) + exp(-a)) = (exp(2a) - 1) / (exp(2a) + 1).

7 ARTIFICIAL NEURON Topics: rectified linear activation function. Bounded below by 0 (always non-negative). Not upper bounded. Monotonically increasing. Tends to give neurons with sparse activities. g(a) = reclin(a) = max(0, a).
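The four activation functions of the preceding slides can be written down directly; a small sketch (Python/NumPy assumed, not part of the slides):

```python
import numpy as np

def linear(a):
    return a                          # no input squashing

def sigm(a):
    return 1.0 / (1.0 + np.exp(-a))   # output in (0, 1)

def tanh(a):
    return np.tanh(a)                 # output in (-1, 1); equals (exp(2a) - 1) / (exp(2a) + 1)

def reclin(a):
    return np.maximum(0.0, a)         # bounded below by 0, gives sparse activities

a = np.linspace(-3, 3, 7)
for g in (linear, sigm, tanh, reclin):
    print(g.__name__, g(a))
```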

8 ARTIFICIAL NEURON Topics: capacity, decision boundary of neuron. Could do binary classification: with a sigmoid, we can interpret the neuron as estimating p(y=1|x); this is also known as the logistic regression classifier. If the output is greater than 0.5, predict class 1; otherwise, predict class 0. The decision boundary is linear (a similar idea can apply with tanh). (from Pascal Vincent's slides)
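A minimal sketch of that decision rule (NumPy assumed; `predict_class` and the weight values are illustrative, not from the slides). Since the sigmoid is monotonic, thresholding its output at 0.5 amounts to checking the sign of the pre-activation, which is why the boundary is a hyperplane.

```python
import numpy as np

def predict_class(x, w, b):
    """Binary classification with one sigmoid neuron (logistic regression)."""
    p = 1.0 / (1.0 + np.exp(-(b + np.dot(w, x))))  # estimate of p(y=1|x)
    return 1 if p > 0.5 else 0
    # p > 0.5 exactly when b + w^T x > 0, so the decision boundary is w^T x + b = 0

# Illustrative weights
print(predict_class(np.array([1.0, 2.0]), np.array([0.5, -0.25]), 0.1))
```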

9 ARTIFICIAL NEURON Topics: capacity of single neuron. Can solve linearly separable problems. [Figure: OR(x1, x2), AND(x1, x2) and AND(x̄1, x2) are all linearly separable]

10 ARTIFICIAL NEURON Topics: capacity of single neuron. Can't solve non-linearly separable problems. [Figure: XOR(x1, x2) is not linearly separable, unlike the AND functions above] ... unless the input is transformed into a better representation.
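As a worked example (hand-picked, illustrative parameters; the helper `neuron` is not from the slides), a single neuron with a hard-threshold activation can compute OR and AND, but no choice of weights and bias reproduces XOR:

```python
import numpy as np

def neuron(x, w, b):
    return 1 if np.dot(w, x) + b > 0 else 0   # hard-threshold activation

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]

print([neuron(np.array(x), np.array([1.0, 1.0]), -0.5) for x in inputs])  # OR  -> [0, 1, 1, 1]
print([neuron(np.array(x), np.array([1.0, 1.0]), -1.5) for x in inputs])  # AND -> [0, 0, 0, 1]
# XOR would need outputs [0, 1, 1, 0]; no single hyperplane separates
# {(0,1), (1,0)} from {(0,0), (1,1)}.
```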

11 NEURAL NETWORK Topics: single hidden layer neural network. Hidden layer (input) pre-activation: a(x) = b^(1) + W^(1) x, i.e. a(x)_i = b_i^(1) + Σ_j W_{i,j}^(1) x_j. Hidden layer activation: h(x) = g(a(x)). Output layer activation: f(x) = o(b^(2) + (w^(2))^T h(x)), where o(·) is the output activation function.
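Putting the three equations together, a minimal forward pass for a single-hidden-layer network (a sketch, NumPy assumed; the function name `forward`, the shapes and the random values are illustrative):

```python
import numpy as np

def forward(x, W1, b1, w2, b2, g=np.tanh, o=lambda a: 1.0 / (1.0 + np.exp(-a))):
    """Single hidden layer network: f(x) = o(b^(2) + w^(2)^T g(b^(1) + W^(1) x))."""
    a1 = b1 + W1 @ x        # hidden layer pre-activation
    h1 = g(a1)              # hidden layer activation
    a2 = b2 + w2 @ h1       # output pre-activation
    return o(a2)            # output activation

rng = np.random.default_rng(0)
x = rng.normal(size=4)
W1, b1 = rng.normal(size=(3, 4)), np.zeros(3)   # 4 inputs -> 3 hidden units
w2, b2 = rng.normal(size=3), 0.0                # 3 hidden units -> 1 output
print(forward(x, W1, b1, w2, b2))
```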

12 NEURAL NETWORK Topics: single hidden layer neural network. [Figure: diagram of a neural network ("réseau de neurones") with an input layer, a hidden layer and an output layer, showing the biases and the weights w_kj and w_ji; from Pascal Vincent's slides]

13 NEURAL NETWORK Topics: single hidden layer neural network. [Figure: the two layers seen as successive transformations of the input space, R^2 → R^2 → R; from Pascal Vincent's slides]

14 NEURAL NETWORK Topics: single hidden layer neural network. [Figure: hidden unit activations y_1, ..., y_4 combined by the output unit z; from Pascal Vincent's slides]

15 NEURAL NETWORK Topics: universal approximation. Universal approximation theorem (Hornik, 1991): "a single hidden layer neural network with a linear output unit can approximate any continuous function arbitrarily well, given enough hidden units". The result applies for the sigmoid, tanh and many other hidden layer activation functions. This is a good result, but it doesn't mean there is a learning algorithm that can find the necessary parameter values!
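To give a flavor of why this works (this is an illustration, not a proof, and the function `bump` with its parameters is an assumption for the example): two steep sigmoid hidden units combined by a linear output unit with weights +1 and -1 already form an approximate "bump", and sums of such bumps can approximate continuous functions.

```python
import numpy as np

def sigm(a):
    return 1.0 / (1.0 + np.exp(-a))

def bump(x, lo=0.3, hi=0.7, steepness=50.0):
    # Linear output unit over two sigmoid hidden units: +1 * sigm(...) - 1 * sigm(...)
    return sigm(steepness * (x - lo)) - sigm(steepness * (x - hi))

x = np.linspace(0, 1, 11)
print(np.round(bump(x), 2))   # close to 1 well inside [0.3, 0.7], close to 0 well outside
```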

16 NEURAL NETWORK Topics: softmax activation function. For multi-class classification: we need multiple outputs (1 output per class), and we would like to estimate the conditional probability p(y=c|x). We use the softmax activation function at the output: o(a) = softmax(a) = [exp(a_1)/Σ_c exp(a_c), ..., exp(a_C)/Σ_c exp(a_c)]^T. It is strictly positive and sums to one. The predicted class is the one with the highest estimated probability.
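A direct implementation of this output activation (NumPy assumed; subtracting the maximum is a standard numerical-stability trick that is not mentioned on the slide and does not change the result):

```python
import numpy as np

def softmax(a):
    """o(a)_c = exp(a_c) / sum_c' exp(a_c'): strictly positive, sums to one."""
    z = np.exp(a - np.max(a))   # shift by the max for numerical stability
    return z / z.sum()

a = np.array([1.0, 2.0, 0.5])
p = softmax(a)
print(p, p.sum(), np.argmax(p))   # predicted class = index with highest estimated probability
```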

17 NEURAL NETWORK Topics: multilayer neural network. Could have L hidden layers: layer pre-activation for k > 0: a^(k)(x) = b^(k) + W^(k) h^(k-1)(x), with h^(0)(x) = x; hidden layer activation (k from 1 to L): h^(k)(x) = g(a^(k)(x)); output layer activation (k = L+1): h^(L+1)(x) = o(a^(L+1)(x)) = f(x).
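The layer-wise recursion extends to L hidden layers with a simple loop; a sketch under the same assumptions as before (NumPy, illustrative shapes, the helper names `forward` and `softmax` are not from the slides):

```python
import numpy as np

def softmax(a):
    z = np.exp(a - np.max(a))
    return z / z.sum()

def forward(x, weights, biases, g=np.tanh, o=softmax):
    """h^(0) = x; a^(k) = b^(k) + W^(k) h^(k-1); h^(k) = g(a^(k)) for k <= L; f(x) = o(a^(L+1))."""
    h = x
    for k, (W, b) in enumerate(zip(weights, biases), start=1):
        a = b + W @ h
        h = g(a) if k < len(weights) else o(a)   # last layer uses the output activation
    return h

rng = np.random.default_rng(1)
sizes = [4, 5, 5, 3]   # input, two hidden layers, 3 output classes
weights = [rng.normal(size=(m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(m) for m in sizes[1:]]
print(forward(rng.normal(size=4), weights, biases))
```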

18 NEURAL NETWORK Topics: parallel with the visual cortex. [Figure: feature hierarchy going from edges to face parts (mouth, nose, eyes) to whole faces]

19 BIOLOGICAL NEURONS Topics: synapse, axon, dendrite. We estimate the number of neurons in the human brain at around 10^11: they receive information from other neurons through their dendrites; they process the information in their cell body (soma); they send information through a cable called an axon; the points of connection between the axon branches and other neurons' dendrites are called synapses.

20 BIOLOGICAL NEURONS Topics: synapse, axon, dendrite. [Figure: signal transmission, computation and signal reception in a neuron, labelling the synapses, axon, cell body, dendrites and the other neurons it connects to; from Hyvärinen, Hurri and Hoyer's book]

21 BIOLOGICAL NEURONS Topics: action potential, firing rate. An action potential is an electrical impulse that travels through the axon: this is how neurons communicate. It generates a spike in the electric potential (voltage) of the axon. An action potential is generated at a neuron only if it receives enough (over some threshold) of the right pattern of spikes from other neurons. Neurons can generate several such spikes every second: the frequency of the spikes, called the firing rate, is what characterizes the activity of a neuron. Neurons are always firing a little bit (the spontaneous firing rate), but they will fire more given the right stimulus.

22 BIOLOGICAL NEURONS Topics: action potential, firing rate. The firing rates of different input neurons combine to influence the firing rate of other neurons: depending on the dendrite and axon, a neuron can work either to increase (excite) or to decrease (inhibit) the firing rate of another neuron. This is what artificial neurons approximate: the activation corresponds to a sort of firing rate; the weights between neurons model whether neurons excite or inhibit each other; the activation function and bias model the thresholded behavior of action potentials.

23 BIOLOGICAL NEURONS Hubel & Wiesel experiment: http://youtube.com/watch?v=8vdff3egfg&feature=related

24 CONCLUSION We have seen the most common: activation functions; network topologies (layer-wise). We could easily have designed more complicated activation functions and topologies, and could get more inspiration from neuroscience. However, those discussed here tend to work fine.
