Introduction to Artificial Neural Networks (EE-589). Associate Prof. Dr. Turgay IBRIKCI.


1 Introduction to Artificial Neural Networks, EE-589

Who am I: Associate Prof. Dr. Turgay IBRIKCI, Room #305, Tuesdays 3:30-6:00, (322) / 39, eembdersleri.wordpress.com

What you learn from the course:
- How to approach a neural network learning, classification or clustering problem
- Basic knowledge of the common linear machine learning algorithms
- Basic knowledge of neural network learning algorithms
- A good understanding of neural network algorithms

Course Outline: The course is divided into two parts, theory and practice. 1. Theory covers basic topics in neural network theory and its application to supervised, unsupervised and reinforcement learning. 2. Practice deals with the basics of MATLAB and the implementation of NN learning algorithms. We assume that you know MATLAB or will learn it yourself.

Course Grading:
- Project 40%: Proposal 5% (due 13/03/2015), Project paper 35% (on CD, in paper publication format), Presentation (Week 14, 20 minutes)
- Final Exam 20% (Week 15, we decide together)
- Homeworks 40% (at least 4 homeworks)
- Full attendance 10% (bonus)

Project: Choose a topic that is related to your research interest and pertains to the course material, and discuss it with your supervisor. The proposal (due 13/03) should include the following sections: the project goal, the problems to be studied, an overview of current methods, the proposed methods, the expected results, and references (about 4 pages, single spaced, font size 12; references are not counted). References should be cited in the proposal. A written final paper in the style of a journal article, as described on the following slide, is also required. The final paper is due by 12/12/2015 (soft and hard copy). Each student will give a classroom presentation (PPT) about the final paper.

2 Final Paper. WARNING: The paper should not exceed 8 pages (excluding the appendix), in the format provided at IEEE Manuscript Templates for Conference Proceedings (http://www.ieee.org/conferences_events/conferences/publishing/templates.html). The paper should address the following topics:
- Introduction: general description of the project topic / research paper
- The problem statement: why is an artificial neural network required? What are the issues?
- Objective: what are you proposing to do or show?
- Approach: describe the method you have adopted
- Implementation results: include your graphs, tables, etc. here
- Conclusions: what have you learned? How can the algorithm be improved?
- Appendix: listing of the source code (exclude these pages from the total page count)

IMPORTANT NOTE: We do not accept late materials at all. We will take your signature at every class during the course that you attend.

Academic Integrity: All programming is to be done alone! You can discuss your programs with anyone, but do not share code with anyone else in the course (looking at the code counts as sharing). Comparison of homeworks will be used to catch cheaters. The minimum penalty is a two-letter-grade drop for the course for everyone involved.

Where to go for help: Feel free to send your code to turgayibrikci@hotmail.com and ask me for help. Please write your name and course number in the subject line. Bring your code to the class.

Introduction to Artificial Neural Networks: Fundamental Concepts of Intelligence

Contents: Fundamental Concepts of Intelligence; Fundamental Concepts of ANNs; Basic Models and Learning Rules (Neuron Models, ANN structures, Learning); Distributed Representations; Conclusions.

3 Intelligent machines, or what machines can do

What is Intelligence? Philosophers have been trying for over 2000 years to understand and resolve two Big Questions of the Universe: How does a human mind work? Can non-humans have minds? These questions are still unanswered. So we need to ask what intelligence is.

Intelligence: the capacity to learn and solve problems (Webster's dictionary); in particular, the ability to solve novel problems, the ability to act rationally, and the ability to act like humans. Artificial Intelligence: build and understand intelligent entities or agents. Two main approaches: engineering versus cognitive modeling.

Intelligent machines: We can define intelligence as the ability to learn and understand, to solve problems and to make decisions. The goal of artificial intelligence (AI) as a science is to make machines do things that would require intelligence if done by humans. Therefore, the answer to the question "Can machines think?" was vitally important to the discipline. The answer is not a simple Yes or No.

As humans, we all have the ability to learn and understand, to solve problems and to make decisions; however, our abilities are not equal in different areas. Therefore, we should expect that if machines can think, some of them might be smarter than others in some ways.

Turing Imitation Game: One of the most significant papers on machine intelligence, "Computing Machinery and Intelligence", was written by the British mathematician Alan Turing over fifty-five years ago. However, it still stands up well under the test of time, and Turing's approach remains universal. Turing did not provide definitions of machines and thinking; he just avoided semantic arguments by inventing a game, the Turing Imitation Game.

The imitation game originally included two phases. In the first phase, the interrogator, a man and a woman are each placed in separate rooms. The interrogator's objective is to work out who is the man and who is the woman by questioning them.

4 Turing Imitation Game: Phase 1 (figure). Turing Imitation Game: Phase 2: In the second phase of the game, the man is replaced by a computer programmed to deceive the interrogator as the man did. It would even be programmed to make mistakes and provide fuzzy answers in the way a human would. If the computer can fool the interrogator as often as the man did, we may say this computer has passed the intelligent behaviour test.

The Result of Turing Machines: Turing believed that by the end of the 20th century it would be possible to program a digital computer to play the imitation game. Although modern computers still cannot pass the Turing test, it provides a basis for the verification and validation of knowledge-based systems. A program's intelligence, in some narrow area of expertise, is evaluated by comparing its performance with the performance of a human expert. To build an intelligent computer system, we have to capture, organize and use human expert knowledge in some narrow area of expertise.

What is Neural Networking? Artificial neural networks are composed of interconnecting artificial neurons (programming constructs that mimic the properties of biological neurons).

5 Human Intelligence vs. Artificial Intelligence

Human Intelligence:
- Intuition, common sense, judgment, creativity, beliefs, etc.
- The ability to demonstrate their intelligence by communicating effectively
- Plausible reasoning and critical thinking
- Humans are fallible
- They have limited knowledge bases
- Information processing of a serial nature proceeds very slowly in the brain as compared to computers
- Humans are unable to retain large amounts of data in memory

Artificial Intelligence:
- Ability to simulate human behavior and cognitive processes
- Capture and preserve human expertise
- Fast response; the ability to comprehend large amounts of data quickly
- No common sense
- Cannot readily deal with mixed knowledge
- May have high development costs
- Raises legal and ethical concerns

"We achieve more than we know. We know more than we understand. We understand more than we can explain." (Claude Bernard, 19th-century French scientific philosopher)

Artificial Intelligence vs. Conventional Computing:
- AI software uses the techniques of search and pattern matching. Programmers design AI software to give the computer only the problem, not the steps necessary to solve it.
- Conventional computer software follows a logical series of steps to reach a conclusion. Computer programmers originally designed such software to accomplish tasks by completing algorithms.

Artificial Intelligence & Our Society. Psychology and Artificial Intelligence: The functionalist approach of AI views the mind as a representational system, and psychology as the study of the various computational processes whereby mental representations are constructed, organized, and interpreted (Margaret Boden's essays written between 1982 and 1988).

Why do we need AI? To supplement natural intelligence: for example, we build intelligence into an object so that it can do what we want it to do (as with robots), thus reducing human labour and reducing human mistakes.

6 Introduction to Artificial Neural Networks: Fundamental Concepts of ANNs

What is an ANN? Why ANN? ANN: Artificial Neural Networks; to simulate human brain behavior; a new generation of information processing system.

Applications: Pattern Matching, Pattern Recognition, Associative Memory (Content Addressable Memory), Function Approximation, Learning, Optimization, Vector Quantization, Data Clustering. Traditional computers are inefficient at these tasks, although their computation speed is faster.

The Configuration of ANNs: An ANN consists of a large number of interconnected processing elements called neurons. A human brain consists of roughly 10^11 neurons of many different types. How does an ANN work? Through collective behavior.

7 The Biological Neuron: the connection point (the synapse) at the axon terminal can be excitatory or inhibitory, and transmission is chemical.

The Artificial Neuron (Perceptron): the neuron output is y(t) = a(f), where the integration function is f = Σ_{j=1}^{m} w_j x_j - θ and the activation is a(f) = 1 if f >= 0, and 0 otherwise.

The weights w_j encode the connection type: positive means excitatory, negative means inhibitory, and zero means no connection. This model was proposed by McCulloch and Pitts [1943]: the M-P neurons. (A short code sketch follows below.)
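To make the weighted-sum-and-threshold behaviour concrete, here is a minimal sketch of an M-P style neuron in Python with NumPy (the course itself uses MATLAB); the AND example, its weights and its threshold are invented for illustration.

```python
import numpy as np

def mp_neuron(x, w, theta):
    """McCulloch-Pitts style neuron: weighted sum minus threshold,
    passed through a step activation."""
    f = np.dot(w, x) - theta          # integration: f = sum_j w_j * x_j - theta
    return 1 if f >= 0 else 0         # activation: step function

# Example: a 2-input neuron with excitatory weights implementing AND
w = np.array([1.0, 1.0])              # positive weights -> excitatory connections
theta = 1.5                           # firing threshold
for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, mp_neuron(np.array(x, dtype=float), w, theta))
```

With positive (excitatory) weights and a threshold between 1 and 2, the unit fires only when both inputs are active.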

8 What can be done by M-P neurons? A hard limiter; a binary threshold unit; hyperspace separation:
f(x1, x2) = 1 if w1 x1 + w2 x2 - θ >= 0, and 0 otherwise.

What will ANNs be? An ANN is a neurally inspired mathematical model. It consists of a large number of highly interconnected PEs (processing elements). Its connections (weights) hold the knowledge. The response of a PE depends only on local information. Its collective behavior demonstrates the computational power, with learning, recalling and generalization capability.

Three Basic Entities of ANN Models: models of neurons or PEs; models of synaptic interconnections and structures; training or learning rules.

Introduction to Artificial Neural Networks: Basic Models and Learning Rules (Neuron Models, ANN structures, Learning).

Processing Elements: extensions of M-P neurons. What integration functions may we have? What activation functions may we have?
- M-P neuron: f_i = net_i = Σ_{j=1}^{m} w_ij x_j - θ_i
- Quadratic function: f_i = Σ_{j=1}^{m} w_ij x_j^2 - θ_i
- Spherical function: f_i = ρ^{-2} Σ_{j=1}^{m} (x_j - w_ij)^2 - θ_i
- Polynomial function: f_i = Σ_j Σ_k w_ijk x_j x_k - θ_i
(These four integration functions are illustrated in the sketch below.)
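The four integration functions translate into the following sketch (Python/NumPy). The exact spherical and polynomial forms are reconstructed from the garbled slide fragments, so treat them as one plausible reading rather than the definitive course formulas; all input values below are made up.

```python
import numpy as np

def linear_integration(x, w, theta):
    # M-P neuron: f_i = sum_j w_ij * x_j - theta_i
    return w @ x - theta

def quadratic_integration(x, w, theta):
    # f_i = sum_j w_ij * x_j^2 - theta_i
    return w @ (x ** 2) - theta

def spherical_integration(x, w, theta, rho=1.0):
    # f_i = rho^-2 * sum_j (x_j - w_ij)^2 - theta_i  (distance to a centre w)
    return np.sum((x - w) ** 2) / rho ** 2 - theta

def polynomial_integration(x, W2, theta):
    # f_i = sum_j sum_k w_ijk * x_j * x_k - theta_i  (W2 is an m x m coefficient matrix)
    return x @ W2 @ x - theta

x = np.array([0.5, -1.0, 2.0])
w = np.array([0.2, 0.4, -0.1])
W2 = np.outer(w, w)                    # toy second-order coefficients
print(linear_integration(x, w, 0.1),
      quadratic_integration(x, w, 0.1),
      spherical_integration(x, w, 0.1),
      polynomial_integration(x, W2, 0.1))
```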

9 Activation Functions:
- M-P neuron (step function): a(f) = 1 if f >= 0, and 0 otherwise
- Hard limiter (threshold function): a(f) = sgn(f) = +1 if f >= 0, and -1 if f < 0
- Ramp function: a(f) = 1 if f >= 1, f if 0 <= f < 1, and 0 if f < 0
- Unipolar sigmoid function: a(f) = 1 / (1 + e^{-λf})
- Bipolar sigmoid function: a(f) = 2 / (1 + e^{-λf}) - 1
(A short code sketch of these functions follows.)
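Here is a small sketch (Python/NumPy) of the five activation functions listed above, with the sigmoid steepness parameter written as `lam`; the sample values of f are arbitrary.

```python
import numpy as np

def step(f):                   # M-P neuron
    return np.where(f >= 0, 1.0, 0.0)

def hard_limiter(f):           # threshold / sign function
    return np.where(f >= 0, 1.0, -1.0)

def ramp(f):                   # clipped linear
    return np.clip(f, 0.0, 1.0)

def unipolar_sigmoid(f, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * f))

def bipolar_sigmoid(f, lam=1.0):
    return 2.0 / (1.0 + np.exp(-lam * f)) - 1.0

f = np.linspace(-2, 2, 5)
for fn in (step, hard_limiter, ramp, unipolar_sigmoid, bipolar_sigmoid):
    print(fn.__name__, fn(f))
```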

10 Example: Activation Surfaces. Three threshold units define three decision lines L1, L2, L3 that partition the (x1, x2) plane into regions, and each region is labeled by a code. A fourth unit L4 combines the region outputs to produce an output surface z over the plane. With the M-P step activation the surface z is piecewise constant; replacing the step with the unipolar sigmoid a(f) = 1/(1 + e^{-λf}) smooths the surface, and the plots show the effect of increasing the steepness λ. (The original slides show these surfaces as figures; only the captions are recoverable here.)
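The surfaces in this example can be reproduced qualitatively with a tiny two-layer network: three first-layer units stand for the lines L1, L2, L3, and a fourth unit L4 combines them. The line coefficients, thresholds and grid below are invented (the originals are not recoverable from the transcription), so this is only a sketch of the mechanism, not the slide's actual example.

```python
import numpy as np

def step(f):
    return np.where(f >= 0, 1.0, 0.0)

def sigmoid(f, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * f))

# First layer: three lines L1, L2, L3 in the (x1, x2) plane (coefficients invented).
W1 = np.array([[ 1.0,  0.0],     # L1: fires when x1 >= 0.5
               [ 0.0,  1.0],     # L2: fires when x2 >= 0.5
               [-1.0, -1.0]])    # L3: fires when x1 + x2 <= 1.5
b1 = np.array([0.5, 0.5, -1.5])

# Second layer: unit L4 fires only in the region where all three lines are "on".
w2 = np.array([1.0, 1.0, 1.0])
b2 = 2.5

def surface(x1, x2, act, **kw):
    h = act(np.stack([x1, x2], axis=-1) @ W1.T - b1, **kw)   # region code
    return act(h @ w2 - b2, **kw)                            # output z

xx, yy = np.meshgrid(np.linspace(0, 1, 5), np.linspace(0, 1, 5))
print(surface(xx, yy, step))             # piecewise-constant surface
print(surface(xx, yy, sigmoid, lam=5))   # smoothed surface
```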

11 Introduction to Artificial Neural Networks: ANN Structures (Connections). Basic Models and Learning Rules: Neuron Models, ANN structures, Learning.

Single-Layer Feedforward Networks: inputs x1, x2, ..., xn feed directly into output units y1, y2, ..., ym through weights w_ij.

Multilayer Feedforward Networks: an input layer (x1, x2, ..., xm), one or more hidden layers, and an output layer. Such networks are used for pattern recognition: input, then classification/analysis, then output.

Single Node with Feedback to Itself: a feedback loop connects the node's output back to its own input.

Where does the knowledge come from? From learning. (A small forward-pass sketch follows below.)
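For concreteness, a minimal forward pass through a single-hidden-layer feedforward network might look like the sketch below (Python/NumPy with arbitrary random weights and layer sizes; not the course's MATLAB code).

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 4, 3, 2                # input, hidden and output layer sizes
W_hidden = rng.normal(size=(n_hidden, n_in))   # hidden-layer weight matrix
W_out = rng.normal(size=(n_out, n_hidden))     # output-layer weight matrix

def sigmoid(f):
    return 1.0 / (1.0 + np.exp(-f))

def forward(x):
    """Feedforward only: signals flow input -> hidden -> output, no feedback."""
    h = sigmoid(W_hidden @ x)                  # hidden-layer activations
    y = sigmoid(W_out @ h)                     # output-layer activations
    return y

x = np.array([1.0, 0.5, -0.5, 2.0])
print(forward(x))
```

The recurrent structures on the next page differ only in that some outputs are fed back as inputs at the next time step.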

12 Single-Layer Recurrent Networks; Multilayer Recurrent Networks (figures).

Introduction to Artificial Neural Networks: Basic Models and Learning Rules (Neuron Models, ANN structures, Learning).

Learning: Consider an ANN with n neurons, each with m adaptive weights. The weight matrix is

W = [w_1^T; w_2^T; ...; w_n^T] = [[w_11, w_12, ..., w_1m], [w_21, w_22, ..., w_2m], ..., [w_n1, w_n2, ..., w_nm]],

where w_i = (w_i1, w_i2, ..., w_im)^T is the weight vector of neuron i. How do we learn the weight matrix?

Learning Rules: supervised learning, unsupervised learning, and reinforcement learning.
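As a concrete picture of the n x m weight matrix, the sketch below (Python/NumPy, arbitrary sizes and values) stores one row per neuron and computes all n outputs in a single matrix-vector product; the step activation is just one possible choice.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 3                         # n neurons, each with m adaptive weights
W = rng.normal(size=(n, m))         # row i of W is the weight vector of neuron i

def step(f):
    return np.where(f >= 0, 1.0, 0.0)

x = np.array([1.0, -0.5, 2.0])      # one input pattern with m components
y = step(W @ x)                     # outputs of all n neurons at once
print(W)
print(y)
```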

13 Supervised Learning: learning with a teacher; learning by examples. The training set is T = {(x^(1), d^(1)), (x^(2), d^(2)), ..., (x^(k), d^(k))}, where x^(i) is an input pattern and d^(i) is its desired output. The ANN (with weight matrix W) produces an output y for each x; an error signal generator compares y with d and feeds the error signal back to adjust W.

Unsupervised Learning: self-organizing; clustering; form proper clusters by discovering the similarities and dissimilarities among the objects. There is no teacher and no error signal; the ANN adjusts W from the input patterns alone.

Reinforcement Learning: learning with a critic; learning by comments. A critic signal generator evaluates the output and produces a reinforcement signal (rather than a detailed error signal), which is used to adjust W.

14 The General Weight Learning Rule. We want to learn the weights and the bias. For neuron i with inputs x_1, ..., x_{m-1}, weights w_i1, ..., w_i,m-1 and bias θ_i:

input: net_i = Σ_{j=1}^{m-1} w_ij x_j - θ_i; output: y_i = a(net_i).

Let x_m = -1 and w_im = θ_i; then net_i = Σ_{j=1}^{m} w_ij x_j = w_i^T x, and we want to learn the full weight vector w_i = (w_i1, w_i2, ..., w_im)^T. How should w_i(t) be updated? A learning signal generator, driven by w_i, the input x, and (when available) the desired output d_i, produces a learning signal r.

15 The General Weight Learning Rule (continued). The learning signal generator produces r = f_r(w_i, x, d_i), and the weight change is Δw_i(t) = η r x(t), where η is the learning rate.

Discrete-time weight modification rule: w_i(t+1) = w_i(t) + η r(w_i(t), x(t), d_i(t)) x(t).
Continuous-time weight modification rule: dw_i(t)/dt = η r x(t).

Hebb's Learning Law: Hebb [1949] hypothesized that when an axonal input from neuron A to neuron B causes neuron B to immediately emit a pulse (fire), and this situation happens repeatedly or persistently, then the efficacy of that axonal input, in terms of its ability to help neuron B fire in the future, is somehow increased. In the general rule this corresponds to the learning signal r = f_r(w_i, x) = a(w_i^T x) = y_i, so that Δw_ij = η y_i x_j. Hebb's learning rule is an unsupervised learning rule. (See the code sketch below.)
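The discrete-time rule w_i(t+1) = w_i(t) + η r x(t) can be written as one generic update that is specialized by the choice of learning signal r = f_r(w_i, x, d_i). The sketch below (Python/NumPy, toy data) shows two such choices: a perceptron-style supervised signal r = d_i - y_i, which is an illustrative assumption rather than something stated on these slides, and the Hebbian signal r = a(w_i^T x) = y_i from Hebb's rule above.

```python
import numpy as np

def step(f):
    return 1.0 if f >= 0 else 0.0

def general_update(w, x, d, r_fn, eta=0.1):
    """Discrete-time weight modification: w(t+1) = w(t) + eta * r(w, x, d) * x."""
    return w + eta * r_fn(w, x, d) * x

# Supervised choice of learning signal (assumed, perceptron-style): r = d - y.
def r_perceptron(w, x, d):
    return d - step(w @ x)

# Unsupervised Hebbian learning signal from the slide: r = a(w^T x) = y.
def r_hebb(w, x, d=None):
    return step(w @ x)

# Toy AND data; x_m = -1 is appended so the last weight plays the role of theta.
data = [(np.array([0., 0., -1.]), 0.), (np.array([0., 1., -1.]), 0.),
        (np.array([1., 0., -1.]), 0.), (np.array([1., 1., -1.]), 1.)]

w = np.zeros(3)
for epoch in range(20):                       # supervised training with r = d - y
    for x, d in data:
        w = general_update(w, x, d, r_perceptron)
print("perceptron-style weights:", w, [step(w @ x) for x, _ in data])

w = np.array([0.1, 0.1, 0.1])
for x, _ in data:                             # one Hebbian (unsupervised) pass
    w = general_update(w, x, None, r_hebb)
print("Hebbian weights:", w)
```

The supervised run converges to weights that classify the AND data correctly; the Hebbian pass simply strengthens the weights of inputs that are active whenever the unit fires.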

16 Introduction to Artificial Neural Networks: Distributed Representations.

Distributed Representation: an entity is represented by a pattern of activity distributed over many PEs, and each processing element is involved in representing many different entities. Local Representation: each entity is represented by one PE. A distributed representation acts as a content-addressable memory.

Example: the entities Dog, Cat and Bread are each represented by a pattern of activity over the PEs P0 ... P15. Given a partial or noisy pattern: what is this?

Advantages:
- Acts as a content-addressable memory
- Makes induction easy (Dog has 4 legs; how many for Fido?)
- Makes the creation of new entities or concepts easy, without allocation of new hardware (add Doughnut just by changing weights)
(A small content-addressable-memory sketch follows below.)
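The Dog/Cat/Bread example can be imitated with a few stored activity patterns and a nearest-pattern lookup, which is the content-addressable behaviour described above. The 16-element patterns below are invented for illustration; the actual patterns on the slides are not recoverable from the transcription.

```python
import numpy as np

# Each entity is a pattern of activity over 16 PEs (P0 ... P15); the values are invented.
patterns = {
    "Dog":   np.array([1, 1, 1, 1, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 1, 1], dtype=float),
    "Cat":   np.array([1, 1, 0, 0, 1, 1, 0, 0, 0, 0, 1, 1, 1, 1, 0, 0], dtype=float),
    "Bread": np.array([0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0], dtype=float),
}

def recall(probe):
    """Content-addressable memory: return the stored entity whose activity
    pattern overlaps the (possibly partial or noisy) probe the most."""
    return max(patterns, key=lambda name: float(patterns[name] @ probe))

# Probe with a corrupted copy of the Dog pattern: flip the first three PEs.
probe = patterns["Dog"].copy()
probe[:3] = 1.0 - probe[:3]
print("What is this?", recall(probe))   # still recalls "Dog"
```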

17 Advantages (continued):
- Acts as a content-addressable memory
- Makes induction easy
- Makes the creation of new entities or concepts easy (without allocation of new hardware)
- Fault tolerance: some PEs breaking down does not cause a problem

Disadvantages: How to understand it? How to modify it? Learning procedures are required.

REFERENCES: All materials were collected from different web pages. Thanks to the authors of these slides.
