CoSMo 2012 Gunnar Blohm


1 Sensory-motor computations CoSMo 2012 Gunnar Blohm

2 Outline
- Introduction (Day 1) - Sabes
- Sensory-motor transformations - Blohm
  - Population coding and parallel computing
  - Modelling sensory-motor mappings with artificial neural networks
  - Tutorial: gain modulations for reference frames - Blohm
- Multi-sensory integration (Day 2) - Blohm
- Sensory-motor control - Sabes
  - Efference copy, forward models and Kalman filters
  - Prisms and inter-sensory adaptation
  - Tutorial: Kalman filter and LQR - Sabes

3 Motor planning: hand-goal distance computation. Blohm et al. 2009

4 Sensory-motor transformations

5 Sensory-motor transformations
Justin: DLR robot ball catching. Sensory reference frames ≠ motor reference frames; sensory code ≠ motor code.

6 Sensory-motor transformations: Reference frames
Determined by the sensory and motor apparatus:
- Vision: attached to the retina
- Audition: attached to the head
- Proprioception: relative joint angles
- Arm movement: relative to attachment at the shoulder

7 Sensory-motor transformations: Reference frames
Knowledge about reference frames is required to localize sensory and motor events: the same retinal image can correspond to different spatial locations.

8 Sensory-motor transformations: Reference frames
The relative orientation of different sensory and motor reference frames is not fixed:
- Changes with every movement
- 6D changes (e.g. eyes relative to shoulder)
- Non-commutative
Tweed & Vilis; Blohm et al. 2009

9 Sensory-motor transformations: Reference frames
A reference frame transformation is needed to map sensory to motor coordinates. Requires estimates of body geometry. Blohm et al.

10 Sensory-motor transformations: Reference frames
A reference frame transformation is needed to map sensory to motor coordinates. Requires estimates of body geometry. Noise in these estimates (e.g. noisy joint angles) makes the reference frame transformation stochastic.

11 Signal-dependent sensory noise
- Reach variability depends on gaze fixation
- Signal-dependent noise in muscle spindles explains arm position variability
- Neuronal noise is signal-dependent (Poisson)
Blohm & Crawford (2007); Maimon & Assad (2009); Scott & Loeb (1994)

12 Examples: reference frame transformations - Saccades
Shortest path; Listing's law. Klier & Crawford (1998)

13 Examples: reference frame transformations - Reaching / pointing
Blohm & Crawford (2007); Blohm, Keith & Crawford (2009); Leclercq et al. (in preparation)

14 Examples: reference frame transformations - Moving body / objects: smooth pursuit
Blohm & Lefèvre (2010); Leclercq et al. (in preparation)

15 Examples: reference frame transformations - Moving body / objects: manual tracking
Leclercq, Blohm & Lefèvre (in preparation); Leclercq et al. (2012)

16 Examples: reference frame transformations - Visual-motor transformation deficits in optic ataxia
[Figure: absolute reaching error (deg)] Khan et al. (2005a, b, 2009)

17 Examples: reference frame transformations - Reference frame transformation deficits in optic ataxia
[Figure: compensation error index] Khan, Pisella, Blohm (in revision)

18 Population coding and parallel computing

19 Direct vs. population coding
Vision: population code. Movement: digital / current command to actuators.

20 Cosine tuning
Tuning curves to wind direction for low-velocity interneurons of the cricket cercal system.
Cosine tuning: firing rate $f(s) = r_{max} \cos(s - s_a)$, with preferred direction $s_a$.
4 neurons can encode all wind directions: $\vec{v}_{est} = \sum_{a=1}^{4} \frac{r_a}{r_{max}}\, \vec{c}_a$
Dayan & Abbott, 2001
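A minimal MATLAB sketch of this decoding scheme (illustrative only; the four preferred directions follow the usual cercal-system convention of 90° spacing, the test wind direction and peak rate are made up, and a half-wave rectification is added so firing rates stay non-negative):

    % Population-vector decoding with 4 cosine-tuned neurons
    s_a   = [45 135 225 315] * pi/180;   % preferred directions s_a (rad)
    r_max = 40;                          % maximum firing rate (arbitrary)
    s     = 70 * pi/180;                 % true wind direction to encode

    % Tuning: f(s) = r_max * [cos(s - s_a)]_+  (rectified cosine)
    r = r_max * max(cos(s - s_a), 0);

    % Population vector: v_est = sum_a (r_a / r_max) * c_a
    c_x = cos(s_a);  c_y = sin(s_a);     % unit vectors along preferred directions
    v_est = [sum(r/r_max .* c_x), sum(r/r_max .* c_y)];
    s_est = atan2(v_est(2), v_est(1));

    fprintf('true: %.1f deg, decoded: %.1f deg\n', s*180/pi, s_est*180/pi);

With preferred directions spaced 90° apart, the decoded direction matches the true direction exactly in the noise-free case.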

21 Cosine tuning
Motor-related cosine tuning in PMv and M1. Kakei, Hoffman, Strick

22 Population codes
- Cricket wind direction: 4 neurons = population coding!
- Principle: each neuron codes for a different set of stimulus values; together, all neurons encode all possible stimuli as a population
- Redundancy is always present! (counter-example: cricket)
- Example: Gaussian receptive fields
Dayan & Abbott, 2001

23 Population codes
Encoding a stimulus using population codes. Dayan & Abbott, 2001
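As a toy illustration of such encoding (not the slide's figure; all numbers are invented), the sketch below encodes a scalar stimulus with an array of Gaussian tuning curves and Poisson spike counts:

    % Encoding a scalar stimulus with a Gaussian population code
    pref  = -40:5:40;      % preferred values of the units
    sigma = 10;            % tuning width
    r_max = 50;            % peak firing rate (spikes/s)
    T     = 0.5;           % counting window (s)
    s     = 12;            % stimulus to encode

    f = r_max * exp(-(s - pref).^2 / (2*sigma^2));   % mean rates (tuning curves)
    n = poissrnd(f * T);   % Poisson spike counts (needs the Statistics Toolbox)

    figure; stem(pref, n);
    xlabel('preferred stimulus value'); ylabel('spike count');
    title('Population response to s = 12');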

24 Narrow versus wide tuning
Question: what is better, narrow or wide tuning?
- For fixed noise, within one layer: narrow is better!
- In a neural network, output tuning curves should be wider than input tuning curves:
  - Information in the output layer cannot be greater than in the input layer
  - Sharpening tuning curves in the output can only decrease (or at best preserve) information content
  - Result: the wide tuning of the input layer contains more information than the narrow tuning in the output layer
- Consequence for the brain: narrow-to-wide in the processing hierarchy
Pouget & Sejnowski

25 Population codes
Example: cue combination with population codes.
- Probabilistic population codes
- Poisson-like neural noise
- Variance inversely related to the gains of the population code
Ma et al., Nat Neurosci

26 Modelling sensory-motor mappings with artificial neural networks: ANN architecture and connectivity

27 Goals
Feasibility of a neural network implementation: mostly trivial...
More interesting questions:
- What is the optimal network structure given a fixed number of neurons / units?
- What properties emerge from training?
- Can these emerging properties explain aspects of real brain function / dysfunction?
- Can we understand the difference between electrophysiological techniques (e.g. recording vs. stimulation)?
- ...

28 From spikes to firing rates
Approximations:
- Size: one unit in a rate-based network represents average local population behaviour; one unit's behaviour mimics population computations
- Time: the average firing rate does not capture spike dynamics, variability in spikes, etc.; the complexity of spike-time interactions within a network is lost
Trappenberg

29 Feed-forward networks
Input: e.g. a sensory feature vector (sampling). Trappenberg

30 Feed-forward networks: Perceptron
Example: $y = w_1 x_1 + w_2 x_2$
General single-layer mapping (= simple perceptron): $r_i^{out} = g\left(\sum_j w_{ij}\, r_j^{in}\right)$, i.e. $\mathbf{r}^{out} = g(\mathbf{w}\,\mathbf{r}^{in})$

31 Feed-forward networks: Perceptron
Example: $g(x) = x$, $y = w_1 x_1 + w_2 x_2$ with $w_1 = 1$, $w_2 = -1$.
(Table of $x_1$, $x_2$, $y$ values shown on the slide.)
Good generalization of the network! Trappenberg

32 Feed-forward networks: Perceptron
Boolean functions; $g$ is a threshold node: $g(x) = 1$ if $x$ exceeds the threshold, $0$ elsewhere (with a bias input).
Linearly separable functions vs. not linearly separable. Trappenberg
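As a concrete illustration (not on the slides), a single threshold unit with hand-picked weights implements the linearly separable function AND, while XOR cannot be realized by any single threshold unit:

    % Single-layer perceptron with a threshold transfer function
    g = @(x) double(x >= 0);          % threshold node: 1 if x >= 0, else 0

    X = [0 0; 0 1; 1 0; 1 1];         % all Boolean input pairs (x1, x2)
    w = [1; 1];                       % hand-picked weights
    b = -1.5;                         % bias: unit fires only when x1 + x2 >= 1.5

    y_and = g(X * w + b);             % yields [0 0 0 1]', i.e. AND
    disp([X y_and]);

    % XOR ([0 1 1 0]') is not linearly separable: no single line in the
    % (x1, x2) plane separates {01, 10} from {00, 11}, so no (w, b) works.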

33 Feed-forward networks: Multi-layer perceptron
Universal function approximator: given enough hidden nodes, any function can be approximated with arbitrary precision. Example: sine wave approximation with a logistic transfer function. Trappenberg

34 Feed-forward networks: Multi-layer perceptron
Generalization = performance outside the training set. Good interpolation abilities for sigmoid networks. Trappenberg

35 Feed-forward networks: Multi-layer perceptron
Limitations:
- Brain-like performance does NOT mean the brain performs the mapping the same way
- Training rules are non-physiological (see next section)
Strengths:
- Hidden-layer activity might resemble brain function given appropriate choices of input and output codings
- The brain = a mapping network
- Self-organization, analogous to the brain
- High flexibility in possible computations

36 Neural transfer functions
$r_i^{out} = f\left(\sum_j w_{ij}\, r_j^{in}\right)$, where $f$ is the neural transfer (activation) function. Trappenberg 2010

37 Neural transfer functions: Naka-Rushton function (1966)
Visual neurons (LGN, V1, MT); response to stimuli at different contrasts. [Figure with M = 100, σ = 50] Wilson
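A short MATLAB sketch of the standard Naka-Rushton form $R(c) = M\,c^N / (c^N + \sigma^N)$, using the M = 100 and σ = 50 from the slide; the exponent N is an assumption here (N = 2 is a common choice for contrast-response fits):

    % Naka-Rushton contrast-response function: R(c) = M * c^N / (c^N + sigma^N)
    M     = 100;     % maximum response (from the slide)
    sigma = 50;      % semi-saturation contrast (from the slide)
    N     = 2;       % exponent (assumed; controls steepness)

    c = 0:1:100;                              % stimulus contrast (%)
    R = M * c.^N ./ (c.^N + sigma^N);

    figure; plot(c, R, 'LineWidth', 2);
    xlabel('contrast (%)'); ylabel('response (spikes/s)');
    title('Naka-Rushton transfer function (M = 100, \sigma = 50)');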

38 Neural transfer functions
Idealized transfer functions in nodes. The MathWorks Inc.

39 Modelling sensory-motor mappings with artificial neural networks: Training algorithms

40 Training algorithms: Gradient descent
Cost function (mean squared error): $E = \frac{1}{2}\sum_i \left(r_i^{out} - y_i\right)^2$, where $y_i$ is the desired output (data, training set).
Goal: minimize the cost function by changing the network weights: $w_{ij} \rightarrow w_{ij} + \Delta w_{ij}$, with $\Delta w_{ij} = -\epsilon\,\frac{\partial E}{\partial w_{ij}}$

41 Training algorithms: Gradient descent
With the chain rule this gives the delta rule:
$\Delta w_{ij} = -\epsilon\,\frac{\partial E}{\partial w_{ij}} = \epsilon\,\big(y_i - r_i^{out}\big)\, g'\!\left(\sum_j w_{ij} r_j^{in}\right) r_j^{in} = \epsilon\,\delta_i\, r_j^{in}$

42 Training algorithms: Gradient descent
Linear perceptron: $g(x) = x$, so $g'(x) = 1$.
Perceptron learning rule: $\Delta w_{ij} = \epsilon\,\big(y_i - r_i^{out}\big)\, r_j^{in}$
Works also for most other transfer functions $g$. Similarity to Hebbian learning (supervised Hebbian): weight changes are proportional to the network error AND the input strength.
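A minimal MATLAB sketch of this rule on a single linear unit (the training data are invented for illustration); it learns the slide-31 example mapping y = x1 - x2 from random examples:

    % Delta rule for a single linear output unit, g(x) = x (so g'(x) = 1)
    w       = 0.1 * randn(2,1);         % initial weights
    epsilon = 0.05;                     % learning rate

    for t = 1:2000
        x = rand(2,1);                  % random input pattern
        y = x(1) - x(2);                % desired output (teacher): y = x1 - x2
        r = w' * x;                     % network output (linear unit)
        w = w + epsilon * (y - r) * x;  % delta rule: dw_j = epsilon*(y - r)*x_j
    end

    disp(w')                            % approaches [1 -1]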

43 Training algorithms: Generalization to multi-layer perceptrons (back-propagation)
3-layer perceptron: $r_i^{out} = g\Big(\sum_j w_{ij}^{out}\, g\big(\sum_k w_{jk}^{h}\, r_k^{in}\big)\Big)$
Weights to adjust: output weights $w_{ij}^{out}$ and hidden-layer weights $w_{jk}^{h}$.

44 Training algorithms: Back-propagation
Generalized delta rule, output weights (with $E = \frac{1}{2}\sum_i \big(r_i^{out} - y_i\big)^2$):
$\Delta w_{ij}^{out} = -\epsilon\,\frac{\partial E}{\partial w_{ij}^{out}} = \epsilon\, g'\Big(\sum_j w_{ij}^{out} r_j^{h}\Big)\big(y_i - r_i^{out}\big)\, r_j^{h}$

45 Training algorithms: Back-propagation
Generalized delta rule, hidden-layer weights:
$\Delta w_{jk}^{h} = -\epsilon\,\frac{\partial E}{\partial w_{jk}^{h}} = \epsilon\, g'\Big(\sum_k w_{jk}^{h} r_k^{in}\Big)\Big(\sum_i w_{ij}^{out}\,\delta_i^{out}\Big)\, r_k^{in}$, with $\delta_i^{out} = g'\Big(\sum_j w_{ij}^{out} r_j^{h}\Big)\big(y_i - r_i^{out}\big)$
Back-propagation of the error term!

46 Network design and analysis

47 Network design
Stick to known physiology as much as possible:
- Input / output coding
- Connectivity
- Transfer functions
- Learning rule?
E.g. 3-D reach planning network. Blohm, Keith, Crawford, 2009; Blohm

48 Network analysis
Receptive fields = activation pattern of a neuron for targets across space. Blohm, Khan, Crawford, 2009 (adapted from Andersen et al., 1985)

49 Gain modulation
= change of receptive field strength with a secondary input. E.g. eye-position gain modulation of visual receptive fields in posterior parietal cortex. Blohm, Khan, Crawford, 2009 (adapted from Andersen et al., 1985)
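A toy MATLAB illustration of this idea (the receptive field parameters and the gain-field slope are invented): a retinotopic Gaussian receptive field whose response strength is scaled multiplicatively by eye position.

    % Gain modulation: a visual receptive field scaled by eye position
    x         = -40:1:40;                      % retinal target position (deg)
    rf_centre = 0;  rf_sigma = 10;             % Gaussian RF parameters
    rf        = exp(-(x - rf_centre).^2 / (2*rf_sigma^2));

    eye_pos = [-20 0 20];                      % three eye positions (deg)
    gain    = 1 + 0.02 * eye_pos;              % linear gain field (assumed slope)

    figure; hold on;
    for k = 1:numel(eye_pos)
        plot(x, gain(k) * rf);                 % same RF shape, different strength
    end
    xlabel('retinal position (deg)'); ylabel('response (a.u.)');
    legend('eye -20 deg', 'eye 0 deg', 'eye +20 deg');
    title('Eye-position gain modulation of a visual RF');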

50 Gain modulation: Reference frame transformations
Zipser & Andersen, Nature 1988: eye-position gain modulation of hidden-layer units.

51 Gain modulation
A powerful computational means for:
- Cue combination
- Reference frame transformations
- Multi-sensory integration
Blohm & Crawford, 2009

52 Reference frame transformations
Reference frames based on electrophysiological analysis of a 3-D visuo-motor transformation network. Blohm, Keith, Crawford

53 Conclusion

54 Conclusion
Feed-forward rate-based networks are the simplest form of ANNs:
- Computationally efficient
- Powerful
- But non-trivial mapping to biology?
Learning algorithms:
- Fast and robust algorithms can be found
- Mostly remote from biology
- More complicated algorithms are more realistic
FF-ANNs are very useful tools for investigating:
- Computations in the brain (reference frames, multi-sensory, ...)
- Hierarchical processing
- Receptive fields

55 This afternoon
- Implement back-propagation learning
- Analyze gain fields and receptive fields

56 Matlab tutorial: ANNs and gain fields

57 Exercise 1: Back-propagation
Goal: program a simple feed-forward neural network and train it with back-propagation.
Task: retinal-to-spatial transformation in 1-D (spatial = retinal + eye orientation)
- 3 layers: input, hidden layer, output
- Input: 1-D retinal map
- Transfer functions: sigmoid
- Output: 1-D spatial map
- Training method: error back-propagation
- Generate a random training set

58 Exercise 1: Back-propagation
Update equations (output and hidden-layer weights, as derived on slides 44-45):
$\Delta w_{ij}^{out} = -\epsilon\,\frac{\partial E}{\partial w_{ij}^{out}} = \epsilon\, g'\Big(\sum_j w_{ij}^{out} r_j^{h}\Big)\big(y_i - r_i^{out}\big)\, r_j^{h}$
$\Delta w_{jk}^{h} = -\epsilon\,\frac{\partial E}{\partial w_{jk}^{h}} = \epsilon\, g'\Big(\sum_k w_{jk}^{h} r_k^{in}\Big)\Big(\sum_i w_{ij}^{out}\,\delta_i^{out}\Big)\, r_k^{in}$
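A minimal, self-contained MATLAB sketch of the kind of network this exercise asks for; it is an illustrative implementation under simplifying assumptions (scalar retinal and eye-position inputs and a linear output unit rather than the 1-D population maps specified above), not the tutorial code:

    % Simplified back-propagation demo: learn spatial = retinal + eye
    n_hid   = 20;                  % hidden units
    epsilon = 0.05;                % learning rate
    g  = @(x) 1 ./ (1 + exp(-x));  % sigmoid transfer function
    gp = @(a) a .* (1 - a);        % its derivative, expressed via the activation

    W_h   = 0.5 * randn(n_hid, 2);   b_h   = zeros(n_hid, 1);   % hidden weights
    W_out = 0.5 * randn(1, n_hid);   b_out = 0;                 % output weights

    for t = 1:20000
        ret     = 2*rand - 1;            % retinal position, scaled to [-1, 1]
        eye_pos = 2*rand - 1;            % eye orientation, scaled to [-1, 1]
        x = [ret; eye_pos];
        y = ret + eye_pos;               % desired spatial position

        % forward pass
        r_h   = g(W_h * x + b_h);        % hidden activations
        r_out = W_out * r_h + b_out;     % linear output unit

        % backward pass (generalized delta rule, as above)
        delta_out = y - r_out;                        % g'(.) = 1 for the linear output
        delta_h   = gp(r_h) .* (W_out' * delta_out);  % back-propagated error term

        W_out = W_out + epsilon * delta_out * r_h';
        b_out = b_out + epsilon * delta_out;
        W_h   = W_h   + epsilon * delta_h * x';
        b_h   = b_h   + epsilon * delta_h;
    end

    x_test = [0.3; -0.5];                % quick check on one example
    fprintf('target %.2f, network %.2f\n', sum(x_test), ...
            W_out * g(W_h*x_test + b_h) + b_out);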

59 Exercise 2: RF & gain field analysis
- Use the Matlab neural network toolbox (code provided)
- Train a network (just run the code); you can use different versions of back-propagation (default: resilient back-prop)
- Plot RFs for individual hidden-layer and output-layer units
- How do these RFs change with eye orientation? Gain fields versus RF shifts...
