
CN2 1: Introduction Paul Gribble http://gribblelab.org Sep 10, 2012

Administrivia
- Class meets Mondays, 2:00pm - 3:30pm and Thursdays, 11:30am - 1:00pm, in NSC 245A
- Contact me with any questions or to set up a meeting: paul@gribblelab.org
- Office: NSC 228; Lab: NSC 245G
- Materials will be posted on the course website: http://www.gribblelab.org/compneuro/

Grading

Component           Grade
Assignments         50%
Written Report      20%
Oral Presentation   20%
Participation       10%
Total               100%

Topics
- Modelling Dynamical Systems
- Modelling Action Potentials
- Musculoskeletal Models
- Perceptrons
- Multi-Layer Networks
- Recurrent Networks
- Stochastic Networks
- Unsupervised Learning
- Computational Motor Control

Useful Background Knowledge
- Calculus
- Linear Algebra
- Basic Probability Theory
- Differential Equations
- Computer Programming
No fear! Let me know if you need a refresher on any of these topics; I can provide resources to help you catch up.

Readings
- Selected readings from a number of books
- Selected original research articles

Software
We will be using Python (http://www.python.org/) and the add-on library SciPy (http://www.scipy.org/):
- free and open-source
- available for all operating systems
- widely used

What is Computational Neuroscience?
CN borrows methods and ways of thinking from:
- mathematics
- physics
- computer science
- statistics
- machine learning
- psychology
- physiology
- neuroscience

Levels of Abstraction
[Figure: spatial scales of the nervous system, from largest to smallest]
- 1 m: CNS
- 10 cm: Systems
- 1 cm: Maps
- 1 mm: Networks
- 100 µm: Neurons
- 1 µm: Synapses
- 1 Å: Molecules

Integration
[Figure: the computational neuroscience cycle. Experimental facts from psychology, neurophysiology, and neurobiology feed into computational neuroscience, which builds quantitative (mathematical) models using nonlinear dynamics and information theory; the models generate experimental predictions and applications, and refinement feedback flows back to the experimental disciplines.]

What is a Model?
- an abstraction (e.g. an architectural model)
- a concrete implementation: expressed as equations, simulated on a computer
- generates quantitative predictions
- tests (not confirms) a hypothesis

Types of Models: Descriptive Models
- phenomenological: a functional description, a characterization of data, curve-fitting
- Example: directionally tuned neurons in primary motor cortex (M1)
[Figure: Cosine Tuning of M1 Neurons]

d(M) = b + k cos(θ_CM)

where d(M) is the cell's discharge rate for movement direction M, b is the baseline rate, k scales the depth of modulation, and θ_CM is the angle between the movement direction M and the cell's preferred direction C.
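
To make the descriptive model concrete, here is a minimal Python/NumPy sketch of the cosine tuning curve; the baseline and modulation values (b, k) are made-up illustrative numbers, not fitted parameters:

    import numpy as np

    def cosine_tuning(theta_cm, b=20.0, k=15.0):
        # d(M) = b + k cos(theta_CM): predicted discharge rate (spikes/s)
        # as a function of the angle theta_cm (radians) between movement
        # direction and the cell's preferred direction
        return b + k * np.cos(theta_cm)

    # evaluate the tuning curve at 8 movement directions around the circle
    angles = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
    for a, r in zip(angles, cosine_tuning(angles)):
        print(f"direction {np.degrees(a):5.1f} deg -> {r:5.1f} spikes/s")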

Types of Models: Explanatory Models
- explain complex functions in terms of interactions of subordinate (simplified) components
- Example: the Hodgkin-Huxley model, which represents the neuronal membrane as an electrical circuit: a capacitance C in parallel with leak, potassium, and sodium branches (resistances R_L, R_K, R_Na and reversal potentials E_L, E_K, E_Na), driven by an external current I_ext
[Figure: A. membrane potential V (mV) vs. time (ms), showing spiking; B. firing frequency (Hz) vs. injected current]
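
Since the course will build models like this in Python/SciPy, here is a minimal forward-Euler sketch of the Hodgkin-Huxley equations; the parameters are the standard squid-axon values, while the injected current, time step, and initial conditions are illustrative choices:

    import numpy as np

    # standard Hodgkin-Huxley parameters (squid giant axon, modern sign convention)
    C_m = 1.0                              # membrane capacitance (uF/cm^2)
    g_Na, g_K, g_L = 120.0, 36.0, 0.3      # maximal conductances (mS/cm^2)
    E_Na, E_K, E_L = 50.0, -77.0, -54.4    # reversal potentials (mV)

    # channel gating rate functions (V in mV, rates in 1/ms)
    def alpha_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
    def beta_m(V):  return 4.0 * np.exp(-(V + 65.0) / 18.0)
    def alpha_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
    def beta_h(V):  return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
    def alpha_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
    def beta_n(V):  return 0.125 * np.exp(-(V + 65.0) / 80.0)

    # forward-Euler integration of the membrane equation
    dt, T = 0.01, 50.0                     # time step and duration (ms)
    I_ext = 10.0                           # injected current (uA/cm^2), illustrative
    V, m, h, n = -65.0, 0.05, 0.6, 0.32    # initial conditions near rest
    V_trace = []
    for _ in np.arange(0.0, T, dt):
        I_Na = g_Na * m**3 * h * (V - E_Na)   # fast sodium current
        I_K = g_K * n**4 * (V - E_K)          # delayed-rectifier potassium current
        I_L = g_L * (V - E_L)                 # leak current
        V += dt * (I_ext - I_Na - I_K - I_L) / C_m
        m += dt * (alpha_m(V) * (1.0 - m) - beta_m(V) * m)
        h += dt * (alpha_h(V) * (1.0 - h) - beta_h(V) * h)
        n += dt * (alpha_n(V) * (1.0 - n) - beta_n(V) * n)
        V_trace.append(V)

    print(f"peak membrane potential: {max(V_trace):.1f} mV")  # spikes overshoot ~+40 mV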

Serial vs Parallel Processing
Computers:
- small # of very powerful CPUs
- CPUs execute complex binary instructions
- Gflops (billions of operations per sec)
Nervous Systems:
- many simple processing units: millions (billions?) of simple integrate-and-fire units
- out of rich connectivity emerges complex computation

Parallel Distributed Processing
- artificial neural networks; connectionist models
- interactions between units enable processing abilities not present in single units
- emergent properties: simple rules lead to complex behaviour
- knowledge emerges that is not explicitly specified in the system
- networks learn from experience

Neural Networks: Properties
- Generalization
- Nonlinearity
- Parallelism
- Gradedness
- Contextual Processing
- Adaptivity
- Graceful Degradation

Neural Networks: Applications
- Aerospace: aircraft autopilot systems
- Banking: OCR (e.g. cheques); credit evaluation; credit card fraud
- Financial: real estate appraisal; currency/equity price prediction
- Military: sonar object classification; image identification; who knows what else
- Industrial: process control; equipment failure prediction
- Medical: diagnosis; screening

Computing with Neurons
- McCulloch & Pitts (1943): simple binary units
- proved: with a sufficient # of simple all-or-none units, and synaptic connections set to appropriate weights, a network can compute any computable function
[Figure: a single unit computes r^out = g(Σ_i w_i r_i^in), a weighted sum of inputs passed through a threshold function g; units are assembled into networks mapping inputs (r_1^in ... r_n^in) to outputs (r_1^out ... r_n^out)]
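
As an illustration of how such binary threshold units compute logical functions, here is a minimal sketch; the weights and thresholds are chosen by hand for the example, not taken from the original paper:

    import numpy as np

    def mp_unit(inputs, weights, threshold):
        # McCulloch-Pitts unit: fires (outputs 1) iff the weighted sum
        # of its binary inputs reaches the threshold
        return int(np.dot(inputs, weights) >= threshold)

    # hand-chosen weights and thresholds implementing logical AND and OR
    for x1 in (0, 1):
        for x2 in (0, 1):
            y_and = mp_unit([x1, x2], weights=[1, 1], threshold=2)
            y_or = mp_unit([x1, x2], weights=[1, 1], threshold=1)
            print(f"x1={x1} x2={x2}  AND={y_and}  OR={y_or}")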

Perceptron
- Rosenblatt (1960): single-layer networks, given a simple local learning rule, are guaranteed to converge on appropriate weights
- perceptrons compute mappings from one space to another: many-to-one mappings
[Figure: biological neuron anatomy: dendrites, soma, axon hillock, axon, myelin sheath, nodes of Ranvier, excitatory and inhibitory axon terminals, synaptic cleft, postsynaptic neurons]
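
A minimal sketch of the perceptron learning rule on a linearly separable toy problem; the task (logical OR), learning rate, and epoch count are illustrative choices:

    import numpy as np

    def train_perceptron(X, y, lr=0.1, epochs=20):
        # perceptron rule: after each misclassification, move the weights
        # toward (target 1) or away from (target 0) the input; guaranteed
        # to converge if the classes are linearly separable
        Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias input
        w = np.zeros(Xb.shape[1])
        for _ in range(epochs):
            for xi, target in zip(Xb, y):
                pred = int(np.dot(w, xi) > 0.0)
                w += lr * (target - pred) * xi
        return w

    # toy linearly separable mapping: logical OR
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([0, 1, 1, 1])
    w = train_perceptron(X, y)
    Xb = np.hstack([X, np.ones((4, 1))])
    print("weights:", w, " predictions:", (Xb @ w > 0.0).astype(int))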

Multi-Layer Networks
- single-layer perceptrons are limited to problems that are linearly separable
- for more complex (e.g. nonlinear) problems, multi-layer networks are required
- multi-layer networks, given enough units and appropriate weights, can compute any mapping
- a different (but still relatively simple) learning rule, called backpropagation, allows the synaptic weights to adapt
[Figure: input layer (r_1^in ... r_n^in), weights w^h, hidden layer (r_1^h ...), weights w^out, output layer (r_1^out ... r_n^out)]

Multi-Layer Networks each character is a 12 x 13 matrix

Multi-Layer Networks
- 12 x 13 = 156 input neurons
- 26 neurons in the intermediate (hidden) layer
- 26 neurons in the output layer
- total # of synaptic weights = (156 x 26) + 26 + (26 x 26) + 26 = 4,784 (including one bias weight per hidden and output neuron)
[Figure: the input-hidden-output network diagram, with weight matrices w^h and w^out]
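
A minimal backpropagation sketch for a two-layer network of this general shape; the task (XOR), layer sizes, learning rate, and epoch count are toy choices for illustration, not the character-recognition network on the slide:

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # toy task: XOR, which no single-layer perceptron can solve
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    # 2 inputs -> 4 hidden units -> 1 output
    W1 = rng.normal(0.0, 1.0, (2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(0.0, 1.0, (4, 1)); b2 = np.zeros(1)
    lr = 1.0

    for epoch in range(10000):
        # forward pass
        hid = sigmoid(X @ W1 + b1)
        out = sigmoid(hid @ W2 + b2)
        # backward pass: squared-error gradients through the sigmoids
        d_out = (out - y) * out * (1.0 - out)
        d_hid = (d_out @ W2.T) * hid * (1.0 - hid)
        # gradient-descent weight updates
        W2 -= lr * hid.T @ d_out; b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_hid;   b1 -= lr * d_hid.sum(axis=0)

    print(np.round(out.ravel(), 2))  # typically converges toward [0, 1, 1, 0]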

Unsupervised Learning & Hopfield Nets
- supervised learning requires a training set of input-output examples and a detailed teaching signal
- unsupervised learning approaches allow for learning about properties of the world by exposure only to inputs (exemplars)
- Hopfield networks: an auto-associative memory; outputs fully connected to all inputs
- the network responds with an output close to a previously learned input pattern
- the network updates dynamically, iterating over time and converging on equilibrium states: a dynamical system with point attractors

Hopfield Network
- train a network on four 5x5 patterns
- each 5x5 pattern is represented by a vector of length 25
- there are 25 x 25 = 625 weights

Hopfield Network
[Figure: recall from a noisy input; over iterations 1-6 the network state converges to the stored pattern]

Hopfield Network
[Figure: recall from a partial input; over iterations 1-6 the network state converges to the stored pattern]
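
A minimal sketch of this kind of Hopfield recall, using Hebbian outer-product training on +1/-1 patterns; the random patterns, the noise level, and the synchronous update scheme are illustrative choices, not the slide's exact demo:

    import numpy as np

    rng = np.random.default_rng(1)

    def train_hopfield(patterns):
        # Hebbian outer-product rule: W = (1/n) * sum_p p p^T over the
        # stored +1/-1 patterns, with self-connections zeroed
        n = patterns.shape[1]
        W = np.zeros((n, n))
        for p in patterns:
            W += np.outer(p, p)
        np.fill_diagonal(W, 0.0)
        return W / n

    def recall(W, state, iterations=6):
        # synchronous updates: each unit takes the sign of its summed,
        # weighted input; the state moves toward a stored (or spurious) attractor
        for _ in range(iterations):
            state = np.where(W @ state >= 0.0, 1, -1)
        return state

    # four random length-25 (5x5) +1/-1 patterns, as on the slide; note that
    # four random patterns sit near the ~0.14 N capacity limit, so recall
    # may occasionally be imperfect
    patterns = rng.choice([-1, 1], size=(4, 25))
    W = train_hopfield(patterns)            # 25 x 25 = 625 weights

    noisy = patterns[0].copy()
    flipped = rng.choice(25, size=5, replace=False)
    noisy[flipped] *= -1                    # corrupt 5 of the 25 bits
    recovered = recall(W, noisy)
    print("bits matching stored pattern:", int(np.sum(recovered == patterns[0])), "/ 25")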

Musculoskeletal & Neuromuscular Models: Sensory-Motor Control
- in order to understand what neural control signals to muscles look like, it's necessary to have a detailed account of the neuromuscular plant
- imagine you were studying the control inputs to a mechanical system containing springs, but you didn't know that springs exert force in response to mechanical deformation: what would you conclude about the control signals?

Musculoskeletal & Neuromuscular Models
For each muscle, we model the dependence of force on muscle length, on the velocity of muscle lengthening or shortening, the graded development of force over time, and the passive elastic stiffness of muscle (see Fig. 1). The model is a variant of that described by Zajac (1989), with activation and contraction dynamics and passive muscle stiffness.

[x]^+ = x, if x > 0
        0, if x <= 0        (2)

The muscle lengths and velocities associated with positive values of A thus define an activation area, a region in ...

FIG. 1. Schematic representation of the muscle model. Model includes the dependence of force on muscle length and velocity, graded force development over time, passive elastic stiffness, and length- and velocity-dependent afferent input.
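
A minimal sketch of the rectification in Eq. 2, which clips negative values to zero so that only positive values of the activation variable A contribute; the example values are illustrative:

    import numpy as np

    def rectify(x):
        # Eq. 2: [x]^+ = x if x > 0, else 0
        return np.maximum(x, 0.0)

    A = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    print(rectify(A))   # [0.  0.  0.  0.5 2. ]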