Neural Networks. Understanding the Brain


Overview
- Threshold units
- Gradient descent
- Multilayer networks
- Backpropagation
- Hidden layer representations
- Example: Face Recognition
- Advanced topics, and more

Neural networks are networks of processing units (neurons) with connections (synapses) between them:
- Large number of neurons: about 10^10
- Large connectivity: about 10^5 connections per neuron
- Parallel processing
- Distributed computation/memory
- Robust to noise and failures

Blue slides: from Mitchell. Orange slides: from Alpaydin, Lecture Notes for E. Alpaydın, Introduction to Machine Learning, The MIT Press, 2004.

Understanding the Brain

Levels of analysis (Marr, 1982):
1. Computational theory
2. Representation and algorithm
3. Hardware implementation

Reverse engineering works from the hardware back to the theory. Parallel processing: SIMD vs. MIMD; a neural net is SIMD with modifiable local memory. Learning: update by training/experience.

Biological Neurons and Networks
- Neuron switching time: about 0.001 second (1 ms)
- Number of neurons: about 10^10
- Connections per neuron: about 10^4 to 10^5
- Scene recognition time: about 0.1 second (100 ms)
- 100 processing steps doesn't seem like enough, so there must be much parallel computation

Artificial Neural Networks
- Many neuron-like threshold switching units (real-valued)
- Many weighted interconnections among units (input layer i, hidden layer j, output layer k, with weights w_ji and w_kj)
- Highly parallel, distributed processing
- Emphasis on tuning the weights automatically: new learning algorithms, new optimization techniques, new learning principles

Biologically Motivated (or Accurate) Neural Networks
- Spiking neurons, complex morphological models, detailed dynamical models
- Connectivity either based on or trained to mimic biology
- Focus on modeling network/neural/subneural processes
- Focus on natural principles of neural computation
- Different forms of learning: spike-timing-dependent plasticity, covariance learning, short-term and long-term plasticity, etc.

When to Consider Neural Networks
- Input is high-dimensional, discrete or real-valued (e.g., raw sensor input)
- Output is discrete or real-valued
- Output is a vector of values
- Possibly noisy data
- Long training time is acceptable (may need occasional, extensive retraining)
- Form of the target function is unknown
- Fast evaluation of the learned target function is needed
- Human readability of the result is unimportant

Example Applications (more later)
- Driving a car on the highway: ALVINN, with a 30x32 sensor input retina, 4 hidden units, and 30 output units covering steering directions from Sharp Left through Straight Ahead to Sharp Right
- Speech synthesis
- Handwritten character recognition (from yann.lecun.com)
- Financial prediction, transaction fraud detection (a big issue lately)

Perceptrons

A perceptron takes inputs x_1, ..., x_n with weights w_1, ..., w_n, plus a fixed input x_0 = 1 with bias weight w_0, and outputs a thresholded weighted sum:

  o(x_1, ..., x_n) = 1 if w_0 + w_1 x_1 + ... + w_n x_n > 0, and -1 otherwise

Hypothesis Space of Perceptrons

The tunable parameters are the weights w_0, w_1, ..., w_n, so the space H of candidate hypotheses is the set of all possible real-valued weight vectors:

  H = { w | w ∈ R^(n+1) }

Sometimes we'll use simpler vector notation:

  o(x) = 1 if w · x > 0, and -1 otherwise

Boolean Logic Gates with Perceptron Units

With suitable weights, single perceptron units implement AND, OR, and NOT.

What Perceptrons Can Represent

Perceptrons can represent the basic Boolean functions, so a network of perceptron units can compute any Boolean function. But what about XOR or EQUIV? Perceptrons can only represent linearly separable functions. Output of the perceptron (Russell & Norvig): if W_1 I_1 + W_2 I_2 > W_0, the output is 1; otherwise the output is 0. The hypothesis space is thus a collection of separating lines.
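The threshold unit and the Boolean gates above can be sketched in a few lines of Python; the particular weight values chosen below for AND, OR, and NOT are illustrative (many settings work):

```python
# A minimal sketch of a threshold unit o(x) = 1 if w . x > 0 else -1,
# with a fixed bias input x0 = 1. The gate weights are illustrative choices.

def perceptron(w, x):
    """Threshold unit; w[0] is the bias weight, x excludes x0 = 1."""
    net = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
    return 1 if net > 0 else -1

# Boolean gates over inputs in {-1, +1}, with -1 playing the role of "false":
AND_W = [-0.8, 0.5, 0.5]   # fires only when both inputs are +1
OR_W  = [ 0.8, 0.5, 0.5]   # fires when at least one input is +1
NOT_W = [ 0.0, -1.0]       # inverts a single input
```

For example, AND on (+1, +1) gives net = -0.8 + 0.5 + 0.5 = 0.2 > 0, hence output 1, while any input of -1 pulls net below zero.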

Geometric Interpretation

Rearranging the condition W_1 I_1 + W_2 I_2 > W_0, we get (if W_2 > 0):

  I_2 > -(W_1 / W_2) I_1 + W_0 / W_2

so for points above the line the output is 1, and 0 for those below the line. Compare with the line equation y = m x + b: the weights determine the slope -(W_1 / W_2) and the intercept W_0 / W_2.

The Role of the Bias

Without the bias (W_0), learning is limited to adjusting the slope of a separating line that passes through the origin; with the bias, the line can also be shifted. (The slides show three example lines with different weights.)

Limitation of Perceptrons

Only functions where the -1 points and the +1 points are clearly separable can be represented by perceptrons.

Generalizing to n Dimensions

The geometric interpretation generalizes to functions of n arguments, i.e., a perceptron with n inputs plus one threshold (or bias) unit. With normal vector n = (a, b, c) and points x = (x, y, z), x_0 = (x_0, y_0, z_0), the equation of a plane is n · (x - x_0) = 0, in short a x + b y + c z + d = 0 (see http://mathworld.wolfram.com/Plane.html), where a, b, c serve as the weights and d = -n · x_0 as the bias. For an n-dimensional input space, the decision boundary becomes an (n-1)-dimensional hyperplane (one dimension less than the input space).

Linear Separability

For functions that take integer or real values as arguments and output either -1 or 1: if a straight line can be drawn between the classes, the function is linearly separable; otherwise it is not, and perceptrons cannot represent it. Of the basic Boolean functions, AND and OR are linearly separable, but XOR is not. Perceptrons cannot represent XOR! (Minsky and Papert, 1969)

XOR in Detail

Suppose a perceptron with output 1 when W_1 I_1 + W_2 I_2 > W_0 (and 0 otherwise) computed XOR. The four input patterns require:
1. (0, 0) → 0: 0 ≤ W_0
2. (0, 1) → 1: W_2 > W_0
3. (1, 0) → 1: W_1 > W_0
4. (1, 1) → 0: W_1 + W_2 ≤ W_0

From 2, 3, and 4: 2 W_0 < W_1 + W_2 ≤ W_0, so W_0 < 0, but from 1, W_0 ≥ 0, a contradiction.

Learning: Perceptron Rule

The weights do not have to be calculated manually. We can train the network with (input, output) pairs according to the following weight update rule:

  w_i ← w_i + η (t - o) x_i

where η is the learning rate parameter. This rule is proven to converge if the input set is linearly separable and η is sufficiently small.
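The perceptron rule itself is a short loop. The sketch below (function names and constants are illustrative) trains on AND over {-1, +1} inputs, which is linearly separable, so the rule converges:

```python
# A sketch of the perceptron training rule w_i <- w_i + eta*(t - o)*x_i.

def train_perceptron(examples, eta=0.1, epochs=100):
    n = len(examples[0][0])
    w = [0.0] * (n + 1)               # w[0] is the bias weight (x0 = 1)
    for _ in range(epochs):
        errors = 0
        for x, t in examples:
            net = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
            o = 1 if net > 0 else -1
            if o != t:                # the rule only changes w on mistakes
                errors += 1
                w[0] += eta * (t - o)
                for i, xi in enumerate(x):
                    w[i + 1] += eta * (t - o) * xi
        if errors == 0:               # a full pass with no mistakes: done
            break
    return w

# AND over {-1, +1} inputs is linearly separable, so training converges:
and_data = [([-1, -1], -1), ([-1, 1], -1), ([1, -1], -1), ([1, 1], 1)]
w = train_perceptron(and_data)
```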

Learning in Perceptrons (Cont'd)

With w_i ← w_i + η (t - o) x_i:
- When t = o, the weight stays the same.
- When t = 1 and o = -1, the change in weight is η(1 - (-1)) x_i > 0 if the x_i are all positive. Thus w · x will increase, and eventually the output o will turn to 1.
- When t = -1 and o = 1, the change in weight is η(-1 - 1) x_i < 0 if the x_i are all positive. Thus w · x will decrease, and eventually the output o will turn to -1.

Learning in Perceptron: Another Look

A perceptron with weight vector w = (a, b) and bias corresponds to a separating line (why? see the geometric interpretation above). Learning can be thought of as adjustment of w, turning it toward the input vector x:

  w ← w + η (t - o) x

Adjustment of the bias moves the line closer to or farther from the origin.

Another Learning Rule: Delta Rule

The perceptron rule cannot deal with noisy data. The delta rule will find an approximate solution even when the input set is not linearly separable.

Gradient Descent

Use a linear unit without the step function: o(x) = w · x. We want to reduce the error by adjusting w, i.e., to minimize

  E(w) = (1/2) Σ_{d∈D} (t_d - o_d)²

Note: the error surface is defined by the training data D; a different data set will give a different surface. E(w_0, w_1) is the error function above, and we want to move (w_0, w_1) to a position of low E.

Gradient Descent (Cont'd)

The gradient is

  ∇E[w] = [∂E/∂w_0, ∂E/∂w_1, ..., ∂E/∂w_n]

Training rule: Δw = -η ∇E[w], i.e., Δw_i = -η ∂E/∂w_i. The gradient points in the direction of maximum increase and is perpendicular to the level curves (the uphill direction), so we step against it.

Computing the gradient term by term:

  ∂E/∂w_i = ∂/∂w_i (1/2) Σ_d (t_d - o_d)²
          = (1/2) Σ_d 2 (t_d - o_d) ∂/∂w_i (t_d - o_d)
          = Σ_d (t_d - o_d) ∂/∂w_i (t_d - w · x_d)
          = Σ_d (t_d - o_d)(-x_{i,d})

Gradient Descent: Summary

Gradient-Descent(training_examples, η): each training example is a pair ⟨x, t⟩, where x is the vector of input values and t is the target output value; η is the learning rate (e.g., 0.05).
- Initialize each w_i to some small random value.
- Until the termination condition is met, do:
  - Initialize each Δw_i to zero.
  - For each ⟨x, t⟩ in training_examples, do:
    - Input the instance x to the unit and compute the output o.
    - For each linear unit weight w_i, do: Δw_i ← Δw_i + η (t - o) x_i
  - For each linear unit weight w_i, do: w_i ← w_i + Δw_i

Since we want Δw_i = -η ∂E/∂w_i, this gives Δw_i = η Σ_d (t_d - o_d) x_{i,d}.
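The summarized procedure translates directly into code. The sketch below (data set and constants are illustrative) fits a linear unit to noisy samples of t = 2x + 1:

```python
# A sketch of batch gradient descent for a linear unit o(x) = w . x
# (bias folded in as x0 = 1), minimizing E = 1/2 * sum_d (t_d - o_d)^2.

def gradient_descent(data, eta=0.05, steps=500):
    n = len(data[0][0])
    w = [0.0] * (n + 1)
    for _ in range(steps):
        delta = [0.0] * (n + 1)
        for x, t in data:
            xs = [1.0] + list(x)                    # prepend the bias input
            o = sum(wi * xi for wi, xi in zip(w, xs))
            for i, xi in enumerate(xs):
                delta[i] += eta * (t - o) * xi      # accumulate over all of D
        w = [wi + di for wi, di in zip(w, delta)]   # one update per pass
    return w

# Noisy samples of t = 2*x + 1; the unit should approach the least-squares
# fit, which for these four points is slope 1.96 and intercept 1.06:
data = [([0.0], 1.1), ([1.0], 2.9), ([2.0], 5.1), ([3.0], 6.9)]
w = gradient_descent(data)
```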

Gradient Descent Properties

Gradient descent is effective in searching through a large or infinite H when H contains continuously parameterized hypotheses and the error can be differentiated with respect to the parameters. Limitations: convergence can be slow, and it finds local minima (the global minimum is not guaranteed).

Stochastic Approximation to Gradient Descent

Avoiding local minima: incremental gradient descent, also called stochastic gradient descent. Instead of a weight update based on all the inputs in D, immediately update the weights after each input example:

  Δw_i = η (t - o) x_i   instead of   Δw_i = η Σ_{d∈D} (t_d - o_d) x_{i,d}

This can be seen as minimizing the per-example error function E_d(w) = (1/2)(t_d - o_d)².

Standard and Stochastic Gradient Descent: Differences
- In the standard version, the error is defined over the entire D.
- In the standard version, more computation is needed per weight update, but η can be larger.
- The stochastic version can sometimes avoid local minima.

Summary

The perceptron training rule is guaranteed to succeed if the training examples are linearly separable and the learning rate η is sufficiently small. The linear unit training rule using gradient descent converges asymptotically to the hypothesis with minimum squared error, given a sufficiently small η, even when the training data contains noise and even when the training data is not separable by H.
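The stochastic variant simply moves the weight update inside the loop over examples. A minimal sketch, with an illustrative data set drawn from t = 2x + 1:

```python
# A sketch of the stochastic (incremental) variant: the weights are updated
# immediately after each example instead of once per pass over D.

def stochastic_gd(data, eta=0.05, epochs=200):
    n = len(data[0][0])
    w = [0.0] * (n + 1)
    for _ in range(epochs):
        for x, t in data:
            xs = [1.0] + list(x)
            o = sum(wi * xi for wi, xi in zip(w, xs))
            w = [wi + eta * (t - o) * xi            # per-example update
                 for wi, xi in zip(w, xs)]
    return w

# Noise-free samples of t = 2*x + 1, so the exact fit w = [1, 2] exists:
w = stochastic_gd([([0.0], 1.0), ([1.0], 3.0), ([2.0], 5.0)])
```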

Exercise: Implementing the Perceptron

It is fairly easy to implement a perceptron. You can implement it in any programming language: C/C++, etc. Look for examples on the web, and Java applet demos.

Multilayer Networks

To train multilayer networks by gradient descent we need a differentiable threshold unit: the sigmoid,

  σ(y) = 1 / (1 + exp(-y)),   so   o = σ(net) = 1 / (1 + e^(-net)),   where net = Σ_i w_i x_i

Interesting property: dσ(y)/dy = σ(y)(1 - σ(y)). Output: o = σ(w · x). Other squashing functions can be used, e.g., tanh(y) = (1 - exp(-2y)) / (1 + exp(-2y)).

Multilayer Networks and Backpropagation

Multilayer networks can form nonlinear decision surfaces, e.g., distinguishing the vowel sounds "head", "hid", "who'd", "hood" from the formants F1 and F2. For inputs x and y: (a) one output unit computes sigm(x + y - 1); (b) two hidden units and one output compute sigm(sigm(x + y - 1) + sigm(-x - y + 3) - 1). Another example: XOR.

Error Gradient for a Sigmoid Unit

With E = (1/2) Σ_{d∈D} (t_d - o_d)²:

  ∂E/∂w_i = ∂/∂w_i (1/2) Σ_d (t_d - o_d)²
          = (1/2) Σ_d 2 (t_d - o_d) ∂/∂w_i (t_d - o_d)
          = Σ_d (t_d - o_d) (-∂o_d/∂w_i)
          = -Σ_d (t_d - o_d) (∂o_d/∂net_d) (∂net_d/∂w_i)
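The sigmoid and its derivative identity are easy to check numerically; a small sketch:

```python
import math

# A sketch of the sigmoid unit: o = sigma(net), with
# sigma(y) = 1 / (1 + e^(-y)) and the identity
# sigma'(y) = sigma(y) * (1 - sigma(y)) used throughout backpropagation.

def sigmoid(y):
    return 1.0 / (1.0 + math.exp(-y))

def sigmoid_deriv(y):
    s = sigmoid(y)
    return s * (1.0 - s)
```

The identity can be verified against a central finite difference of sigmoid at any point.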

Error Gradient for a Sigmoid Unit (Cont'd)

From the previous page:

  ∂E/∂w_i = -Σ_d (t_d - o_d) (∂o_d/∂net_d) (∂net_d/∂w_i)

But we know:

  ∂o_d/∂net_d = ∂σ(net_d)/∂net_d = o_d (1 - o_d)   and   ∂net_d/∂w_i = ∂(w · x_d)/∂w_i = x_{i,d}

So:

  ∂E/∂w_i = -Σ_{d∈D} (t_d - o_d) o_d (1 - o_d) x_{i,d}

Backpropagation Algorithm

Initialize all weights to small random numbers. Until satisfied, do: for each training example, do:
1. Input the training example to the network and compute the network outputs.
2. For each output unit k: δ_k ← o_k (1 - o_k)(t_k - o_k)
3. For each hidden unit h: δ_h ← o_h (1 - o_h) Σ_{k∈outputs} w_kh δ_k
4. Update each network weight: w_ji ← w_ji + Δw_ji, where Δw_ji = η δ_j x_ji

Note: w_ji is the weight from unit i to unit j.

The δ Term

For an output unit, δ_k = o_k (1 - o_k) × (t_k - o_k), i.e., σ'(net_k) times the error. For a hidden unit, δ_h = o_h (1 - o_h) × Σ_{k∈outputs} w_kh δ_k, i.e., σ'(net_h) times the backpropagated error. In sum, δ is the derivative times the error. Derivation to be presented next.

Derivation of Δw

We want to update each weight as Δw_ji = -η ∂E_d/∂w_ji, where the error on example d is defined as

  E_d(w) = (1/2) Σ_{k∈outputs} (t_k - o_k)²

and net_j = Σ_i w_ji x_ji. The formula differs for output and hidden units.
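A compact sketch of the algorithm above for a single hidden layer, trained on XOR (which no single perceptron can represent). The 2-4-1 architecture, learning rate, epoch count, and random seed are all illustrative choices:

```python
import math, random

# A sketch of backpropagation with one hidden layer of sigmoid units,
# trained per-example on XOR. Architecture and constants are illustrative.

def sigmoid(y):
    return 1.0 / (1.0 + math.exp(-y))

def forward(wh, wo, x):
    """wh[j] holds hidden unit j's weights [bias, w1, ...]; wo the output unit's."""
    h = [sigmoid(w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))) for w in wh]
    o = sigmoid(wo[0] + sum(wi * hi for wi, hi in zip(wo[1:], h)))
    return h, o

def train_xor(n_hidden=4, eta=0.5, epochs=20000, seed=1):
    rng = random.Random(seed)
    wh = [[rng.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(n_hidden)]
    wo = [rng.uniform(-0.5, 0.5) for _ in range(n_hidden + 1)]
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
    for _ in range(epochs):
        for x, t in data:
            h, o = forward(wh, wo, x)
            delta_o = o * (1 - o) * (t - o)                  # step 2: output delta
            delta_h = [hj * (1 - hj) * wo[j + 1] * delta_o   # step 3: hidden deltas
                       for j, hj in enumerate(h)]
            wo[0] += eta * delta_o                           # step 4: weight updates
            for j, hj in enumerate(h):
                wo[j + 1] += eta * delta_o * hj
            for j, w in enumerate(wh):
                w[0] += eta * delta_h[j]
                for i, xi in enumerate(x):
                    w[i + 1] += eta * delta_h[j] * xi
    return wh, wo

wh, wo = train_xor()
```

After training, the output should be above 0.5 exactly for the two mixed inputs (0,1) and (1,0); like any gradient descent run, a different seed could land in a poor local minimum.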

Derivation of Δw: Output Unit Weights

Since w_ji influences E_d only through net_j,

  ∂E_d/∂w_ji = (∂E_d/∂net_j)(∂net_j/∂w_ji)

First, calculate ∂E_d/∂o_j:

  ∂E_d/∂o_j = ∂/∂o_j (1/2) Σ_{k∈outputs} (t_k - o_k)² = ∂/∂o_j (1/2)(t_j - o_j)² = -(t_j - o_j)

(only the term with k = j depends on o_j). Next, calculate ∂o_j/∂net_j: since o_j = σ(net_j) and σ'(net_j) = o_j (1 - o_j),

  ∂o_j/∂net_j = o_j (1 - o_j)

Putting everything together:

  ∂E_d/∂net_j = (∂E_d/∂o_j)(∂o_j/∂net_j) = -(t_j - o_j) o_j (1 - o_j)

Since ∂net_j/∂w_ji = x_ji,

  Δw_ji = -η ∂E_d/∂w_ji = η (t_j - o_j) o_j (1 - o_j) x_ji = η δ_j x_ji

where δ_j = (t_j - o_j) o_j (1 - o_j) is the error times σ'(net), and x_ji is the input.

Derivation of Δw: Hidden Unit Weights

For a hidden unit j, E_d depends on net_j through all the units in Downstream(j):

  ∂E_d/∂net_j = Σ_{k∈Downstream(j)} (∂E_d/∂net_k)(∂net_k/∂net_j)
              = Σ_{k∈Downstream(j)} -δ_k (∂net_k/∂net_j)
              = Σ_{k∈Downstream(j)} -δ_k (∂net_k/∂o_j)(∂o_j/∂net_j)
              = Σ_{k∈Downstream(j)} -δ_k w_kj (∂o_j/∂net_j)
              = Σ_{k∈Downstream(j)} -δ_k w_kj o_j (1 - o_j)

Derivation of Δw: Hidden Unit Weights (Cont'd)

Finally, given ∂E_d/∂net_j from the previous page and ∂net_j/∂w_ji = x_ji,

  Δw_ji = -η ∂E_d/∂w_ji = η [o_j (1 - o_j) Σ_{k∈Downstream(j)} δ_k w_kj] x_ji = η δ_j x_ji

where δ_j = o_j (1 - o_j) Σ_{k∈Downstream(j)} δ_k w_kj, i.e., σ'(net) times the backpropagated error.

Extension to Different Network Topologies

Arbitrary number of layers: for neurons in layer m,

  δ_r = o_r (1 - o_r) Σ_{s∈layer m+1} w_sr δ_s

Arbitrary acyclic graph:

  δ_r = o_r (1 - o_r) Σ_{s∈Downstream(r)} w_sr δ_s

Backpropagation: Properties
- Gradient descent over the entire network weight vector.
- Easily generalized to arbitrary directed graphs.
- Will find a local, not necessarily global, error minimum; in practice it often works well (can run multiple times with different initial weights).
- Often includes a weight momentum term α:

  Δw_ij(n) = η δ_j x_ij + α Δw_ij(n - 1)

Representational Power of Feedforward Networks
- Boolean functions: every Boolean function is representable with two layers (the number of hidden units can grow exponentially in the worst case: one hidden unit per input example, then OR them together).
- Continuous functions: every bounded continuous function can be approximated with arbitrarily small error using two layers (output units are linear).
- Arbitrary functions: representable with three layers (output units are linear).

Backpropagation minimizes the error over the training examples: will it generalize well to subsequent examples? Training can take thousands of iterations (slow!), but using the network after training is very fast.
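The momentum term above can be sketched as a small helper; all names and values here are illustrative:

```python
# A sketch of the momentum update: the current weight change blends the
# gradient step with the previous change,
#   dw(n) = eta * delta_j * x + alpha * dw(n-1).

def momentum_step(w, grad_step, dw_prev, alpha=0.9):
    """grad_step[i] plays the role of eta * delta_j * x_i for this update."""
    dw = [g + alpha * p for g, p in zip(grad_step, dw_prev)]
    return [wi + di for wi, di in zip(w, dw)], dw
```

When successive gradient steps point the same way, the effective step grows toward 1/(1 - α) times the plain gradient step, which speeds travel across flat regions of the error surface.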

H-Space Search and Inductive Bias

The hypothesis space is the n-dimensional weight space (when there are n weights). The space is continuous, unlike in decision tree or general-to-specific concept learning algorithms. Inductive bias: smooth interpolation between data points.

Learning Hidden Layer Representations

Consider a network trained to reproduce its one-of-eight input pattern at the output through a bottleneck of three hidden units. The learned encoding at the hidden layer turns out to be similar to the standard 3-bit binary code: the hidden layer representation is compressed. Automatic discovery of useful hidden layer representations is a key feature of ANNs.

Overfitting

Plots of error versus number of weight updates (from two different robot perception tasks) show the training set error decreasing steadily while the validation set error eventually rises. Early stopping ensures good performance on unobserved samples, but one must be careful about when to stop. Weight decay, use of validation sets, use of k-fold cross-validation, etc., help overcome the problem.

Alternative Error Functions

Penalize large weights:

  E(w) = (1/2) Σ_{d∈D} Σ_{k∈outputs} (t_kd - o_kd)² + γ Σ_{i,j} w_ji²

Train on target slopes as well as values (when the slope is available):

  E(w) = (1/2) Σ_{d∈D} Σ_{k∈outputs} [ (t_kd - o_kd)² + μ Σ_{j∈inputs} (∂t_kd/∂x_j^d - ∂o_kd/∂x_j^d)² ]

Tie weights together, e.g., in a phoneme recognition network or in handwritten character recognition (weight sharing).

Recurrent Networks

A recurrent network feeds its hidden activations back as input through a delay, giving the network a context that evolves over time. Uses:
- Sequence recognition.
- Storing tree structure by autoassociation (input = output), e.g., encoding a stack of symbols such as A, then (A, B), then (C, A, B): the stack is represented using the hidden layer representation.
- Such networks can be trained with plain backpropagation; generalization may not be perfect, and accuracy depends on numerical precision.
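The first alternative error function above adds a penalty γ Σ w_ji² to E, which contributes 2γ w_ji to the gradient, so every update also shrinks the weights (weight decay). A minimal sketch, with illustrative η and γ:

```python
# A sketch of one gradient step on the penalized error
# E' = E + gamma * sum_ji w_ji^2: the penalty adds 2*gamma*w to the
# gradient, so each update also decays the weights toward zero.

def penalized_step(w, grad_E, eta=0.1, gamma=0.05):
    """grad_E[i] is dE/dw_i for the unpenalized error E."""
    return [wi - eta * (g + 2 * gamma * wi) for wi, g in zip(w, grad_E)]
```

With a zero data gradient, the weights simply decay: each step multiplies them by (1 - 2ηγ).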

Learning Time

Applications involving time:
- Sequence recognition: speech recognition
- Sequence reproduction: time-series prediction
- Sequence association

Network architectures:
- Time-delay networks (Waibel et al., 1989)
- Recurrent networks (Rumelhart et al., 1986), trained by unfolding in time

(Lecture Notes for E. Alpaydın, Introduction to Machine Learning, The MIT Press, 2004.)

Some Applications: NETtalk

NETtalk (Sejnowski and Rosenberg, 1987) learns to pronounce English text. The data set, about 20,000 words, is available in the UCI ML repository; each entry lists a word, its phoneme pronunciation, and stress/syllable markers (e.g., aardvark, a-rdvark, ...). A demo is available.

Backpropagation Exercise

URL: http://www.cs.tamu.edu/faculty/choe/src/backprop-1.6.tar.gz

Untar and read the README file:

  gzip -dc backprop-1.6.tar.gz | tar xvf -

Run make to build (on departmental unix machines), then run ./bp conf/xor.conf, etc.

Backpropagation: Example Results

Error curves for OR, AND, and XOR over training epochs (an epoch is one full cycle of training through all training input patterns) show that OR was easiest to learn, AND the next, and XOR the most difficult. The network had 2 input, 2 hidden, and 1 output unit.

Backpropagation: Example Results (Cont'd)

In the resulting output plots, the outputs o(0,0), o(0,1), o(1,0), and o(1,1) form each row for OR, AND, and XOR.

Backpropagation: Things to Try
- How does increasing the number of hidden layer units affect (1) the time and (2) the number of epochs of training?
- How does increasing or decreasing the learning rate affect the rate of convergence?
- How does changing the slope of the sigmoid affect the rate of convergence?
- Try different problem domains: handwriting recognition, etc.

Structured MLP

Weight sharing in structured multilayer perceptrons (Le Cun et al., 1989).

Tuning the Network Size

Destructive approach, weight decay: shrink every weight a little at each update,

  Δw_i = -η ∂E/∂w_i - λ w_i,   equivalent to minimizing   E' = E + (λ/2) Σ_i w_i²

Constructive approach: growing networks (Ash, 1989; Fahlman and Lebiere, 1989).

Bayesian Learning

Consider the weights w_i as random variables with a prior p(w). The MAP estimate is

  ŵ_MAP = argmax_w p(w | X),   with   log p(w | X) = log p(X | w) + log p(w) - log p(X)

With independent Gaussian priors p(w) = Π_i p(w_i), where p(w_i) = c · exp(-w_i² / (2 · (1/2λ))), maximizing the log posterior amounts to minimizing

  E' = E + λ ||w||²

so weight decay, ridge regression, and regularization all take the form: cost = data-misfit + λ × complexity.

Summary
- ANN learning provides a general method for learning real-valued functions over continuous or discrete-valued attributes.
- ANNs are robust to noise.
- H is the space of all functions parameterized by the weights.
- H-space search is through gradient descent: convergence to local minima.
- Backpropagation gives novel hidden layer representations.
- Overfitting is an issue.
- More advanced algorithms exist.

Deep Learning: Theory, Techniques & Applications - Recurrent Neural Networks -

Deep Learning: Theory, Techniques & Applications - Recurrent Neural Networks - Deep Learning: Theory, Techniques & Applicaions - Recurren Neural Neworks - Prof. Maeo Maeucci maeo.maeucci@polimi.i Deparmen of Elecronics, Informaion and Bioengineering Arificial Inelligence and Roboics

More information

CSCE 496/896 Lecture 2: Basic Artificial Neural Networks. Stephen Scott. Introduction. Supervised Learning. Basic Units.

CSCE 496/896 Lecture 2: Basic Artificial Neural Networks. Stephen Scott. Introduction. Supervised Learning. Basic Units. (Adaped from Vinod Variyam, Ehem Alpaydin, Tom Michell, Ian Goodfellow, and Aurélien Géron) learning is mos fundamenal, classic form machine learning par comes from he par labels for examples (insances)

More information

CHAPTER 10 VALIDATION OF TEST WITH ARTIFICAL NEURAL NETWORK

CHAPTER 10 VALIDATION OF TEST WITH ARTIFICAL NEURAL NETWORK 175 CHAPTER 10 VALIDATION OF TEST WITH ARTIFICAL NEURAL NETWORK 10.1 INTRODUCTION Amongs he research work performed, he bes resuls of experimenal work are validaed wih Arificial Neural Nework. From he

More information

Slide03 Historical Overview Haykin Chapter 3 (Chap 1, 3, 3rd Ed): Single-Layer Perceptrons Multiple Faces of a Single Neuron Part I: Adaptive Filter

Slide03 Historical Overview Haykin Chapter 3 (Chap 1, 3, 3rd Ed): Single-Layer Perceptrons Multiple Faces of a Single Neuron Part I: Adaptive Filter Slide3 Haykin Chaper 3 (Chap, 3, 3rd Ed): Single-Layer Perceprons CPSC 636-6 Insrucor: Yoonsuck Choe Hisorical Overview McCulloch and Pis (943): neural neworks as compuing machines. Hebb (949): posulaed

More information

The Rosenblatt s LMS algorithm for Perceptron (1958) is built around a linear neuron (a neuron with a linear

The Rosenblatt s LMS algorithm for Perceptron (1958) is built around a linear neuron (a neuron with a linear In The name of God Lecure4: Percepron and AALIE r. Majid MjidGhoshunih Inroducion The Rosenbla s LMS algorihm for Percepron 958 is buil around a linear neuron a neuron ih a linear acivaion funcion. Hoever,

More information

INTRODUCTION TO MACHINE LEARNING 3RD EDITION

INTRODUCTION TO MACHINE LEARNING 3RD EDITION ETHEM ALPAYDIN The MIT Press, 2014 Lecure Slides for INTRODUCTION TO MACHINE LEARNING 3RD EDITION alpaydin@boun.edu.r hp://www.cmpe.boun.edu.r/~ehem/i2ml3e CHAPTER 2: SUPERVISED LEARNING Learning a Class

More information

Dimitri Solomatine. D.P. Solomatine. Data-driven modelling (part 2). 2

Dimitri Solomatine. D.P. Solomatine. Data-driven modelling (part 2). 2 Daa-driven modelling. Par. Daa-driven Arificial di Neural modelling. Newors Par Dimiri Solomaine Arificial neural newors D.P. Solomaine. Daa-driven modelling par. 1 Arificial neural newors ANN: main pes

More information

Ensamble methods: Boosting

Ensamble methods: Boosting Lecure 21 Ensamble mehods: Boosing Milos Hauskrech milos@cs.pi.edu 5329 Senno Square Schedule Final exam: April 18: 1:00-2:15pm, in-class Term projecs April 23 & April 25: a 1:00-2:30pm in CS seminar room

More information

Ensamble methods: Bagging and Boosting

Ensamble methods: Bagging and Boosting Lecure 21 Ensamble mehods: Bagging and Boosing Milos Hauskrech milos@cs.pi.edu 5329 Senno Square Ensemble mehods Mixure of expers Muliple base models (classifiers, regressors), each covers a differen par

More information

Predator - Prey Model Trajectories and the nonlinear conservation law

Predator - Prey Model Trajectories and the nonlinear conservation law Predaor - Prey Model Trajecories and he nonlinear conservaion law James K. Peerson Deparmen of Biological Sciences and Deparmen of Mahemaical Sciences Clemson Universiy Ocober 28, 213 Ouline Drawing Trajecories

More information

PENALIZED LEAST SQUARES AND PENALIZED LIKELIHOOD

PENALIZED LEAST SQUARES AND PENALIZED LIKELIHOOD PENALIZED LEAST SQUARES AND PENALIZED LIKELIHOOD HAN XIAO 1. Penalized Leas Squares Lasso solves he following opimizaion problem, ˆβ lasso = arg max β R p+1 1 N y i β 0 N x ij β j β j (1.1) for some 0.

More information

Learning a Class from Examples. Training set X. Class C 1. Class C of a family car. Output: Input representation: x 1 : price, x 2 : engine power

Learning a Class from Examples. Training set X. Class C 1. Class C of a family car. Output: Input representation: x 1 : price, x 2 : engine power Alpaydin Chaper, Michell Chaper 7 Alpaydin slides are in urquoise. Ehem Alpaydin, copyrigh: The MIT Press, 010. alpaydin@boun.edu.r hp://www.cmpe.boun.edu.r/ ehem/imle All oher slides are based on Michell.

More information

Learning a Class from Examples. Training set X. Class C 1. Class C of a family car. Output: Input representation: x 1 : price, x 2 : engine power

Learning a Class from Examples. Training set X. Class C 1. Class C of a family car. Output: Input representation: x 1 : price, x 2 : engine power Alpaydin Chaper, Michell Chaper 7 Alpaydin slides are in urquoise. Ehem Alpaydin, copyrigh: The MIT Press, 010. alpaydin@boun.edu.r hp://www.cmpe.boun.edu.r/ ehem/imle All oher slides are based on Michell.

More information

Artificial Neural Networks

Artificial Neural Networks Artificial Neural Networks Threshold units Gradient descent Multilayer networks Backpropagation Hidden layer representations Example: Face Recognition Advanced topics 1 Connectionist Models Consider humans:

More information

1 Review of Zero-Sum Games

1 Review of Zero-Sum Games COS 5: heoreical Machine Learning Lecurer: Rob Schapire Lecure #23 Scribe: Eugene Brevdo April 30, 2008 Review of Zero-Sum Games Las ime we inroduced a mahemaical model for wo player zero-sum games. Any

More information

CSE/NB 528 Lecture 14: From Supervised to Reinforcement Learning (Chapter 9) R. Rao, 528: Lecture 14

CSE/NB 528 Lecture 14: From Supervised to Reinforcement Learning (Chapter 9) R. Rao, 528: Lecture 14 CSE/NB 58 Lecure 14: From Supervised o Reinforcemen Learning Chaper 9 1 Recall from las ime: Sigmoid Neworks Oupu v T g w u g wiui w Inpu nodes u = u 1 u u 3 T i Sigmoid oupu funcion: 1 g a 1 a e 1 ga

More information

Article from. Predictive Analytics and Futurism. July 2016 Issue 13

Article from. Predictive Analytics and Futurism. July 2016 Issue 13 Aricle from Predicive Analyics and Fuurism July 6 Issue An Inroducion o Incremenal Learning By Qiang Wu and Dave Snell Machine learning provides useful ools for predicive analyics The ypical machine learning

More information

Vehicle Arrival Models : Headway

Vehicle Arrival Models : Headway Chaper 12 Vehicle Arrival Models : Headway 12.1 Inroducion Modelling arrival of vehicle a secion of road is an imporan sep in raffic flow modelling. I has imporan applicaion in raffic flow simulaion where

More information

2.7. Some common engineering functions. Introduction. Prerequisites. Learning Outcomes

2.7. Some common engineering functions. Introduction. Prerequisites. Learning Outcomes Some common engineering funcions 2.7 Inroducion This secion provides a caalogue of some common funcions ofen used in Science and Engineering. These include polynomials, raional funcions, he modulus funcion

More information

Course Notes for EE227C (Spring 2018): Convex Optimization and Approximation

Course Notes for EE227C (Spring 2018): Convex Optimization and Approximation Course Noes for EE7C Spring 018: Convex Opimizaion and Approximaion Insrucor: Moriz Hard Email: hard+ee7c@berkeley.edu Graduae Insrucor: Max Simchowiz Email: msimchow+ee7c@berkeley.edu Ocober 15, 018 3

More information

t is a basis for the solution space to this system, then the matrix having these solutions as columns, t x 1 t, x 2 t,... x n t x 2 t...

t is a basis for the solution space to this system, then the matrix having these solutions as columns, t x 1 t, x 2 t,... x n t x 2 t... Mah 228- Fri Mar 24 5.6 Marix exponenials and linear sysems: The analogy beween firs order sysems of linear differenial equaions (Chaper 5) and scalar linear differenial equaions (Chaper ) is much sronger

More information

Section 3.5 Nonhomogeneous Equations; Method of Undetermined Coefficients

Section 3.5 Nonhomogeneous Equations; Method of Undetermined Coefficients Secion 3.5 Nonhomogeneous Equaions; Mehod of Undeermined Coefficiens Key Terms/Ideas: Linear Differenial operaor Nonlinear operaor Second order homogeneous DE Second order nonhomogeneous DE Soluion o homogeneous

More information

Zürich. ETH Master Course: L Autonomous Mobile Robots Localization II

Zürich. ETH Master Course: L Autonomous Mobile Robots Localization II Roland Siegwar Margaria Chli Paul Furgale Marco Huer Marin Rufli Davide Scaramuzza ETH Maser Course: 151-0854-00L Auonomous Mobile Robos Localizaion II ACT and SEE For all do, (predicion updae / ACT),

More information

Tasty Coffee example

Tasty Coffee example Lecure Slides for (Binary) Classificaion: Learning a Class from labeled Examples ITRODUCTIO TO Machine Learning ETHEM ALPAYDI The MIT Press, 00 (modified by dph, 0000) CHAPTER : Supervised Learning Things

More information

Lecture 2-1 Kinematics in One Dimension Displacement, Velocity and Acceleration Everything in the world is moving. Nothing stays still.

Lecture 2-1 Kinematics in One Dimension Displacement, Velocity and Acceleration Everything in the world is moving. Nothing stays still. Lecure - Kinemaics in One Dimension Displacemen, Velociy and Acceleraion Everyhing in he world is moving. Nohing says sill. Moion occurs a all scales of he universe, saring from he moion of elecrons in

More information

Matlab and Python programming: how to get started

Matlab and Python programming: how to get started Malab and Pyhon programming: how o ge sared Equipping readers he skills o wrie programs o explore complex sysems and discover ineresing paerns from big daa is one of he main goals of his book. In his chaper,

More information

GMM - Generalized Method of Moments

GMM - Generalized Method of Moments GMM - Generalized Mehod of Momens Conens GMM esimaion, shor inroducion 2 GMM inuiion: Maching momens 2 3 General overview of GMM esimaion. 3 3. Weighing marix...........................................

More information

An introduction to the theory of SDDP algorithm

An introduction to the theory of SDDP algorithm An inroducion o he heory of SDDP algorihm V. Leclère (ENPC) Augus 1, 2014 V. Leclère Inroducion o SDDP Augus 1, 2014 1 / 21 Inroducion Large scale sochasic problem are hard o solve. Two ways of aacking

More information

EXERCISES FOR SECTION 1.5

EXERCISES FOR SECTION 1.5 1.5 Exisence and Uniqueness of Soluions 43 20. 1 v c 21. 1 v c 1 2 4 6 8 10 1 2 2 4 6 8 10 Graph of approximae soluion obained using Euler s mehod wih = 0.1. Graph of approximae soluion obained using Euler

More information

Notes on Kalman Filtering

Notes on Kalman Filtering Noes on Kalman Filering Brian Borchers and Rick Aser November 7, Inroducion Daa Assimilaion is he problem of merging model predicions wih acual measuremens of a sysem o produce an opimal esimae of he curren

More information

Non-parametric techniques. Instance Based Learning. NN Decision Boundaries. Nearest Neighbor Algorithm. Distance metric important

Non-parametric techniques. Instance Based Learning. NN Decision Boundaries. Nearest Neighbor Algorithm. Distance metric important on-parameric echniques Insance Based Learning AKA: neares neighbor mehods, non-parameric, lazy, memorybased, or case-based learning Copyrigh 2005 by David Helmbold 1 Do no fi a model (as do LDA, logisic

More information

Slide04 Haykin Chapter 4: Multi-Layer Perceptrons

Slide04 Haykin Chapter 4: Multi-Layer Perceptrons Introduction Slide4 Hayin Chapter 4: Multi-Layer Perceptrons CPSC 636-6 Instructor: Yoonsuc Choe Spring 28 Networs typically consisting of input, hidden, and output layers. Commonly referred to as Multilayer

More information

Some Basic Information about M-S-D Systems

Some Basic Information about M-S-D Systems Some Basic Informaion abou M-S-D Sysems 1 Inroducion We wan o give some summary of he facs concerning unforced (homogeneous) and forced (non-homogeneous) models for linear oscillaors governed by second-order,

More information

Lecture 9: September 25

Lecture 9: September 25 0-725: Opimizaion Fall 202 Lecure 9: Sepember 25 Lecurer: Geoff Gordon/Ryan Tibshirani Scribes: Xuezhi Wang, Subhodeep Moira, Abhimanu Kumar Noe: LaTeX emplae couresy of UC Berkeley EECS dep. Disclaimer:

More information

Stability and Bifurcation in a Neural Network Model with Two Delays

Stability and Bifurcation in a Neural Network Model with Two Delays Inernaional Mahemaical Forum, Vol. 6, 11, no. 35, 175-1731 Sabiliy and Bifurcaion in a Neural Nework Model wih Two Delays GuangPing Hu and XiaoLing Li School of Mahemaics and Physics, Nanjing Universiy

More information

Simulation-Solving Dynamic Models ABE 5646 Week 2, Spring 2010

Simulation-Solving Dynamic Models ABE 5646 Week 2, Spring 2010 Simulaion-Solving Dynamic Models ABE 5646 Week 2, Spring 2010 Week Descripion Reading Maerial 2 Compuer Simulaion of Dynamic Models Finie Difference, coninuous saes, discree ime Simple Mehods Euler Trapezoid

More information

Non-parametric techniques. Instance Based Learning. NN Decision Boundaries. Nearest Neighbor Algorithm. Distance metric important

Non-parametric techniques. Instance Based Learning. NN Decision Boundaries. Nearest Neighbor Algorithm. Distance metric important on-parameric echniques Insance Based Learning AKA: neares neighbor mehods, non-parameric, lazy, memorybased, or case-based learning Copyrigh 2005 by David Helmbold 1 Do no fi a model (as do LTU, decision

More information

HW6: MRI Imaging Pulse Sequences (7 Problems for 100 pts)

HW6: MRI Imaging Pulse Sequences (7 Problems for 100 pts) HW6: MRI Imaging Pulse Sequences (7 Problems for 100 ps) GOAL The overall goal of HW6 is o beer undersand pulse sequences for MRI image reconsrucion. OBJECTIVES 1) Design a spin echo pulse sequence o image

More information

10. State Space Methods

10. State Space Methods . Sae Space Mehods. Inroducion Sae space modelling was briefly inroduced in chaper. Here more coverage is provided of sae space mehods before some of heir uses in conrol sysem design are covered in he

More information

2.160 System Identification, Estimation, and Learning. Lecture Notes No. 8. March 6, 2006

2.160 System Identification, Estimation, and Learning. Lecture Notes No. 8. March 6, 2006 2.160 Sysem Idenificaion, Esimaion, and Learning Lecure Noes No. 8 March 6, 2006 4.9 Eended Kalman Filer In many pracical problems, he process dynamics are nonlinear. w Process Dynamics v y u Model (Linearized)

More information

23.2. Representing Periodic Functions by Fourier Series. Introduction. Prerequisites. Learning Outcomes

Introduction. In this Section we show how a periodic function can be expressed as a series of sines and cosines. We begin by obtaining some standard integrals
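The Fourier coefficients the section derives, a_n = (2/T)∫f(t)cos(2πnt/T)dt and b_n = (2/T)∫f(t)sin(2πnt/T)dt, can be checked numerically. The square-wave example and the Riemann-sum approximation are illustrative assumptions.

```python
import math

def fourier_coeffs(f, n, period=2 * math.pi, samples=20000):
    """Approximate the Fourier coefficients a_n, b_n of f over one period
    by a left Riemann sum of the defining integrals."""
    T = period
    h = T / samples
    a = b = 0.0
    for k in range(samples):
        t = k * h
        w = 2 * math.pi * n * t / T
        a += f(t) * math.cos(w) * h
        b += f(t) * math.sin(w) * h
    return 2 * a / T, 2 * b / T

square = lambda t: 1.0 if t < math.pi else -1.0   # square wave on [0, 2*pi)
for n in (1, 2, 3):
    a_n, b_n = fourier_coeffs(square, n)
    print(n, round(a_n, 3), round(b_n, 3))
```

For this odd square wave the a_n vanish and b_n is close to 4/(nπ) for odd n and 0 for even n, matching the standard textbook result.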

Hidden Markov Models

Probabilistic reasoning over time. So far, we've mostly dealt with episodic environments. Exceptions: games with multiple moves, planning. In particular, the Bayesian networks we've seen so far describe
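The core HMM computation is the forward algorithm, which sums over all hidden state paths in linear time. A minimal sketch on a toy weather model (the probabilities below are assumed, not from the slides):

```python
def forward(init, trans, emit, observations):
    """HMM forward algorithm: returns P(observations) for the given model.
    init[s], trans[s][s2], emit[s][o] are dicts of probabilities."""
    alpha = {s: init[s] * emit[s][observations[0]] for s in init}
    for o in observations[1:]:
        alpha = {s2: sum(alpha[s1] * trans[s1][s2] for s1 in alpha) * emit[s2][o]
                 for s2 in init}
    return sum(alpha.values())

# Toy model: hidden Rain/Sun, observed umbrella ("u") or none ("n").
init  = {"R": 0.5, "S": 0.5}
trans = {"R": {"R": 0.7, "S": 0.3}, "S": {"R": 0.3, "S": 0.7}}
emit  = {"R": {"u": 0.9, "n": 0.1}, "S": {"u": 0.2, "n": 0.8}}
print(forward(init, trans, emit, ["u", "u", "n"]))
```

The result agrees with brute-force enumeration over all 2^3 hidden sequences, but the recursion costs only O(T · S^2).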

Reading from Young & Freedman: For this topic, read sections 25.4 & 25.5, the introduction to chapter 26 and sections 26.1 to 26.2 & 26.4.

PHY1 Electricity, Topic 7 (Lectures 1 & 11): Electric Circuits. In this topic, we will cover: 1) Electromotive Force (EMF) 2) Series and parallel resistor combinations 3) Kirchhoff's rules for circuits 4) Time dependence

CSE/NB 528 Lecture 14: Reinforcement Learning (Chapter 9)

Image from http://clasdean.la.asu.edu/news/images/ubep2001/neuron3.jpg. Lecture figures are from Dayan & Abbott's book, http://people.brandeis.edu/~abbott/book/index.html
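A central piece of the reinforcement learning chapter is temporal-difference value estimation, V(s) ← V(s) + α(r + γV(s') − V(s)). A minimal sketch on an assumed two-state chain (state 0 → state 1 with reward 0, state 1 → terminal with reward 1), not an example from the lecture:

```python
def td0_value(episodes, alpha=0.1, gamma=1.0):
    """TD(0) value estimation on a deterministic 2-state chain.
    Both values should converge to 1.0 (the total future reward)."""
    V = {0: 0.0, 1: 0.0}
    for _ in range(episodes):
        # transition 0 -> 1, reward 0
        V[0] += alpha * (0.0 + gamma * V[1] - V[0])
        # transition 1 -> terminal, reward 1 (terminal value = 0)
        V[1] += alpha * (1.0 + gamma * 0.0 - V[1])
    return V

V = td0_value(500)
print(round(V[0], 2), round(V[1], 2))
```

Each update nudges a value toward the bootstrapped target r + γV(s'), which is the "prediction error" signal the lecture relates to dopamine responses.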

CHAPTER 12 DIRECT CURRENT CIRCUITS

12.1 RESISTORS IN SERIES AND IN PARALLEL. When two resistors are connected together as shown in Figure 12.1 we said that they are connected in series. As
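The series and parallel combination rules, R = R1 + R2 + … and 1/R = 1/R1 + 1/R2 + …, are easy to encode and compose; the resistor values below are arbitrary examples.

```python
def series(*rs):
    """Equivalent resistance of resistors in series: R = R1 + R2 + ..."""
    return sum(rs)

def parallel(*rs):
    """Equivalent resistance in parallel: 1/R = 1/R1 + 1/R2 + ..."""
    return 1.0 / sum(1.0 / r for r in rs)

print(series(100.0, 220.0))                           # 320.0 ohms
print(round(parallel(100.0, 100.0), 3))               # 50.0 ohms
print(round(series(50.0, parallel(300.0, 150.0)), 3)) # 50 + 100 = 150.0 ohms
```

Nesting the two functions mirrors how a circuit diagram is reduced step by step to a single equivalent resistance.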

) were both constant and we brought them from under the integral.

YIELD-PER-RECRUIT (continued). The yield-per-recruit model applies to a cohort, but we saw in the Age Distributions lecture that the properties of a cohort do not apply in general to a collection of cohorts, which is what

Chapter 3 Boundary Value Problem

A boundary value problem (BVP) is a problem, typically an ODE or a PDE, which has values assigned on the physical boundary of the domain in which the problem is specified. Let

Random Walk with Anti-Correlated Steps

John Noga, Dirk Wagner. Abstract: We conjecture the expected value of random walks with anti-correlated steps to be exactly . We support this conjecture with 2 plausibility arguments and

Announcements. Recap: Filtering. Recap: Reasoning Over Time. Example: State Representations for Robot Localization. Particle Filtering

Introduction to Artificial Intelligence V22.0472-001, Fall 2009, Lecture 18: Particle & Kalman Filtering. Announcements: Final exam will be at 7pm on Wednesday December 14th (date of last class), 1.5 hrs long. I won't ask anything

MATH 5720: Gradient Methods Hung Phan, UMass Lowell October 4, 2018

Descent Direction Methods. Consider the problem min { f(x) : x ∈ R^n }. The general descent directions method is x_{k+1} = x_k + t_k d_k, where x_k is the current
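The iteration x_{k+1} = x_k + t_k d_k with the steepest-descent choice d_k = −∇f(x_k) and a fixed step size can be sketched directly; the quadratic test function and step size are illustrative assumptions (real methods pick t_k by line search).

```python
def gradient_descent(grad, x0, step=0.1, iters=200):
    """Descent-direction method with d_k = -grad f(x_k) and a fixed step t_k."""
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# f(x, y) = (x - 3)^2 + 2(y + 1)^2, minimizer (3, -1)
grad = lambda v: [2 * (v[0] - 3), 4 * (v[1] + 1)]
x = gradient_descent(grad, [0.0, 0.0])
print([round(c, 4) for c in x])  # [3.0, -1.0]
```

Since f is strongly convex and the step is small enough, the iterates contract toward the minimizer geometrically.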

Inventory Analysis and Management. Multi-Period Stochastic Models: Optimality of (s, S) Policy for K-Convex Objective Functions

Consider a setting similar to the N-stage newsvendor problem except that now there is a fixed re-ordering cost K (> 0) for each (re-)order.

EECE 301 Signals & Systems Prof. Mark Fowler

Note Set # : What are Continuous-Time Signals??? Continuous-Time Signal. Continuous Time (C-T) Signal: A C-T signal is defined on the continuum of time values. That is:

Two Coupled Oscillators / Normal Modes

Lecture 3, Phys 3750. Overview and Motivation: Today we take a small, but significant, step towards wave motion. We will not yet observe waves, but this step is important in its own

Physics 235 Chapter 2. Chapter 2 Newtonian Mechanics Single Particle

In this Chapter we will review what Newton's laws of mechanics tell us about the motion of a single particle. Newton's laws are only valid in suitable reference frames,

STATE-SPACE MODELLING. A mass balance across the tank gives:

B. Lennox and N.F. Thornhill, 9, State Space Modelling, IChemE Process Management and Control Subject Group Newsletter. Introduction: Over the past decade or so there has been an ever increasing

Retrieval Models. Boolean and Vector Space Retrieval Models. Common Preprocessing Steps. Boolean Model. Boolean Retrieval Model

Many slides in this section are adapted from Prof. Joydeep Ghosh (UT ECE) who in turn adapted them from Prof. Dik Lee (Univ. of Science and Tech, Hong Kong). Retrieval
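The Boolean retrieval model reduces to set operations over an inverted index (term → set of documents). A minimal sketch with made-up toy documents:

```python
def build_index(docs):
    """Inverted index: term -> set of doc ids containing it."""
    index = {}
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index.setdefault(term, set()).add(doc_id)
    return index

def boolean_and(index, *terms):
    """Boolean model: documents matching ALL terms (set intersection)."""
    sets = [index.get(t, set()) for t in terms]
    return set.intersection(*sets) if sets else set()

docs = {1: "neural networks learn weights",
        2: "boolean retrieval uses set operations",
        3: "networks of boolean gates"}
index = build_index(docs)
print(sorted(boolean_and(index, "boolean")))              # [2, 3]
print(sorted(boolean_and(index, "boolean", "networks")))  # [3]
```

OR and NOT map to set union and difference the same way, which is why the Boolean model is fast but gives no ranking, the limitation vector space models address.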

Laplace transform: t-translation rule , Haynes Miller and Jeremy Orloff

Laplace transform: t-translation rule. 18.03, Haynes Miller and Jeremy Orloff. Introductory example. Consider the system ẋ + 3x = f(t), where f is the input and x the response. We know its unit impulse response is 0 for

Embedded Systems and Software. A Simple Introduction to Embedded Control Systems (PID Control)

Embedded Systems and Software, ECE:3360. The University of Iowa, 2016. Slide 1. Acknowledgements: The material in this lecture
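The textbook PID law, u = Kp·e + Ki·∫e dt + Kd·de/dt, can be sketched as one discrete update per sample. The plant (a pure integrator x' = u), gains, and setpoint below are assumed for illustration, not taken from the course slides.

```python
def pid_step(state, error, kp, ki, kd, dt):
    """One update of a discrete PID controller.
    `state` carries (running integral, previous error) between calls."""
    integral, prev_error = state
    integral += error * dt
    derivative = (error - prev_error) / dt
    u = kp * error + ki * integral + kd * derivative
    return u, (integral, error)

# Drive a first-order plant x' = u toward setpoint 1.0.
x, state, dt = 0.0, (0.0, 0.0), 0.01
for _ in range(2000):
    u, state = pid_step(state, 1.0 - x, kp=2.0, ki=0.5, kd=0.05, dt=dt)
    x += u * dt
print(round(x, 3))
```

The proportional term does most of the work, the integral removes steady-state error, and the derivative damps overshoot; after 20 simulated seconds the output sits at the setpoint.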

L07. KALMAN FILTERING FOR NON-LINEAR SYSTEMS. NA568 Mobile Robotics: Methods & Algorithms

NA568 Mobile Robotics: Methods & Algorithms. Today's Topic: Quick review of the (Linear) Kalman Filter; Kalman Filtering for Non-Linear Systems; Extended Kalman Filter (EKF)
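The "quick review" half of that lecture is the linear Kalman filter, which the EKF extends by linearizing the models. A scalar sketch for estimating a constant from noisy measurements; the noise variances and data are assumed for illustration.

```python
import random

def kalman_1d(z_seq, q=1e-4, r=0.04, x0=0.0, p0=1.0):
    """Scalar Kalman filter for the model x_k = x_{k-1} + w, z_k = x_k + v,
    with assumed process variance q and measurement variance r."""
    x, p = x0, p0
    for z in z_seq:
        p += q                      # predict: variance grows by process noise
        k = p / (p + r)             # Kalman gain
        x += k * (z - x)            # update with the measurement residual
        p *= (1 - k)                # posterior variance shrinks
    return x

random.seed(0)
true_value = 1.5
measurements = [true_value + random.gauss(0, 0.2) for _ in range(200)]
print(round(kalman_1d(measurements), 2))
```

The gain k balances trust in the prediction against trust in each measurement; in the nonlinear case the EKF computes the same update using Jacobians of the process and measurement models.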

Lecture Notes 2. The Hilbert Space Approach to Time Series

Time Series. Steven N. Durlauf, University of Wisconsin. Basic ideas: The Hilbert space framework provides a very powerful language for discussing the relationship

Chapter 2. First Order Scalar Equations

We start our study of differential equations in the same way the pioneers in this field did. We show particular techniques to solve particular types of first order differential equations.

Pattern Classification and NNet applications with memristive crossbar circuits. Fabien ALIBART D. Strukov s group, ECE-UCSB Now at IEMN-CNRS, France

Fabien ALIBART, D. Strukov's group, ECE-UCSB; now at IEMN-CNRS, France. Outline: Introduction: Neural Network with memristive devices; Engineering

Two Popular Bayesian Estimators: Particle and Kalman Filters. McGill COMP 765 Sept 14 th, 2017

Recall the Bayes filter recursion: Bel(x_t) = η P(z_t | x_t) ∫ P(x_t | u_t, x_{t-1}) Bel(x_{t-1}) dx_{t-1}, where z = observation, u =

Hamilton-Jacobi Equation: Weak Solution. We continue the study of the Hamilton-Jacobi equation:

Math 527, Fall 09, Lecture, Oct. 4, 09. Hamilton-Jacobi Equation: Weak Solution. We continue the study of the Hamilton-Jacobi equation: We have shown that u_t + H(Du) = 0 in R^n × (0, ∞); u = g on R^n × {t = 0}. In general we cannot

Linear Response Theory: The connection between QFT and experiments

Phys540.nb 39. 3.1. Basic concepts and ideas. Q: How do we measure the conductivity of a metal? A: we first introduce a weak electric field E, and

Sequential Importance Resampling (SIR) Particle Filter

Particle Filters++. Pieter Abbeel, UC Berkeley EECS. Many slides adapted from Thrun, Burgard and Fox, Probabilistic Robotics. 1. Algorithm particle_filter(S_{t-1}, u_t, z_t): 2. Sequential Importance Resampling (SIR) Particle
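One SIR step is: propagate each particle through the motion model, weight by the measurement likelihood, then resample in proportion to the weights. A toy 1-D sketch; the motion model, likelihood shape, and true position are assumed for illustration.

```python
import random

def sir_step(particles, move, weight_fn):
    """One Sequential Importance Resampling step: predict, weight, resample."""
    predicted = [move(p) for p in particles]
    weights = [weight_fn(p) for p in predicted]
    total = sum(weights)
    probs = [w / total for w in weights]
    return random.choices(predicted, weights=probs, k=len(particles))

random.seed(1)
true_pos = 2.0
move = lambda p: p + random.gauss(0.1, 0.2)                 # noisy motion model
likelihood = lambda p: 1.0 / (1e-6 + (p - true_pos) ** 2)   # peaks at the measurement

particles = [random.uniform(-5, 5) for _ in range(500)]
for _ in range(10):
    particles = sir_step(particles, move, likelihood)
print(round(sum(particles) / len(particles), 1))
```

Starting from a uniform cloud over [-5, 5], the resampling step repeatedly concentrates the particle set around the measurement peak.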

Mathcad Lecture #8 In-class Worksheet Curve Fitting and Interpolation

At the end of this lecture, you will be able to: explain the difference between curve fitting and interpolation; decide whether curve fitting or interpolation
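The distinction the worksheet draws can be shown side by side: a least-squares line (curve fitting) smooths over noisy data, while interpolation passes exactly through every point. The data set below is made up for illustration.

```python
def fit_line(xs, ys):
    """Least-squares line y = a + b*x (curve fitting: one smooth model,
    not forced through the data points)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def interp_linear(xs, ys, x):
    """Piecewise-linear interpolation (passes exactly through every point)."""
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    raise ValueError("x outside data range")

xs, ys = [0.0, 1.0, 2.0, 3.0], [0.1, 1.9, 4.1, 5.9]   # roughly y = 2x with noise
a, b = fit_line(xs, ys)
print(round(a, 2), round(b, 2))        # fitted intercept and slope
print(interp_linear(xs, ys, 1.5))      # value between the samples at x=1 and x=2
```

Which to use is the worksheet's decision point: fitting when the data are noisy and a trend is wanted, interpolation when the data are exact and values between samples are needed.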

5. Stochastic processes (1)

Lec05.ppt. S-38.45 Introduction to Teletraffic Theory, Spring 2005. Contents: Basic concepts; Poisson process. Stochastic processes (1): Consider some quantity in a teletraffic (or any) system. It typically evolves in time randomly

A First Course on Kinetics and Reaction Engineering. Class 19 on Unit 18

Part I - Chemical Reactions. Part II - Chemical Reaction Kinetics. Where We're Going: Part III - Chemical Reaction Engineering. A. Ideal Reactors. B. Perfectly

Ground Rules. PC1221 Fundamentals of Physics I. Kinematics. Position. Lectures 3 and 4 Motion in One Dimension. A/Prof Tay Seng Chuan

Ground Rules. PC1221 Fundamentals of Physics I, Lectures 3 and 4: Motion in One Dimension. A/Prof Tay Seng Chuan. 1. Switch off your handphone and pager. Switch off your laptop computer and keep it. No talking while lecture

More Digital Logic. Low-to-high and high-to-low transitions could have different propagation delay t_p.

EECS 4 Spring 23, Lecture 2. More Digital Logic: Gate delay and signal propagation; clocked circuit elements (flip-flop); writing a word to memory; simplifying digital circuits: Karnaugh maps

Supplement for Stochastic Convex Optimization: Faster Local Growth Implies Faster Global Convergence

Yi Xu, Qihang Lin, Tianbao Yang. Proof of Theorem. Theorem: Suppose Assumption holds and F(w) obeys the LGC (6). Given

Phys1112: DC and RC circuits

Name: Group Members: Date: TA's Name: Objectives: 1. To understand current and voltage characteristics of a DC RC discharging circuit. 2. To understand the effect of the RC time constant.

EECE 301 Signals & Systems Prof. Mark Fowler

Note Set #2: What are Continuous-Time Signals??? Reading Assignment: Section . of Kamen and Heck. Course Flow Diagram: The arrows here show conceptual flow between ideas.

Sections 2.2 & 2.3 Limit of a Function and Limit Laws

Math 80, www.imeodare.com. In section . we saw how limits arise when we want to find the tangent to a curve or the velocity of an object. Now we turn our attention to limits in general

Introduction to Numerical Analysis. In this lesson you will be taken through a pair of techniques that will be used to solve the equations of.

Motion. In this lesson you will be taken through a pair of techniques that will be used to solve the equations of motion, v = dx/dt and a = dv/dt = F/m, for situations in which F is well known, and the initial

Recurrent neural networks (Réseaux de neurones récurrents): Handwriting Recognition with Long Short-Term Memory Networks

Recurrent neural networks: Handwriting Recognition with Long Short-Term Memory Networks. Dr. Marcus Eichenberger-Liwicki, DFKI, Germany. Marcus.Liwicki@dfki.de. Handwriting Recognition (State of the Art): Transform

KINEMATICS IN ONE DIMENSION

PREVIEW: Kinematics is the study of how things move: how far (distance and displacement), how fast (speed and velocity), and how fast that how fast changes (acceleration). We say that an object

Experiments on logistic regression

Ning Bao. March, 8. Abstract: In this report, several experiments have been conducted on a spam data set with Logistic Regression based on a Gradient Descent approach. First, the overfitting

Robotics I. April 11, The kinematics of a 3R spatial robot is specified by the Denavit-Hartenberg parameters in Tab. 1.

Robotics I, April 11, 2017. Exercise 1: The kinematics of a 3R spatial robot is specified by the Denavit-Hartenberg parameters in Tab. 1. [Table 1: DH parameters (i, α_i, d_i, a_i, θ_i) of the 3R robot]

Christos Papadimitriou & Luca Trevisan November 22, 2016

U.C. Berkeley CS170: Algorithms, Handout LN-11-22. Christos Papadimitriou & Luca Trevisan, November 22, 2016. Streaming algorithms: In this lecture and the next one we study memory-efficient algorithms that process a stream

5.1 - Logarithms and Their Properties

Chapter 5: Logarithmic Functions. Suppose that a population grows according to the formula P 10, where P is the colony size at time t, in hours. When will the population be 2500? We

Geoffrey E. Hinton. University of Toronto. Technical Report CRG-TR February 22, Abstract

Parameter Estimation for Linear Dynamical Systems. Zoubin Ghahramani, Geoffrey E. Hinton. Department of Computer Science, University of Toronto, 6 King's College Road, Toronto, Canada M5S A4. Email: zoubin@cs.toronto.edu. Technical

Guest Lectures for Dr. MacFarlane s EE3350 Part Deux

Michael Plante, Mon., 08-30-2010. Write name in corner. Point out this is a review, so I will go faster. Remind them to go listen to online lecture about getting an A

Nature Neuroscience: doi: /nn Supplementary Figure 1. Spike-count autocorrelations in time.

Normalized autocorrelation matrices are shown for each area in a dataset. The matrix shows the mean correlation of the spike count in each time bin with the spike

Designing Information Devices and Systems I Spring 2019 Lecture Notes Note 17

EECS 16A Designing Information Devices and Systems I, Spring 2019, Lecture Notes, Note 17. 17.1 Capacitive touchscreen. In the last note, we saw that a capacitor consists of two pieces of conductive material separated by a nonconductive

ODEs II, Lecture 1: Homogeneous Linear Systems - I. Mike Raugh 1. March 8, 2004

ODEs II, Lecture 1: Homogeneous Linear Systems - I. Mike Raugh, March 8, 2004. Introduction: In the first lecture we discussed a system of linear ODEs for modeling the excretion of lead from the human body, saw how to transform

Matrix Versions of Some Refinements of the Arithmetic-Geometric Mean Inequality

Bao Qi Feng and Andrew Tonge. Abstract: We establish matrix versions of refinements due to Alzer [], Cartwright and Field [4], and Mercer [5]

Differential Equations

Math 21 (Fall 29). Differential Equations. Solution #3. 1. Find the particular solution of the following differential equation by variation of parameters: (a) y″ + y = csc t; (b) t²y″ + ty′ − y = ln t, t > 0. Solution: (a) The corresponding

Isolated-word speech recognition using hidden Markov models

Håkon Sandsmark, December 18, 21. Introduction: Speech recognition is a challenging problem on which much work has been done in the last decades. Some of

Welcome Back to Physics 215!

(General Physics I). Thurs. Jan 19th, 2017. Lecture01-2. Last time: Syllabus; Units and dimensional analysis. Today: Displacement, velocity, acceleration graphs. Next time: More acceleration

Speaker Adaptation Techniques For Continuous Speech Using Medium and Small Adaptation Data Sets. Constantinos Boulis

Constantinos Boulis. Outline of the Presentation: Introduction to the speaker adaptation problem; Maximum Likelihood Stochastic Transformations

Class Meeting # 10: Introduction to the Wave Equation

MATH 18.152 COURSE NOTES - CLASS MEETING #10. Introduction to PDEs, Fall. Professor: Jared Speck. Class Meeting #10: Introduction to the Wave Equation. What is the wave equation? The standard wave equation for a function

RANDOM LAGRANGE MULTIPLIERS AND TRANSVERSALITY

ECO 504, Spring 2006, Chris Sims. 1. INTRODUCTION. Lagrange multiplier methods are standard fare in elementary calculus courses, and they play a central role in economic

Solutions from Chapter 9.1 and 9.2

Section 9.1, Problem #1: This basically boils down to an exercise in the chain rule from calculus. We are looking for solutions of the form u(t, x) = f(k · x − ct), where x, k ∈ R³ and k is

References appear in the last slide. Last update: (1393/08/19)

SYSTEM IDENTIFICATION. Ali Karimpour, Associate Professor, Ferdowsi University of Mashhad. References appear in the last slide. Last update: 1393/08/19. Lecture 5: Parameter Estimation Methods. Topics to be

Longest Common Prefixes

Longest Common Prefixes Longes Common Prefixes The sandard ordering for srings is he lexicographical order. I is induced by an order over he alphabe. We will use he same symbols (,

More information
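The longest-common-prefix function these notes build on is a few lines; the word list below is an illustrative assumption, chosen to show that in lexicographically sorted order each string shares its longest prefix with a neighbor.

```python
def lcp(a, b):
    """Length of the longest common prefix of two strings."""
    n = 0
    for x, y in zip(a, b):
        if x != y:
            break
        n += 1
    return n

words = sorted(["banana", "band", "bandana", "can"])  # lexicographical order
print([lcp(a, b) for a, b in zip(words, words[1:])])  # [3, 4, 0]
```

This adjacency property is what makes the LCP array (computed over a sorted suffix array) so useful for string indexing.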

Fishing limits and the Logistic Equation. 1

1. The Logistic Equation. The logistic equation is an equation governing population growth for populations in an environment with a limited amount of resources (for instance,

δ(t − t₀) = 0 if t ≠ t₀. It must satisfy the identity ∫ δ(t) dt = 1. So, bulkiness of the unit impulse (hyper)function is equal to 1. The defining characteristic is

UNIT IMPULSE RESPONSE, UNIT STEP RESPONSE, STABILITY. The unit impulse function (Dirac delta function, delta function), rigorously defined, is not strictly a function but a distribution (or measure); precise treatment requires