Neural Networks. Understanding the Brain


Threshold units. Gradient descent. Multilayer networks. Backpropagation. Hidden layer representations. Example: Face Recognition. Advanced topics.

Neural Networks
Networks of processing units (neurons) with connections (synapses) between them. Large number of neurons: ~10^10. Large connectivity: ~10^5. Parallel processing. Distributed computation/memory. Robust to noise, failures. And, more. Blue slides: from Mitchell. Turquoise slides: from Alpaydin. (Lecture Notes for E. Alpaydın, 2010, Introduction to Machine Learning 2e, The MIT Press, V1.0)

Understanding the Brain
Levels of analysis (Marr, 1982): 1. Computational theory. 2. Representation and algorithm. 3. Hardware implementation. Reverse engineering: from hardware to theory. Parallel processing: SIMD vs. MIMD. Neural net: SIMD with modifiable local memory. Learning: update by training/experience.

Biological Neurons and Networks
Neuron switching time ~0.001 second (1 ms). Number of neurons ~10^10. Connections per neuron ~10^4 to 10^5. Scene recognition time ~0.1 second (100 ms). 100 processing steps doesn't seem like enough → much parallel computation.

Artificial Neural Networks
Many neuron-like threshold switching units (real-valued). Many weighted interconnections among units. Highly parallel, distributed process. Emphasis on tuning weights automatically: new learning algorithms, new optimization techniques, new learning principles. [Figure: a layered network with input, hidden, and output units; w_ji connects input unit i to hidden unit j, w_kj connects hidden unit j to output unit k.]

Biologically Motivated (or Accurate) Neural Networks
Spiking neurons. Complex morphological models. Detailed dynamical models. Connectivity either based on or trained to mimic biology. Focus on modeling network/neural/subneural processes. Focus on natural principles of neural computation. Different forms of learning: spike-timing-dependent plasticity, covariance learning, short-term and long-term plasticity, etc.

When to Consider Neural Networks
Input is high-dimensional discrete or real-valued (e.g., raw sensor input). Output is discrete or real-valued. Output is a vector of values. Possibly noisy data. Long training time (may need occasional, extensive retraining). Form of target function is unknown. Fast evaluation of the learned target function. Human readability of the result is unimportant.

Example Applications (more later)
ALVINN: driving a car on the highway. [Figure: the ALVINN network with a 30x32 sensor input retina, 4 hidden units, and 30 output units spanning Sharp Left / Straight Ahead / Sharp Right.] Speech synthesis. Handwritten character recognition (from http://yann.lecun.com). Financial prediction, transaction fraud detection (big issue lately).

Perceptrons
[Figure: a perceptron unit: inputs x_1, ..., x_n with weights w_1, ..., w_n, plus a constant input x_0 = 1 with weight w_0; the unit sums the weighted inputs and thresholds the result.]

    o(x_1, ..., x_n) = 1 if w_0 + w_1 x_1 + ... + w_n x_n > 0, and -1 otherwise.

Sometimes we'll use simpler vector notation:

    o(x) = 1 if w · x > 0, and -1 otherwise.

Hypothesis Space of Perceptrons
The tunable parameters are the weights w_0, w_1, ..., w_n, so the space H of candidate hypotheses is the set of all possible combinations of real-valued weight vectors:

    H = { w | w ∈ R^(n+1) }

Boolean Logic Gates with Perceptron Units
Perceptron units can implement the basic boolean functions AND, OR, and NOT with suitable weights and thresholds (inputs in {0, 1}; figure after Russell & Norvig). Thus, a network of perceptron units can compute any Boolean function. What about XOR or EQUIV?

What Perceptrons Can Represent
Perceptrons can only represent linearly separable functions. Output of a two-input perceptron: if W0 I0 + W1 I1 > θ, then the output is 1; if W0 I0 + W1 I1 ≤ θ, then the output is -1. The hypothesis space is a collection of separating lines. [Figure: a line with slope -W0/W1 separating the output-1 region from the output-(-1) region.]
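To make the threshold unit concrete, here is a minimal Python sketch (not from the slides; the gate weights are illustrative choices, since the figure's values were lost in transcription):

    # A threshold (perceptron) unit: output 1 if w . x + w0 > 0, else -1.
    def perceptron(x, w, w0):
        s = w0 + sum(wi * xi for wi, xi in zip(w, x))
        return 1 if s > 0 else -1

    # Example gate weights (one consistent choice among many):
    AND = lambda x: perceptron(x, w=[1.0, 1.0], w0=-1.5)  # fires only for (1,1)
    OR  = lambda x: perceptron(x, w=[1.0, 1.0], w0=-0.5)  # fires unless (0,0)
    NOT = lambda x: perceptron(x, w=[-1.0],     w0=0.5)   # inverts a 0/1 input

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, AND([a, b]), OR([a, b]))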

Geometric Interpretation
Rearranging "W0 I0 + W1 I1 > θ, then output is 1", we get (if W1 > 0):

    I1 > -(W0/W1) I0 + θ/W1

Points above this line give output 1, and -1 for those below the line. Compare with y = m x + b: the separating line has slope -W0/W1 and intercept θ/W1.

The Role of the Bias
Without the bias (θ), learning is limited to adjustment of the slope of a separating line passing through the origin. [Figure: three example lines with different weights, all passing through the origin.]

Limitation of Perceptrons
Only functions where the -1 points and the 1 points are clearly separable can be represented by perceptrons. The geometric interpretation is generalizable to functions of n arguments, i.e., a perceptron with n inputs plus one threshold (or bias) unit.

Generalizing to n dimensions
[Figure: a plane in 3-D with normal vector n = [a b c]^T through the point x_0 = (x_0, y_0, z_0); see http://mathworld.wolfram.com/Plane.html]
With n = (a, b, c), x = (x, y, z), and x_0 = (x_0, y_0, z_0), the equation of a plane is n · (x - x_0) = 0. In short, a x + b y + c z + d = 0, where a, b, c can serve as the weights and d = -n · x_0 as the bias. For an n-dimensional input space, the decision boundary becomes an (n-1)-D hyperplane (one dimension less than the input space).
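As a small worked example of this geometry (my own illustration of the slide's rearrangement), the separating line can be read off directly from the weights:

    # Decision rule W0*I0 + W1*I1 > theta: the boundary is the line
    # I1 = -(W0/W1)*I0 + theta/W1 (assuming W1 > 0).
    def boundary_line(W0, W1, theta):
        assert W1 != 0, "W1 = 0 gives a vertical boundary"
        return -W0 / W1, theta / W1   # (slope, intercept)

    # AND-like unit with W0 = W1 = 1, theta = 1.5: line I1 = -I0 + 1.5;
    # only the input point (1, 1) lies above it.
    print(boundary_line(1.0, 1.0, 1.5))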

Linear Separability
For functions that take integer or real values as arguments and output either -1 or 1. Left: linearly separable (i.e., we can draw a straight line between the classes). Right: not linearly separable (i.e., perceptrons cannot represent such a function).

Linear Separability (cont'd)
AND and OR are linearly separable, but XOR is not: perceptrons cannot represent XOR! Minsky and Papert (1969).

XOR in Detail
Truth table (inputs I0, I1 in {0, 1}):

    #  I0  I1  XOR
    1   0   0   -1
    2   0   1    1
    3   1   0    1
    4   1   1   -1

If W0 I0 + W1 I1 > θ, then the output is 1, so:
1. 0 ≤ θ, i.e., θ ≥ 0
2. W1 > θ
3. W0 > θ
4. W0 + W1 ≤ θ
Adding 2 and 3 and combining with 4 gives 2θ < W0 + W1 ≤ θ, so θ < 0 (from 2, 3, and 4); but θ ≥ 0 (from 1): a contradiction.

Learning: Perceptron Rule
[Figure: the perceptron unit from the earlier slide.]
The weights do not have to be calculated manually. We can train the network with (input, output) pairs according to the following weight update rule:

    w_i ← w_i + η (t - o) x_i

where t is the target output, o is the perceptron output, and η is the learning rate parameter. Proven to converge if the input set is linearly separable and η is sufficiently small.
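A minimal Python sketch of the perceptron rule (the data set, η, and epoch count are my own illustrative choices); it converges here because OR is linearly separable:

    def train_perceptron(examples, eta=0.1, epochs=50):
        n = len(examples[0][0])
        w = [0.0] * (n + 1)             # w[0] is the bias weight (input x0 = 1)
        for _ in range(epochs):
            for x, t in examples:
                xs = [1.0] + list(x)    # prepend the constant bias input
                o = 1 if sum(wi * xi for wi, xi in zip(w, xs)) > 0 else -1
                for i in range(n + 1):  # w_i <- w_i + eta * (t - o) * x_i
                    w[i] += eta * (t - o) * xs[i]
        return w

    OR_data = [((0, 0), -1), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
    print(train_perceptron(OR_data))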

Learning in Perceptrons (cont'd)

    w_i ← w_i + η (t - o) x_i

When t = o, the weight stays. When t = 1 and o = -1, the change in weight is η(1 - (-1)) x_i > 0 if the x_i are all positive. Thus w · x will increase, and eventually the output o will turn to 1. When t = -1 and o = 1, the change in weight is η(-1 - 1) x_i < 0 if the x_i are all positive. Thus w · x will decrease, and eventually the output o will turn to -1.

Learning in Perceptrons: Another Look
[Figure: a two-input perceptron with weight vector w = (a, b), drawn next to the separating line it defines in the x-y plane.]
The perceptron on the left can be represented as a line shown on the right (why? see the "Generalizing to n dimensions" slide). Learning can be thought of as adjustment of w, turning it toward the input vector x: w ← w + η (t - o) x. Adjustment of the bias moves the line closer to or away from the origin.

Another Learning Rule: Delta Rule
The perceptron rule cannot deal with noisy data. The delta rule will find an approximate solution even when the input set is not linearly separable. Use a linear unit without the step function: o(x) = w · x. We want to reduce the error by adjusting w, i.e., minimize

    E(w) = 1/2 Σ_{d∈D} (t_d - o_d)^2

Gradient Descent
[Figure: the error surface E(w0, w1), a bowl over the weight plane.]
Note: the error surface is defined by the training data D. A different data set will give a different surface. E(w0, w1) is the error function above, and we want to change (w0, w1) to a position under a low E.
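To make the error surface concrete, a short sketch (with toy data of my own choosing) that evaluates E(w) for a linear unit; each weight vector gets one height on the surface:

    # E(w) = 1/2 * sum_d (t_d - o_d)^2 for a linear unit o = w . x.
    def E(w, D):
        total = 0.0
        for x, t in D:
            o = sum(wi * xi for wi, xi in zip(w, x))
            total += (t - o) ** 2
        return 0.5 * total

    # Two training examples; x[0] = 1 is the constant bias input.
    D = [((1.0, 0.0), -1.0), ((1.0, 1.0), 1.0)]
    print(E([0.0, 0.0], D))   # some height on the surface
    print(E([-1.0, 2.0], D))  # this w fits D exactly: E = 0

A different data set D would indeed give a different surface.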

Gradient Descent (cont'd)
Training rule:

    ∇E[w] = [∂E/∂w_0, ∂E/∂w_1, ..., ∂E/∂w_n]
    Δw = -η ∇E[w],  i.e.,  Δw_i = -η ∂E/∂w_i

The gradient points in the maximum increasing direction; the gradient is perpendicular to the level curve (uphill direction).

Gradient Descent (Example)
[Figure: level curves of E(w0, w1) with the gradient line; ∇E = (∂E/∂w0, ∂E/∂w1), a vector on a 2-D plane.]

Gradient Descent (cont'd)

    ∂E/∂w_i = ∂/∂w_i 1/2 Σ_d (t_d - o_d)^2
            = 1/2 Σ_d 2 (t_d - o_d) ∂/∂w_i (t_d - o_d)
            = Σ_d (t_d - o_d) ∂/∂w_i (t_d - w · x_d)
            = Σ_d (t_d - o_d) (-x_{i,d})

Gradient Descent: Summary
GRADIENT-DESCENT(training_examples, η)
Each training example is a pair <x, t>, where x is the vector of input values and t is the target output value; η is the learning rate (e.g., 0.05).
- Initialize each w_i to some small random value.
- Until the termination condition is met, Do
  - Initialize each Δw_i to zero.
  - For each <x, t> in training_examples, Do
    - Input the instance x to the unit and compute the output o.
    - For each linear unit weight w_i, Do: Δw_i ← Δw_i + η (t - o) x_i
  - For each linear unit weight w_i, Do: w_i ← w_i + Δw_i

Since we want Δw = -η ∇E, this gives Δw_i = η Σ_d (t_d - o_d) x_{i,d}.
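The pseudocode above translates almost line by line into Python; a sketch with made-up 1-D data (η and the epoch count are arbitrary small values):

    def gradient_descent(examples, eta=0.05, epochs=500):
        n = len(examples[0][0])
        w = [0.0] * n
        for _ in range(epochs):
            dw = [0.0] * n                        # each Delta-w_i starts at zero
            for x, t in examples:
                o = sum(wi * xi for wi, xi in zip(w, x))
                for i in range(n):
                    dw[i] += eta * (t - o) * x[i] # accumulate over all of D
            for i in range(n):
                w[i] += dw[i]                     # one update per pass through D
        return w

    # Targets follow t = 2*x - 1; x[0] = 1 is the constant bias input.
    data = [((1.0, x), 2.0 * x - 1.0) for x in (0.0, 0.5, 1.0, 1.5)]
    print(gradient_descent(data))   # approaches w = (-1, 2)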

Gradient Descent Properties
Gradient descent is effective in searching through a large or infinite H: H contains continuously parameterized hypotheses, and the error can be differentiated w.r.t. the parameters. Limitations: convergence can be slow, and it finds local minima (the global minimum is not guaranteed).

Stochastic Approximation to Gradient Descent
Avoiding local minima: incremental gradient descent, or stochastic gradient descent. Instead of a weight update based on all inputs in D, immediately update the weights after each input example:

    Δw_i = η (t - o) x_i    instead of    Δw_i = η Σ_{d∈D} (t_d - o_d) x_{i,d}

This can be seen as minimizing the per-example error function E_d(w) = 1/2 (t_d - o_d)^2.

Standard and Stochastic Gradient Descent: Differences
In the standard version, the error is defined over the entire D. In the standard version, more computation is needed per weight update, but η can be larger. The stochastic version can sometimes avoid local minima.

Summary
The perceptron training rule is guaranteed to succeed if the training examples are linearly separable and the learning rate η is sufficiently small. The linear unit training rule using gradient descent converges asymptotically to the hypothesis with minimum squared error, given a sufficiently small learning rate η, even when the training data contains noise, and even when the training data is not separable by H.
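The stochastic version changes only where the update happens: inside the example loop rather than after it (same toy data as the batch sketch above):

    def stochastic_gradient_descent(examples, eta=0.05, epochs=500):
        n = len(examples[0][0])
        w = [0.0] * n
        for _ in range(epochs):
            for x, t in examples:          # one weight update per example
                o = sum(wi * xi for wi, xi in zip(w, x))
                for i in range(n):
                    w[i] += eta * (t - o) * x[i]
        return w

    data = [((1.0, x), 2.0 * x - 1.0) for x in (0.0, 0.5, 1.0, 1.5)]
    print(stochastic_gradient_descent(data))   # also approaches w = (-1, 2)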

Exercise: Implementing the Perceptron
It is fairly easy to implement a perceptron. You can implement it in any programming language: C/C++, etc. Look for examples on the web, and JAVA applet demos.

Multilayer Networks
Differentiable threshold unit: the sigmoid

    σ(y) = 1 / (1 + exp(-y)),   net = Σ_i w_i x_i,   o = σ(net) = σ(w · x)

Interesting property:

    dσ(y)/dy = σ(y)(1 - σ(y))

Other functions: tanh(y) = (1 - exp(-2y)) / (1 + exp(-2y)).

Multilayer Networks and Backpropagation
[Figure: a multilayer network for phoneme discrimination ("head", "hid", "who'd", "hood") with formant inputs F1, F2, and a plot of the hidden unit activations.]
Nonlinear decision surfaces. [Figure: (a) the decision surface of a single sigmoid output unit; (b) a network with two hidden units and one output unit, producing an XOR-like nonlinear surface. Another example: XOR.]

Error Gradient for a Sigmoid Unit

    ∂E/∂w_i = ∂/∂w_i 1/2 Σ_{d∈D} (t_d - o_d)^2
            = 1/2 Σ_d 2 (t_d - o_d) ∂/∂w_i (t_d - o_d)
            = Σ_d (t_d - o_d) ( -∂o_d/∂w_i )
            = -Σ_d (t_d - o_d) (∂o_d/∂net_d) (∂net_d/∂w_i)
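A quick sketch of the sigmoid defined above, plus a numerical check of the property dσ/dy = σ(y)(1 - σ(y)) (the test point 0.3 is arbitrary):

    import math

    def sigmoid(y):
        return 1.0 / (1.0 + math.exp(-y))

    def sigmoid_deriv(y):              # uses the identity s * (1 - s)
        s = sigmoid(y)
        return s * (1.0 - s)

    eps = 1e-6                         # central finite difference
    print(sigmoid_deriv(0.3))
    print((sigmoid(0.3 + eps) - sigmoid(0.3 - eps)) / (2 * eps))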

Error Gradient for a Sigmoid Unit (cont'd)
From the previous page:

    ∂E/∂w_i = -Σ_d (t_d - o_d) (∂o_d/∂net_d) (∂net_d/∂w_i)

But we know:

    ∂o_d/∂net_d = ∂σ(net_d)/∂net_d = o_d (1 - o_d)
    ∂net_d/∂w_i = ∂(w · x_d)/∂w_i = x_{i,d}

So:

    ∂E/∂w_i = -Σ_{d∈D} (t_d - o_d) o_d (1 - o_d) x_{i,d}

Backpropagation Algorithm
Initialize all weights to small random numbers. Until satisfied, Do: For each training example, Do
1. Input the training example to the network and compute the network outputs.
2. For each output unit k: δ_k ← o_k (1 - o_k)(t_k - o_k)
3. For each hidden unit h: δ_h ← o_h (1 - o_h) Σ_{k∈outputs} w_kh δ_k
4. Update each network weight w_ji: w_ji ← w_ji + Δw_ji, where Δw_ji = η δ_j x_ji.
Note: w_ji is the weight from i to j (i.e., w_{j←i}).

The δ Term
For an output unit: δ_k = o_k (1 - o_k) · (t_k - o_k), i.e., σ'(net_k) times the error.
For a hidden unit: δ_h = o_h (1 - o_h) · Σ_{k∈outputs} w_kh δ_k, i.e., σ'(net_h) times the backpropagated error.
In sum, δ is the derivative times the error. Derivation to be presented later.

Derivation of Δw
We want to update the weight as

    Δw_ji = -η ∂E_d/∂w_ji

where the error on example d is defined as

    E_d(w) = 1/2 Σ_{k∈outputs} (t_k - o_k)^2

Given net_j = Σ_i w_ji x_ji, the formula differs for output and hidden units.
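A compact sketch of the algorithm for one hidden layer, shown on XOR with 0/1 targets (layer sizes, η, epoch count, and the random seed are illustrative; with only two hidden units a different seed may be needed if training lands in a local minimum):

    import numpy as np

    def sigmoid(y):
        return 1.0 / (1.0 + np.exp(-y))

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    eta = 0.5
    W_h = rng.normal(0, 0.5, (3, 2))   # 2 inputs + bias -> 2 hidden units
    W_o = rng.normal(0, 0.5, (3, 1))   # 2 hidden + bias -> 1 output unit

    for epoch in range(20000):
        for x, t in zip(X, T):
            xb = np.append(x, 1.0)                    # 1. forward pass (bias input 1)
            o_h = sigmoid(xb @ W_h)
            hb = np.append(o_h, 1.0)
            o_k = sigmoid(hb @ W_o)
            d_k = o_k * (1 - o_k) * (t - o_k)         # 2. output deltas
            d_h = o_h * (1 - o_h) * (W_o[:-1] @ d_k)  # 3. hidden deltas
            W_o += eta * np.outer(hb, d_k)            # 4. w_ji += eta * delta_j * x_ji
            W_h += eta * np.outer(xb, d_h)

    for x in X:
        hb = np.append(sigmoid(np.append(x, 1.0) @ W_h), 1.0)
        print(x, sigmoid(hb @ W_o))                   # should approach 0, 1, 1, 0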

Derivation of Δw: Output Unit Weights
Since w_ji influences the error only through net_j,

    ∂E_d/∂w_ji = (∂E_d/∂net_j)(∂net_j/∂w_ji)

First, calculate ∂E_d/∂o_j (only the k = j term of the sum depends on o_j):

    ∂E_d/∂o_j = ∂/∂o_j 1/2 Σ_{k∈outputs} (t_k - o_k)^2 = -(t_j - o_j)

Next, calculate ∂o_j/∂net_j: since o_j = σ(net_j) and σ'(net_j) = o_j (1 - o_j),

    ∂o_j/∂net_j = o_j (1 - o_j)

Putting everything together:

    ∂E_d/∂net_j = -(t_j - o_j) o_j (1 - o_j)

Since net_j = Σ_k w_jk x_jk, we have ∂net_j/∂w_ji = x_ji, so

    Δw_ji = -η ∂E_d/∂w_ji = η (t_j - o_j) o_j (1 - o_j) x_ji = η δ_j x_ji

where δ_j = (t_j - o_j) o_j (1 - o_j) is the error times σ'(net), and x_ji is the input.

Derivation of Δw: Hidden Unit Weights
Start with ∂E_d/∂w_ji = (∂E_d/∂net_j) x_ji:

    ∂E_d/∂net_j = Σ_{k∈Downstream(j)} (∂E_d/∂net_k)(∂net_k/∂net_j)
                = Σ_{k∈Downstream(j)} (-δ_k)(∂net_k/∂net_j)
                = Σ_{k∈Downstream(j)} (-δ_k)(∂net_k/∂o_j)(∂o_j/∂net_j)
                = Σ_{k∈Downstream(j)} (-δ_k) w_kj o_j (1 - o_j)

where o_j (1 - o_j) is again σ'(net).
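The derivation can be sanity-checked numerically: for a single sigmoid output unit, ∂E_d/∂w_i should equal -(t - o) o (1 - o) x_i (the weights, input, and target below are arbitrary):

    import math

    def out(w, x):      # o = sigma(w . x)
        return 1.0 / (1.0 + math.exp(-sum(wi * xi for wi, xi in zip(w, x))))

    def err(w, x, t):   # E_d = 1/2 * (t - o)^2
        return 0.5 * (t - out(w, x)) ** 2

    w, x, t, i, eps = [0.2, -0.4], [1.0, 0.7], 1.0, 1, 1e-6

    o = out(w, x)
    analytic = -(t - o) * o * (1 - o) * x[i]

    w_hi = list(w); w_hi[i] += eps   # central finite difference in w_i
    w_lo = list(w); w_lo[i] -= eps
    numeric = (err(w_hi, x, t) - err(w_lo, x, t)) / (2 * eps)

    print(analytic, numeric)         # agree to ~6 decimal places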

Derivation of Δw: Hidden Unit Weights (cont'd)
Finally, given

    ∂E_d/∂w_ji = (∂E_d/∂net_j) x_ji   and   ∂E_d/∂net_j = -o_j (1 - o_j) Σ_{k∈Downstream(j)} δ_k w_kj,

we get

    Δw_ji = -η ∂E_d/∂w_ji = η [ o_j (1 - o_j) Σ_{k∈Downstream(j)} δ_k w_kj ] x_ji = η δ_j x_ji

where δ_j is again σ'(net) times the (backpropagated) error.

Extension to Different Network Topologies
[Figure: a layered network; w_ji connects unit i to unit j, w_kj connects unit j to output unit k.]
Arbitrary number of layers: for neurons in layer m,

    δ_r = o_r (1 - o_r) Σ_{s∈layer m+1} w_sr δ_s

Arbitrary acyclic graph:

    δ_r = o_r (1 - o_r) Σ_{s∈Downstream(r)} w_sr δ_s

Backpropagation: Properties
Gradient descent over the entire network weight vector. Easily generalized to arbitrary directed graphs. Will find a local, not necessarily global, error minimum: in practice it often works well (one can run it multiple times with different initial weights). Often includes a weight momentum α:

    Δw_ji(n) = η δ_j x_ji + α Δw_ji(n-1)

Representational Power of Feedforward Networks
Boolean functions: every boolean function is representable with two layers (the hidden layer size can grow exponentially in the worst case: one hidden unit per input example, then OR them). Continuous functions: every bounded continuous function can be approximated with arbitrarily small error with two layers (output units are linear). Arbitrary functions: with three layers (output units are linear). Minimizes error over the training examples: will it generalize well to subsequent examples? Training can take thousands of iterations: slow! Using the network after training is very fast.
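In code, the momentum term above just blends the fresh gradient step with the remembered previous step (α = 0.9 is a typical illustrative value, not one from the slides):

    # Keep one dw per weight across updates:
    #   dw    = eta * delta_j * x_ji + alpha * dw   (momentum update)
    #   w_ji += dw
    def momentum_step(grad_step, prev_step, alpha=0.9):
        return grad_step + alpha * prev_step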

H-Space Search and Inductive Bias
The H-space is an n-dimensional weight space (when there are n weights). The space is continuous, unlike decision tree or general-to-specific concept learning algorithms. Inductive bias: smooth interpolation between data points.

Learning Hidden Layer Representations
[Figure: the 8-3-8 identity network: eight one-hot input patterns (10000000 through 00000001), three hidden units, and target outputs equal to the inputs.]

Learned Hidden Layer Representations
[Table: for each input pattern, the learned hidden unit values; each one-hot input maps to a distinct pattern of three activations.]
The learned encoding is similar to the standard 3-bit binary code. Automatic discovery of useful hidden layer representations is a key feature of ANNs. Note: the hidden layer representation is compressed.
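A self-contained sketch of the 8-3-8 experiment (the sizes follow the slide; η, epochs, and the seed are my own choices), printing the learned 3-unit code for each one-hot input:

    import numpy as np

    def sigmoid(y):
        return 1.0 / (1.0 + np.exp(-y))

    rng = np.random.default_rng(1)
    X = np.eye(8)                       # inputs = targets = one-hot patterns
    W_h = rng.normal(0, 0.5, (9, 3))    # 8 inputs + bias -> 3 hidden units
    W_o = rng.normal(0, 0.5, (4, 8))    # 3 hidden + bias -> 8 outputs

    for epoch in range(20000):
        for x in X:                     # same backprop steps as before
            xb = np.append(x, 1.0)
            o_h = sigmoid(xb @ W_h)
            hb = np.append(o_h, 1.0)
            o_k = sigmoid(hb @ W_o)
            d_k = o_k * (1 - o_k) * (x - o_k)
            d_h = o_h * (1 - o_h) * (W_o[:-1] @ d_k)
            W_o += 0.3 * np.outer(hb, d_k)
            W_h += 0.3 * np.outer(xb, d_h)

    for x in X:                         # the learned (compressed) hidden code
        print(x.astype(int), np.round(sigmoid(np.append(x, 1.0) @ W_h), 2))

The printed codes typically settle into a distinct, binary-like pattern per input.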

Overfitting
[Figure: error versus weight updates in two different robot perception tasks (examples 1 and 2), plotting training set error and validation set error; the validation error eventually rises while the training error keeps falling.]
Early stopping ensures good performance on unobserved samples, but one must be careful. Weight decay, use of validation sets, use of k-fold cross-validation, etc., can overcome the problem.

Alternative Error Functions
Penalize large weights:

    E(w) = 1/2 Σ_{d∈D} Σ_{k∈outputs} (t_kd - o_kd)^2 + γ Σ_{i,j} w_ji^2

Train on target slopes as well as values (when the slope is available):

    E(w) = 1/2 Σ_{d∈D} Σ_{k∈outputs} [ (t_kd - o_kd)^2 + μ Σ_{j∈inputs} ( ∂t_kd/∂x_j^d - ∂o_kd/∂x_j^d )^2 ]

Tie together weights: e.g., in a phoneme recognition network, or handwritten character recognition (weight sharing). A sketch of the weight-decay update follows after this page.

Recurrent Networks
[Figure: a recurrent network with output, hidden layer, and a delay loop feeding activations back as input.]
Sequence recognition. Store a tree structure (next slide). Can be trained with plain backpropagation.

Recurrent Networks (cont'd)
[Figure: a stack network: the (input, stack) pair is autoassociated through the hidden layer, so pushing A, then B, then C yields stack representations (A, B) and then (C, A, B); delay, input, and context units shown.]
Generalization may not be perfect. Autoassociation (input = output). Represent a stack using the hidden layer representation. Accuracy depends on numerical precision.
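In update form, the weight-decay penalty γ Σ w^2 from the Alternative Error Functions slide simply adds a shrinkage term to every weight update (a sketch; the value of γ is illustrative):

    # d/dw_ji of gamma * sum w^2 is 2 * gamma * w_ji, so:
    #   w_ji <- w_ji + eta * (delta_j * x_ji - 2 * gamma * w_ji)
    def decayed_update(w_ji, eta, delta_j, x_ji, gamma=1e-4):
        return w_ji + eta * (delta_j * x_ji - 2.0 * gamma * w_ji)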

Learning Time
Applications: sequence recognition (speech recognition), sequence reproduction (time-series prediction), sequence association. Network architectures: time-delay networks (Waibel et al., 1989) and recurrent networks (Rumelhart et al., 1986).

Time-Delay Neural Networks
[Figure: a time-delay neural network.]

Recurrent Networks: Unfolding in Time
[Figure: a recurrent network and the same network unfolded in time.]
(Lecture Notes for E. Alpaydın, 2010, Introduction to Machine Learning 2e, The MIT Press, V1.0, slides 34-37)

Some Applications: NETtalk
NETtalk: Sejnowski and Rosenberg (1987). Learn to pronounce English text. Demo. Data available in the UCI ML repository. Sample NETtalk data (word, pronunciation, stress/syllable):

    aardvark  a-rdvark  1<<<>2<<
    aback     xb@k-     >1<<
    abaft     xb@f-     >1<<
    abandon   xb@ndxn   >1<><
    abase     xbes-     >1<<
    abash     xb@s-     >1<<
    abate     xbe-      >1<<

About 20,000 words.

Backpropagation Exercise
URL: http://www.cs.tamu.edu/faculty/choe/src/backprop-1.6.tar.gz
Untar and read the README file:

    gzip -dc backprop-1.6.tar.gz | tar xvf -

Run make to build (on departmental unix machines). Run ./bp conf/xor.conf, etc.

Backpropagation: Example Results
[Figure: error versus epochs for OR, AND, and XOR.]
Epoch: one full cycle of training through all training input patterns. OR was easiest, AND the next, and XOR was the most difficult to learn. The network had 2 input, 2 hidden, and 1 output units.

Backpropagation: Example Results (cont'd)
[Figure: network outputs for the inputs (0,0), (0,1), (1,0), and (1,1) over the course of training, one row each, for OR, AND, and XOR.]

Backpropagation: Things to Try
How does increasing the number of hidden layer units affect (1) the time and (2) the number of epochs of training? How does increasing or decreasing the learning rate affect the rate of convergence? How does changing the slope of the sigmoid affect the rate of convergence? Different problem domains: handwriting recognition, etc.

Structured MLP / Weight Sharing
[Figure: a structured, locally connected multilayer perceptron with shared weights (Le Cun et al., 1989).]
(Lecture Notes for E. Alpaydın, 2010, Introduction to Machine Learning 2e, The MIT Press, V1.0, slides 27-28)

Tuning the Network Size
Destructive: weight decay,

    Δw_i = -η ∂E/∂w_i - λ w_i,   equivalently minimizing   E' = E + (λ/2) Σ_i w_i^2

Constructive: growing networks (Ash, 1989; Fahlman and Lebiere, 1989).

Bayesian Learning
Consider the weights w as random variables with a prior p(w):

    p(w|X) = p(X|w) p(w) / p(X)
    w_MAP = argmax_w log p(w|X)
    log p(w|X) = log p(X|w) + log p(w) + C

With a Gaussian prior p(w) = c · exp(-||w||^2 / (2σ^2)), the log prior is a negative multiple of ||w||^2, so maximizing the log posterior is equivalent to minimizing E' = E + λ ||w||^2. Weight decay, ridge regression, regularization: cost = data-misfit + λ · complexity. More about Bayesian methods in chapter 14. (Lecture Notes for E. Alpaydın, 2010, Introduction to Machine Learning 2e, The MIT Press, V1.0, slides 30-31)

Summary
ANN learning provides a general method for learning real-valued functions over continuous or discrete-valued attributes. ANNs are robust to noise. H is the space of all functions parameterized by the weights. H-space search is through gradient descent: convergence to local minima. Backpropagation gives novel hidden layer representations. Overfitting is an issue. More advanced algorithms exist.
