Supervised Learning in Multilayer Networks


44.1 Multilayer perceptrons

No course on neural networks could be complete without a discussion of supervised multilayer networks, also known as backpropagation networks.

The multilayer perceptron is a feedforward network. It has input neurons, hidden neurons and output neurons. The hidden neurons may be arranged in a sequence of layers. The most common multilayer perceptrons have a single hidden layer, and are known as two-layer networks, the number two counting the number of layers of neurons not including the inputs.

Such a feedforward network defines a nonlinear parameterized mapping from an input x to an output y = y(x; w, A). The output is a continuous function of the input and of the parameters w; the architecture of the net, i.e., the functional form of the mapping, is denoted by A. Feedforward networks can be trained to perform regression and classification tasks.

Regression networks

In the case of a regression problem, the mapping for a network with one hidden layer may have the form:

Hidden layer:  a_j^{(1)} = Σ_l w_{jl}^{(1)} x_l + θ_j^{(1)};    h_j = f^{(1)}(a_j^{(1)})    (44.1)

Output layer:  a_i^{(2)} = Σ_j w_{ij}^{(2)} h_j + θ_i^{(2)};    y_i = f^{(2)}(a_i^{(2)})    (44.2)

where, for example, f^{(1)}(a) = tanh(a), and f^{(2)}(a) = a. Here l runs over the inputs x_1, ..., x_L, j runs over the hidden units, and i runs over the outputs. The weights w and biases θ together make up the parameter vector w. The nonlinear sigmoid function f^{(1)} at the hidden layer gives the neural network greater computational flexibility than a standard linear regression model. Graphically, we can represent the neural network as a set of layers of connected neurons (figure 44.1).

Figure 44.1. A typical two-layer network, with six inputs, seven hidden units, and three outputs. Each line represents one weight.

What sorts of functions can these networks implement? Just as we explored the weight space of the single neuron in Chapter 39, examining the functions it could produce, let us explore the weight space of a multilayer network. In figures 44.2 and 44.3 I take a network with one input and one output and a large number H of hidden units, set the biases and weights θ^{(1)}, w^{(1)}, θ^{(2)} and w^{(2)} to random values, and plot the resulting function y(x).

Figure 44.2. Samples from the prior over functions of a one-input network. For each of a sequence of values of σ_bias = 8, 6, 4, 3, 2, 1.6, 1.2, 0.8, 0.4, 0.3, 0.2, with σ_in = 5σ_bias, one random function is shown.
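The following is a minimal sketch of this sampling experiment, assuming numpy: it implements the forward equations (44.1) and (44.2) for a one-input, one-output network with tanh hidden units and draws the weights and biases from zero-mean Gaussians with the standard deviations named in the text. The function names and the particular hyperparameter values are illustrative choices, not taken from the book.

```python
# A sketch of the sampling experiment of figures 44.2-44.3: draw the weights and
# biases of a one-input, one-output two-layer network (equations (44.1)-(44.2),
# tanh hidden units, identity output) from Gaussians with standard deviations
# sigma_bias, sigma_in and sigma_out, then evaluate the resulting function y(x).
# Function names and the particular hyperparameter values are illustrative.
import numpy as np

def sample_random_network(H=400, sigma_bias=4.0, sigma_in=8.0, sigma_out=0.5,
                          rng=np.random.default_rng(0)):
    """Draw one parameter vector w = (W1, theta1, W2, theta2) at random."""
    W1 = rng.normal(0.0, sigma_in, size=H)        # input-to-hidden weights w^(1)
    theta1 = rng.normal(0.0, sigma_bias, size=H)  # hidden biases theta^(1)
    W2 = rng.normal(0.0, sigma_out, size=H)       # hidden-to-output weights w^(2)
    theta2 = rng.normal(0.0, sigma_out)           # output bias theta^(2)
    return W1, theta1, W2, theta2

def y_of_x(x, params):
    """Forward pass: a^(1) = W1*x + theta1, h = tanh(a^(1)); y = W2 . h + theta2."""
    W1, theta1, W2, theta2 = params
    a1 = np.outer(x, W1) + theta1     # shape (N, H)
    h = np.tanh(a1)                   # f^(1) = tanh
    return h @ W2 + theta2            # f^(2) = identity

x = np.linspace(-1.0, 1.0, 200)
y = y_of_x(x, sample_random_network())   # one sample from the prior over functions
```

Plotting y against x for several draws, with larger or smaller standard deviations, reproduces the qualitative behaviour described next: bigger weights and biases give wigglier, more sensitive functions.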

Figure 44.3. Properties of a function produced by a random network. The vertical scale of a typical function produced by the network with random weights is of order Hσ_out; the horizontal range in which the function varies significantly is of order σ_bias/σ_in; and the shortest horizontal length scale is of order 1/σ_in. The function shown was produced by making a random network with H = 400 hidden units, and Gaussian weights with σ_bias = 4, σ_in = 8, and σ_out = 0.5.

I set the hidden units' biases θ_j^{(1)} to random values from a Gaussian with zero mean and standard deviation σ_bias; the input-to-hidden weights w_{jl}^{(1)} to random values with standard deviation σ_in; and the bias and output weights θ^{(2)} and w^{(2)} to random values with standard deviation σ_out.

The sort of functions that we obtain depends on the values of σ_bias, σ_in and σ_out. As the weights and biases are made bigger we obtain more complex functions with more features and a greater sensitivity to the input variable. The vertical scale of a typical function produced by the network with random weights is of order Hσ_out; the horizontal range in which the function varies significantly is of order σ_bias/σ_in; and the shortest horizontal length scale is of order 1/σ_in.

Radford Neal (1996) has also shown that in the limit as H → ∞ the statistical properties of the functions generated by randomizing the weights are independent of the number of hidden units; so, interestingly, the complexity of the functions becomes independent of the number of parameters in the model. What determines the complexity of the typical functions is the characteristic magnitude of the weights. Thus we anticipate that when we fit these models to real data, an important way of controlling the complexity of the fitted function will be to control the characteristic magnitude of the weights.

Figure 44.4 shows one typical function produced by a network with two inputs and one output. This should be contrasted with the function produced by a traditional linear regression model, which is a flat plane. Neural networks can create functions with more complexity than a linear regression.

Figure 44.4. One sample from the prior of a two-input network with {H, σ_in, σ_bias, σ_out} = {400, 8.0, 8.0, 0.5}.

44.2 How a regression network is traditionally trained

This network is trained using a data set D = {x^{(n)}, t^{(n)}} by adjusting w so as to minimize an error function, e.g.,

E_D(w) = (1/2) Σ_n ( t^{(n)} − y(x^{(n)}; w) )².    (44.3)

This objective function is a sum of terms, one for each input/target pair {x, t}, measuring how close the output y(x; w) is to the target t. This minimization is based on repeated evaluation of the gradient of E_D. This gradient can be efficiently computed using the backpropagation algorithm (Rumelhart et al., 1986), which uses the chain rule to find the derivatives.
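As a concrete illustration, here is a minimal sketch, assuming numpy, of one backpropagation gradient evaluation and one gradient-descent step for the single-output network of equations (44.1)-(44.3). The parameter layout, the data shapes and the learning rate eta are illustrative assumptions, not the book's notation.

```python
# A sketch of one backpropagation gradient evaluation and one gradient-descent
# step for the two-layer regression network of equations (44.1)-(44.3), with a
# single (scalar) output and tanh hidden units.  Illustrative assumptions only.
import numpy as np

def ED_and_gradient(w, X, t):
    """E_D(w) = 0.5 * sum_n (t_n - y_n)^2 and its gradient, via the chain rule.
    w = (W1, theta1, W2, theta2); X has shape (N, L); t has shape (N,)."""
    W1, theta1, W2, theta2 = w
    A1 = X @ W1.T + theta1                     # a^(1)_j, shape (N, H)
    H = np.tanh(A1)                            # h_j = tanh(a^(1)_j)
    y = H @ W2 + theta2                        # a^(2) = y (identity output)
    err = y - t                                # dE_D/da^(2)
    ED = 0.5 * np.sum(err ** 2)
    gW2 = H.T @ err                            # dE_D/dw^(2)_j
    gtheta2 = np.sum(err)                      # dE_D/dtheta^(2)
    dA1 = np.outer(err, W2) * (1.0 - H ** 2)   # back through the tanh nonlinearity
    gW1 = dA1.T @ X                            # dE_D/dw^(1)_{jl}
    gtheta1 = dA1.sum(axis=0)                  # dE_D/dtheta^(1)_j
    return ED, (gW1, gtheta1, gW2, gtheta2)

def gradient_descent_step(w, X, t, eta=1e-3):
    ED, g = ED_and_gradient(w, X, t)
    return tuple(p - eta * gp for p, gp in zip(w, g)), ED
```

A training loop would simply call gradient_descent_step repeatedly; more sophisticated optimizers use the same gradient.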

Often, regularization (also known as weight decay) is included, modifying the objective function to:

M(w) = βE_D + αE_W    (44.4)

where, for example, E_W = (1/2) Σ_i w_i². This additional term favours small values of w and decreases the tendency of a model to overfit noise in the training data.

Rumelhart et al. (1986) showed that multilayer perceptrons can be trained, by gradient descent on M(w), to discover solutions to non-trivial problems such as deciding whether an image is symmetric or not. These networks have been successfully applied to real-world tasks as varied as pronouncing English text (Sejnowski and Rosenberg, 1987) and focussing multiple-mirror telescopes (Angel et al., 1990).

44.3 Neural network learning as inference

The neural network learning process above can be given the following probabilistic interpretation. [Here we repeat and generalize the discussion of Chapter 41.] The error function is interpreted as defining a noise model. βE_D is the negative log likelihood:

P(D | w, β, H) = (1/Z_D(β)) exp(−βE_D).    (44.5)

Thus, the use of the sum-squared error E_D (44.3) corresponds to an assumption of Gaussian noise on the target variables, and the parameter β defines a noise level σ_ν² = 1/β.

Similarly the regularizer is interpreted in terms of a log prior probability distribution over the parameters:

P(w | α, H) = (1/Z_W(α)) exp(−αE_W).    (44.6)

If E_W is quadratic as defined above, then the corresponding prior distribution is a Gaussian with variance σ_W² = 1/α.

The probabilistic model H specifies the architecture A of the network, the likelihood (44.5), and the prior (44.6). The objective function M(w) then corresponds to the inference of the parameters w, given the data:

P(w | D, α, β, H) = P(D | w, β, H) P(w | α, H) / P(D | α, β, H)    (44.7)

                  = (1/Z_M) exp(−M(w)).    (44.8)

The w found by (locally) minimizing M(w) is then interpreted as the (locally) most probable parameter vector, w_MP.

The interpretation of M(w) as a log probability adds little new at this stage. But new tools will emerge when we proceed to other inferences. First, though, let us establish the probabilistic interpretation of classification networks, to which the same tools apply.
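A minimal sketch of the regularized objective (44.4), written as a wrapper around any data-error gradient such as the backpropagation sketch above: given E_D and its gradient, it adds the weight-decay term αE_W. The helper name and the default values of α and β are illustrative; under the probabilistic interpretation just given, β = 1/σ_ν² and α = 1/σ_W².

```python
# A sketch of equation (44.4): M(w) = beta*E_D + alpha*E_W, with E_W = 0.5*sum_i w_i^2.
# 'ED' and 'gED' are the data error and its gradient (e.g. from backpropagation);
# 'w' is a tuple of parameter arrays.  Names and default values are illustrative.
import numpy as np

def regularized_objective(ED, gED, w, alpha=0.01, beta=1.0):
    EW = 0.5 * sum(np.sum(np.asarray(p) ** 2) for p in w)       # E_W = 0.5 |w|^2
    M = beta * ED + alpha * EW
    gM = tuple(beta * np.asarray(g) + alpha * np.asarray(p)     # weight-decay term
               for g, p in zip(gED, w))
    return M, gM
```

Gradient descent on M rather than E_D then shrinks the weights towards zero at a rate set by α, which is exactly the weight-decay behaviour described above.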

Binary classification networks

If the targets t in a data set are binary classification labels (0, 1), it is natural to use a neural network whose output y(x; w, A) is bounded between 0 and 1, and is interpreted as a probability P(t=1 | x, w, A). For example, a network with one hidden layer could be described by the feedforward equations (44.1) and (44.2), with f^{(2)}(a) = 1/(1 + e^{−a}).

The error function βE_D is replaced by the negative log likelihood:

G(w) = −Σ_n [ t^{(n)} ln y(x^{(n)}; w) + (1 − t^{(n)}) ln(1 − y(x^{(n)}; w)) ].    (44.9)

The total objective function is then M = G + αE_W. Note that this includes no parameter β (because there is no Gaussian noise).

Multi-class classification networks

For a multi-class classification problem, we can represent the targets by a vector, t, in which a single element is set to 1, indicating the correct class, and all other elements are set to 0. In this case it is appropriate to use a softmax network having coupled outputs which sum to one and are interpreted as class probabilities y_i = P(t_i=1 | x, w, A). The last part of equation (44.2) is replaced by:

y_i = e^{a_i} / Σ_{i'} e^{a_{i'}}.    (44.10)

The negative log likelihood in this case is

G(w) = −Σ_n Σ_i t_i^{(n)} ln y_i(x^{(n)}; w).    (44.11)

As in the case of the regression network, the minimization of the objective function M(w) = G + αE_W corresponds to an inference of the form (44.8). A variety of useful results can be built on this interpretation.

44.4 Benefits of the Bayesian approach to supervised feedforward neural networks

From the statistical perspective, supervised neural networks are nothing more than nonlinear curve-fitting devices. Curve fitting is not a trivial task, however. The effective complexity of an interpolating model is of crucial importance, as illustrated in figure 44.5.

Consider a control parameter that influences the complexity of a model, for example a regularization constant α (weight decay parameter). As the control parameter is varied to increase the complexity of the model (descending from figure 44.5a-c and going from left to right across figure 44.5d), the best fit to the training data that the model can achieve becomes increasingly good. However, the empirical performance of the model, the test error, first decreases then increases again. An over-complex model overfits the data and generalizes poorly. This problem may also complicate the choice of architecture in a multilayer perceptron, the radius of the basis functions in a radial basis function network, and the choice of the input variables themselves in any multidimensional regression problem. Finding values for model control parameters that are appropriate for the data is therefore an important and non-trivial problem.
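The classification objectives (44.9)-(44.11) above are straightforward to compute. A minimal sketch, assuming numpy: the sigmoid output with its cross-entropy G(w), and the softmax outputs with the multi-class G(w). The arguments a and A stand for the output-layer activations a^{(2)} of equation (44.2); the numerical-stability shift inside the softmax and the function names are illustrative choices.

```python
# A sketch of the classification objectives: the sigmoid output with the
# cross-entropy G(w) of equation (44.9), and the softmax outputs (44.10) with
# the multi-class G(w) of (44.11).  'a' and 'A' are output-layer activations
# a^(2) from equation (44.2); the stability shift in softmax is an added detail.
import numpy as np

def G_binary(a, t):
    """G = -sum_n [ t ln y + (1 - t) ln(1 - y) ],  with y = 1/(1 + exp(-a))."""
    y = 1.0 / (1.0 + np.exp(-a))
    return -np.sum(t * np.log(y) + (1 - t) * np.log(1 - y))

def softmax(A):
    """y_i = exp(a_i) / sum_i' exp(a_i'), applied row-wise."""
    A = A - A.max(axis=1, keepdims=True)      # subtract row max for numerical stability
    expA = np.exp(A)
    return expA / expA.sum(axis=1, keepdims=True)

def G_multiclass(A, T):
    """G = -sum_n sum_i t_i^(n) ln y_i(x^(n));  T is a one-hot target matrix."""
    return -np.sum(T * np.log(softmax(A)))
```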

Figure 44.5. Optimization of model complexity. Panels (a-c) show a radial basis function model interpolating a simple data set with one input variable and one output variable. As the regularization constant is varied to increase the complexity of the model (from (a) to (c)), the interpolant is able to fit the training data increasingly well, but beyond a certain point the generalization ability (test error) of the model deteriorates. Panel (d) shows the training error and test error as a function of the model control parameters; panel (e) shows the log probability of the training data given the control parameters. Probability theory allows us to optimize the control parameters without needing a test set.

The overfitting problem can be solved by using a Bayesian approach to control model complexity. If we give a probabilistic interpretation to the model, then we can evaluate the evidence for alternative values of the control parameters. As was explained in Chapter 28, over-complex models turn out to be less probable, and the evidence P(Data | Control Parameters) can be used as an objective function for optimization of model control parameters (figure 44.5e). The setting of α that maximizes the evidence is displayed in figure 44.5b.

Bayesian optimization of model control parameters has four important advantages. (1) No test set or validation set is involved, so all available training data can be devoted to both model fitting and model comparison. (2) Regularization constants can be optimized on-line, i.e., simultaneously with the optimization of ordinary model parameters. (3) The Bayesian objective function is not noisy, in contrast to a cross-validation measure. (4) The gradient of the evidence with respect to the control parameters can be evaluated, making it possible to simultaneously optimize a large number of control parameters.

Probabilistic modelling also handles uncertainty in a natural manner. It offers a unique prescription, marginalization, for incorporating uncertainty about parameters into predictions; this procedure yields better predictions, as we saw in Chapter 41. Figure 44.6 shows error bars on the predictions of a trained neural network.

Figure 44.6. Error bars on the predictions of a trained regression network. The solid line gives the predictions of the best-fit parameters of a multilayer perceptron trained on the data points. The error bars (dotted lines) are those produced by the uncertainty of the parameters w. Notice that the error bars become larger where the data are sparse.

Implementation of Bayesian inference

As was mentioned in Chapter 41, Bayesian inference for multilayer networks may be implemented by Monte Carlo sampling, or by deterministic methods employing Gaussian approximations (Neal, 1996; MacKay, 1992c).
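One way to picture the Gaussian-approximation route is the standard Laplace-approximation error bar: if A denotes the Hessian of M(w) at w_MP, the predictive variance at an input x is roughly g^T A^{-1} g + 1/β, where g = ∂y/∂w evaluated at w_MP. The sketch below, which uses a crude finite-difference gradient, a flattened parameter vector and numpy, illustrates that formula only; it is not the book's implementation.

```python
# A sketch of Laplace-approximation error bars (cf. figure 44.6): given the
# Hessian A of M(w) at w_MP, the predictive standard deviation at x is roughly
# sqrt( g^T A^{-1} g + 1/beta ), with g = dy/dw at w_MP.  The finite-difference
# gradient, the flat parameter vector and the helper name are illustrative.
import numpy as np

def predictive_error_bar(y_fn, w_MP, A, x, beta=1.0, eps=1e-5):
    """y_fn(x, w) -> scalar prediction; w_MP: flat parameter vector; A: Hessian of M."""
    g = np.zeros_like(w_MP)
    for i in range(w_MP.size):                # crude finite-difference dy/dw_i
        dw = np.zeros_like(w_MP)
        dw[i] = eps
        g[i] = (y_fn(x, w_MP + dw) - y_fn(x, w_MP - dw)) / (2 * eps)
    variance = g @ np.linalg.solve(A, g) + 1.0 / beta   # parameter + noise terms
    return np.sqrt(variance)
```

Where the data are sparse, the posterior over w is broad, A^{-1} is large in the relevant directions, and the error bars widen, which is the behaviour shown in figure 44.6.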

Within the Bayesian framework for data modelling, it is easy to improve our probabilistic models. For example, if we believe that some input variables in a problem may be irrelevant to the predicted quantity, but we don't know which, we can define a new model with multiple hyperparameters that captures the idea of uncertain input variable relevance (MacKay, 1994b; Neal, 1996; MacKay, 1995b); these models then infer automatically from the data which are the relevant input variables for a problem.

44.5 Exercises

Exercise [4] How to measure a classifier's quality. You've just written a new classification algorithm and want to measure how well it performs on a test set, and compare it with other classifiers. What performance measure should you use?

There are several standard answers. Let's assume the classifier gives an output y(x), where x is the input, which we won't discuss further, and that the true target value is t. In the simplest discussions of classifiers, both y and t are binary variables, but you might care to consider cases where y and t are more general objects also.

The most widely used measure of performance on a test set is the error rate, the fraction of misclassifications made by the classifier. This measure forces the classifier to give a 0/1 output and ignores any additional information that the classifier might be able to offer, for example an indication of the firmness of a prediction.

Unfortunately, the error rate does not necessarily measure how informative a classifier's output is. Consider frequency tables showing the joint frequency of the 0/1 output y of a classifier (horizontal axis) and the true 0/1 variable t (vertical axis). The numbers that we'll show are percentages. The error rate e is the sum of the two off-diagonal numbers, which we could call the false positive rate e+ and the false negative rate e−. Of the following three classifiers, A and B have the same error rate of 10% and C has a greater error rate of 12%.

  Classifier A            Classifier B            Classifier C
           y                       y                       y
           0    1                  0    1                  0    1
   t  0   90    0          t  0   80   10          t  0   78   12
      1   10    0             1    0   10             1    0   10

But clearly classifier A, which simply guesses that the outcome is 0 for all cases, is conveying no information at all about t; whereas classifier B has an informative output: if y = 0 then we are sure that t really is zero; and if y = 1 then there is a 50% chance that t = 1, as compared to the prior probability P(t = 1) = 0.1. Classifier C is slightly less informative than B, but it is still much more useful than the information-free classifier A.

  How common sense ranks the classifiers: (best) B > C > A (worst).
  How error rate ranks the classifiers: (best) A = B > C (worst).

One way to improve on the error rate as a performance measure is to report the pair (e+, e−), the false positive error rate and the false negative error rate, which are (0, 0.1) and (0.1, 0) for classifiers A and B. It is especially important to distinguish between these two error probabilities in applications where the two sorts of error have different associated costs. However, there are a couple of problems with the 'error rate pair'.

First, if I simply told you that classifier A has error rates (0, 0.1) and B has error rates (0.1, 0), it would not be immediately evident that classifier A is actually utterly worthless. Surely we should have a performance measure that gives the worst possible score to A!
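A small sketch, assuming numpy, of how the error rate and the pair (e+, e−) are read off a joint frequency table; the three tables are the ones reconstructed above for classifiers A, B and C, with rows indexed by t and columns by y.

```python
# Error rate e = e+ + e- computed from a joint frequency table (percentages),
# for the classifiers A, B and C as reconstructed above.  Rows: t = 0, 1;
# columns: y = 0, 1.  Illustrative sketch only.
import numpy as np

tables = {
    "A": np.array([[90, 0], [10, 0]]),    # always outputs 0
    "B": np.array([[80, 10], [0, 10]]),   # if y = 1 then P(t=1 | y=1) = 0.5
    "C": np.array([[78, 12], [0, 10]]),
}

for name, F in tables.items():
    e_plus = F[0, 1] / 100.0              # false positives:  t = 0, y = 1
    e_minus = F[1, 0] / 100.0             # false negatives:  t = 1, y = 0
    print(name, "e =", e_plus + e_minus, "(e+, e-) =", (e_plus, e_minus))
# A and B both have e = 0.10 and C has e = 0.12, even though A is uninformative.
```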

Second, if we turn to a multiple-class classification problem such as digit recognition, then the number of types of error increases from two to 10 × 9 = 90, one for each possible confusion of class t with t'. It would be nice to have some sensible way of collapsing these 90 numbers into a single rankable number that makes more sense than the error rate.

Another reason for not liking the error rate is that it doesn't give a classifier credit for accurately specifying its uncertainty. Consider classifiers that have three outputs available, 0, 1 and a rejection class, '?', which indicates that the classifier is not sure. Consider two such classifiers, D and E, each described by a joint frequency table, in percentages, over the target t and the output y ∈ {0, 1, ?}. Both of these classifiers have (e+, e−, r) = (6%, 0%, 11%). But are they equally good classifiers?

Compare classifier E with C. The two classifiers are equivalent. E is just C in disguise: we could make E by taking the output of C and tossing a coin when C says 1 in order to decide whether to give output 1 or ?. So E is equal to C and thus inferior to B. Now compare D with B. Can you justify the suggestion that D is a more informative classifier than B, and thus is superior to E? Yet D and E have the same (e+, e−, r) scores.

People often plot error-reject curves (also known as ROC curves; ROC stands for 'receiver operating characteristic') which show the total error rate e = (e+ + e−) versus r as r is allowed to vary from 0 to 1, and use these curves to compare classifiers (figure 44.7). [In the special case of binary classification problems, e+ may be plotted versus e− instead.] But as we have seen, error rates can be undiscerning performance measures. Does plotting one error rate as a function of another make this weakness of error rates go away?

Figure 44.7. An error-reject curve, showing error rate versus rejection rate. Some people use the area under this curve as a measure of classifier quality.

For this exercise, either construct an explicit example demonstrating that the error-reject curve, and the area under it, are not necessarily good ways to compare classifiers; or prove that they are.

As a suggested alternative method for comparing classifiers, consider the mutual information between the output and the target,

I(T; Y) ≡ H(T) − H(T|Y) = Σ_{y,t} P(y) P(t|y) log [ P(t|y) / P(t) ],    (44.12)

which measures how many bits the classifier's output conveys about the target. Evaluate the mutual information for classifiers A-E above. Investigate this performance measure and discuss whether it is a useful one. Does it have practical drawbacks?
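Equation (44.12) is easy to evaluate directly from a joint frequency table. The following is a minimal sketch, assuming numpy; the two tables passed to it are the reconstructed tables for classifiers A and B above, and the same function applies unchanged to tables that include a rejection column.

```python
# A sketch of the suggested performance measure, equation (44.12): the mutual
# information I(T;Y), in bits, computed from a joint frequency table F[t, y]
# (rows: target t; columns: classifier output y, which may include a '?' column).
import numpy as np

def mutual_information_bits(F):
    P = F / F.sum()                           # joint P(t, y)
    Pt = P.sum(axis=1, keepdims=True)         # marginal P(t)
    Py = P.sum(axis=0, keepdims=True)         # marginal P(y)
    PtPy = Pt @ Py                            # product of marginals
    nz = P > 0                                # use the convention 0 log 0 = 0
    return float(np.sum(P[nz] * np.log2(P[nz] / PtPy[nz])))

# The tables for classifiers A and B as reconstructed above (percentages).
A = np.array([[90.0, 0.0], [10.0, 0.0]])
B = np.array([[80.0, 10.0], [0.0, 10.0]])
print(mutual_information_bits(A))   # 0.0 bits: the information-free classifier
print(mutual_information_bits(B))   # > 0 bits: B's output conveys information about t
```

Because the information-free classifier A scores exactly zero, this measure matches the common-sense ranking of A, B and C where the error rate does not.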
