A quantum-statistical-mechanical extension of Gaussian mixture model


International Workshop on Statistical-Mechanical Informatics 2007 (IW-SMI 2007)
Journal of Physics: Conference Series 95 (2008), IOP Publishing

Kazuyuki Tanaka^1 and Koji Tsuda^2

^1 Graduate School of Information Sciences, Tohoku University, 6-3-09 Aramaki-aza-aoba, Aoba-ku, Sendai 980-8579, Japan
^2 Max Planck Institute for Biological Cybernetics, Spemannstrasse 38, Tübingen, Germany

E-mail: kazu@smapp.is.tohoku.ac.jp

Abstract. We propose an extension of Gaussian mixture models from the statistical-mechanical point of view. Conventional Gaussian mixture models are formulated so as to assign every point of a given data set to one of several classes. We introduce quantum states constructed by superposing the conventional classes in linear combinations. Our extension provides a new classification algorithm based on the linear response formulas of statistical mechanics.

1 Introduction

Statistical approaches are widely applied to data mining in computer science. Real-world data contain fluctuations, and systematic statistical methods are needed to treat such data. Many statistical approaches to inference in data mining are based on the Bayes formula and maximum likelihood [1]. One of the fundamental approaches is classification by means of Gaussian mixture models. In a Gaussian mixture model, each data point is generated from one of several Gaussian distributions with given means and variances; if we wish to divide a given data set into three classes, we prepare three Gaussian distributions. For classification with Gaussian mixture models, a temperature has been introduced in order to search for the maximum of the marginal likelihood with respect to the hyperparameters by an iterative procedure [2]; that procedure is based on the idea of simulated annealing in Markov random fields. Another physical approach to finding optimal solutions of massive probabilistic inference problems is quantum annealing, which is formulated by replacing the thermal effect of simulated annealing with a quantum effect [3, 4, 5, 6, 7]. Quantum annealing has been introduced in probabilistic image processing as well as in optimization problems [8], and it has been proved that quantum annealing can find an optimal solution more quickly than simulated annealing [9]. Moreover, quantum error-correcting codes have been proposed and performance limits for them have been derived by using gauge theory from physics [10, 11, 12]. On the other hand, the edge state in image processing has been extended to a quantum state [13]. Thus, many physicists are interested in applications of quantum fluctuations to computer science.

In applying quantum effects to computer science, there are mainly two different viewpoints. One of them is annealing, which is applied to find optimal solutions of massive computational problems. The other is to adopt a quantum state itself as a state of a computational or probabilistic model. Real data often include points to which it is difficult to assign a single label in a statistical classification, and it is therefore interesting, from both the statistical and the physical viewpoint, to introduce quantum states as label states in classification. In this paper, we extend the conventional Gaussian mixture model to a quantum Gaussian mixture model and give an algorithm to determine the estimates of the hyperparameters of the proposed model. In section 2, we summarize the conventional Gaussian mixture model and the scheme for determining its hyperparameters in the statistical framework. In section 3, we propose the quantum Gaussian mixture model. The probability density function of the conventional Gaussian mixture model is first expressed in terms of a density matrix representation; this density matrix is diagonal, and we then introduce off-diagonal elements into the reformulated representation. In order to derive the extremum condition of the marginal likelihood of the quantum Gaussian mixture model with respect to the hyperparameters, we adopt linear response formulas for density matrices. In sections 4 and 5, we give some numerical experiments and concluding remarks, respectively.

2 Conventional Gaussian Mixture Model

In the present section, we explain the classification of given data by the conventional Gaussian mixture model. The framework is based on maximum likelihood estimation and the Bayes formula of statistics. We introduce a set of data consisting of N real numbers y_0, y_1, ..., y_{N-1}, expressed as an N-dimensional column vector y = (y_0, y_1, ..., y_{N-1})^t, and we consider classifying the given data into K classes. The label assigned to y_i is denoted by x_i, and the set of labels is expressed as an N-dimensional column vector x = (x_0, x_1, ..., x_{N-1})^t. Each y_i can take any real value in the interval (-\infty, +\infty), and each label variable x_i takes one of the integers 0, 1, 2, ..., K-1. We now consider inferring how the given data are classified into the K classes by means of the Bayes formula and maximum likelihood estimation; x and y are regarded as sets of random variables for the parameters and the data, respectively. We assume that the set of parameters x is generated according to the following prior probability:

P(x|\alpha) = \prod_{i=0}^{N-1} \sum_{k=0}^{K-1} \alpha_k \delta_{x_i,k},    (1)

where the set \alpha \equiv \{\alpha_0, \alpha_1, \ldots, \alpha_{K-1}\} satisfies the condition \sum_{k=0}^{K-1} \alpha_k = 1 and \delta_{a,b} is the Kronecker delta. The given data y are generated from the parameters x according to the following conditional probability:

P(y|x, \mu, \sigma) = \prod_{i=0}^{N-1} g_{x_i}(y_i|\mu, \sigma),    (2)

g_k(y|\mu, \sigma) \equiv \frac{1}{\sqrt{2\pi\sigma_k^2}} \exp\Bigl( -\frac{1}{2\sigma_k^2} (y - \mu_k)^2 \Bigr).    (3)

In statistics, the sets \mu \equiv \{\mu_0, \mu_1, \ldots, \mu_{K-1}\} and \sigma \equiv \{\sigma_0, \sigma_1, \ldots, \sigma_{K-1}\}, as well as \alpha, are referred to as hyperparameters.
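
The generative model in equations (1)-(3) can be made concrete with a short numerical sketch. The following Python fragment is ours, not part of the original paper; the function names and the particular hyperparameter values are purely illustrative. It draws each label x_i from the prior (1) and then draws y_i from the Gaussian component selected by that label, as in equations (2) and (3).

```python
# Illustrative sketch (not from the paper): sampling from the generative
# model of equations (1)-(3).  alpha are the mixing weights, (mu, sigma)
# the component means and standard deviations.
import numpy as np

def gaussian_density(y, mu_k, sigma_k):
    """Component density g_k(y | mu, sigma) of equation (3)."""
    return np.exp(-(y - mu_k) ** 2 / (2.0 * sigma_k ** 2)) / np.sqrt(2.0 * np.pi * sigma_k ** 2)

def sample_mixture(n, alpha, mu, sigma, seed=0):
    """Draw n pairs (x_i, y_i): x_i from the prior (1), y_i from g_{x_i} as in (2)."""
    rng = np.random.default_rng(seed)
    x = rng.choice(len(alpha), size=n, p=alpha)   # labels x_i with P(x_i = k) = alpha_k
    y = rng.normal(mu[x], sigma[x])               # y_i ~ N(mu_{x_i}, sigma_{x_i}^2)
    return x, y

if __name__ == "__main__":
    alpha = np.array([0.5, 0.3, 0.2])
    mu = np.array([0.0, 4.0, 8.0])
    sigma = np.array([1.0, 1.0, 1.5])
    x, y = sample_mixture(1000, alpha, mu, sigma)
    print(y[:3], gaussian_density(y[0], mu[x[0]], sigma[x[0]]))
```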

Equations (1) and (2) give the following joint probability of x and y:

P(x, y|\alpha, \mu, \sigma) = P(y|x, \mu, \sigma)\, P(x|\alpha) = \prod_{i=0}^{N-1} \alpha_{x_i} g_{x_i}(y_i|\mu, \sigma).    (4)

From the statistical point of view, the estimates of \alpha, \mu and \sigma, namely \hat{\alpha} \equiv \{\hat{\alpha}_0, \ldots, \hat{\alpha}_{K-1}\}, \hat{\mu} \equiv \{\hat{\mu}_0, \ldots, \hat{\mu}_{K-1}\} and \hat{\sigma} \equiv \{\hat{\sigma}_0, \ldots, \hat{\sigma}_{K-1}\}, are determined from the given data y so as to maximize the marginal likelihood defined by

P(y|\alpha, \mu, \sigma) = \sum_{x_0=0}^{K-1} \sum_{x_1=0}^{K-1} \cdots \sum_{x_{N-1}=0}^{K-1} P(x, y|\alpha, \mu, \sigma) = \prod_{i=0}^{N-1} \sum_{x_i=0}^{K-1} \alpha_{x_i} g_{x_i}(y_i|\mu, \sigma).    (5)

The extremum conditions of P(y|\alpha, \mu, \sigma) with respect to \alpha, \mu and \sigma can be given as

\hat{\mu}_k = \frac{\sum_{i=0}^{N-1} y_i \Psi_i(k|\hat{\mu}, \hat{\sigma}, \hat{\alpha})}{\sum_{i=0}^{N-1} \Psi_i(k|\hat{\mu}, \hat{\sigma}, \hat{\alpha})},    (6)

\hat{\alpha}_k = \frac{1}{N} \sum_{i=0}^{N-1} \Psi_i(k|\hat{\mu}, \hat{\sigma}, \hat{\alpha}),    (7)

\hat{\sigma}_k^2 = \frac{\sum_{i=0}^{N-1} (y_i - \hat{\mu}_k)^2 \Psi_i(k|\hat{\mu}, \hat{\sigma}, \hat{\alpha})}{\sum_{i=0}^{N-1} \Psi_i(k|\hat{\mu}, \hat{\sigma}, \hat{\alpha})},    (8)

\Psi_i(k|\mu, \sigma, \alpha) \equiv \frac{\alpha_k g_k(y_i|\mu, \sigma)}{\sum_{k'=0}^{K-1} \alpha_{k'} g_{k'}(y_i|\mu, \sigma)}.    (9)

By solving equations (6)-(8) numerically by iteration, we obtain the estimates \hat{\mu}, \hat{\sigma} and \hat{\alpha} for the given data y. By using the Bayes formula, we derive the posterior probability of the set of parameters x for the given data y as follows:

P(x|y, \hat{\mu}, \hat{\sigma}, \hat{\alpha}) = \frac{P(x, y|\hat{\mu}, \hat{\sigma}, \hat{\alpha})}{P(y|\hat{\mu}, \hat{\sigma}, \hat{\alpha})} = \prod_{i=0}^{N-1} \Psi_i(x_i|\hat{\mu}, \hat{\sigma}, \hat{\alpha}).    (10)

The estimate \hat{x} \equiv (\hat{x}_0, \hat{x}_1, \ldots, \hat{x}_{N-1})^t is determined so as to maximize the posterior probability P(x|y, \hat{\mu}, \hat{\sigma}, \hat{\alpha}) with respect to x.

3 Quantum Gaussian Mixture Model

In this section, we extend the conventional Gaussian mixture model to a quantum Gaussian mixture model. Our extension is based on the construction of quantum states by superposing the conventional classes in linear combinations. We also give the determination of the hyperparameters by maximizing the likelihood of the quantum Gaussian mixture model. For any matrix A, the exponential function e^A and the logarithm function ln A are defined by

e^A \equiv \sum_{n=0}^{+\infty} \frac{1}{n!} A^n, \qquad \ln A \equiv -\sum_{n=1}^{+\infty} \frac{1}{n} (I - A)^n.    (11)
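
For reference, equations (6)-(9) define a simple fixed-point iteration (they coincide with the usual EM updates for a one-dimensional Gaussian mixture). The Python sketch below is ours, not the authors' code; the initialization and the function names are illustrative choices.

```python
# Illustrative sketch (not from the paper): fixed-point iteration for
# equations (6)-(8), with the responsibilities Psi_i(k) of equation (9)
# recomputed at every sweep.
import numpy as np

def responsibilities(y, alpha, mu, sigma):
    """Psi_i(k | mu, sigma, alpha) of equation (9); returns an (N, K) array."""
    d = y[:, None] - mu                     # shape (N, K)
    g = np.exp(-d ** 2 / (2.0 * sigma ** 2)) / np.sqrt(2.0 * np.pi * sigma ** 2)
    w = alpha * g
    return w / w.sum(axis=1, keepdims=True)

def fit_gmm(y, K, n_iter=100, seed=0):
    """Iterate equations (6)-(8) starting from a crude initialization."""
    rng = np.random.default_rng(seed)
    alpha = np.full(K, 1.0 / K)
    mu = rng.choice(y, size=K, replace=False).astype(float)
    sigma = np.full(K, y.std())
    for _ in range(n_iter):
        psi = responsibilities(y, alpha, mu, sigma)                       # eq. (9)
        nk = psi.sum(axis=0)
        mu = (psi * y[:, None]).sum(axis=0) / nk                          # eq. (6)
        alpha = nk / len(y)                                               # eq. (7)
        sigma = np.sqrt((psi * (y[:, None] - mu) ** 2).sum(axis=0) / nk)  # eq. (8)
    return alpha, mu, sigma

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    y = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(5.0, 1.0, 700)])
    print(fit_gmm(y, K=2))
```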

We introduce the following two K x K real diagonal matrices F and G(y_i):

F \equiv \mathrm{diag}\bigl( -\ln\alpha_0,\, -\ln\alpha_1,\, \ldots,\, -\ln\alpha_{K-1} \bigr),    (12)

G(y_i) \equiv \mathrm{diag}\bigl( g_0(y_i),\, g_1(y_i),\, \ldots,\, g_{K-1}(y_i) \bigr).    (13)

The marginal likelihood P(y|\mu, \sigma, \alpha) in equation (5) is then replaced by the following density-matrix expression:

P(y|\mu, \sigma, \alpha) = \prod_{i=0}^{N-1} \frac{\mathrm{tr}\, e^{-H(y_i)}}{\mathrm{tr}\, e^{-F}},    (14)

where

H(y_i) \equiv F - \ln G(y_i) = \mathrm{diag}\bigl( -\ln(\alpha_0 g_0(y_i)),\, -\ln(\alpha_1 g_1(y_i)),\, \ldots,\, -\ln(\alpha_{K-1} g_{K-1}(y_i)) \bigr).    (15)

Now we extend the diagonal matrix F to a K x K real symmetric matrix as follows:

F = \begin{pmatrix} -\ln\alpha_0 & -\gamma & \cdots & -\gamma \\ -\gamma & -\ln\alpha_1 & \cdots & -\gamma \\ \vdots & \vdots & \ddots & \vdots \\ -\gamma & -\gamma & \cdots & -\ln\alpha_{K-1} \end{pmatrix}.    (16)

The K x K matrix H(y_i) can then be rewritten as

H(y_i) \equiv \sum_{k=0}^{K-1} \sum_{k'=0}^{K-1} B^{(i)}_{kk'} X_{kk'} = \begin{pmatrix} -\ln(\alpha_0 g_0(y_i)) & -\gamma & \cdots & -\gamma \\ -\gamma & -\ln(\alpha_1 g_1(y_i)) & \cdots & -\gamma \\ \vdots & \vdots & \ddots & \vdots \\ -\gamma & -\gamma & \cdots & -\ln(\alpha_{K-1} g_{K-1}(y_i)) \end{pmatrix},    (17)

where X_{kk'} is the K x K matrix whose (l, l')-component \langle l|X_{kk'}|l'\rangle (l, l' = 0, 1, \ldots, K-1) is given by

\langle l|X_{kk'}|l'\rangle \equiv \delta_{k,l}\, \delta_{k',l'}.    (18)
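
The structure of equations (16)-(18) can be illustrated numerically. The sketch below is ours, not part of the original paper, and it assumes the sign convention adopted in the reconstruction above (diagonal elements -ln(alpha_k g_k(y_i)) and off-diagonal elements -gamma of H(y_i)). The quantity tr[X_kk e^{-H(y_i)}] / tr e^{-H(y_i)} is simply the k-th diagonal element of the normalized matrix e^{-H(y_i)}, and it reduces to the classical responsibility Psi_i(k) of equation (9) when gamma = 0.

```python
# Illustrative sketch (not from the paper): building the K x K energy
# matrix H(y_i) of equation (17) and the diagonal weights
# tr[X_kk e^{-H(y_i)}] / tr e^{-H(y_i)} used in place of the classical
# responsibilities.  The sign convention follows the reconstruction above.
import numpy as np
from scipy.linalg import expm

def gaussian_density(y, mu, sigma):
    return np.exp(-(y - mu) ** 2 / (2.0 * sigma ** 2)) / np.sqrt(2.0 * np.pi * sigma ** 2)

def energy_matrix(y_i, alpha, mu, sigma, gamma):
    """H(y_i): -ln(alpha_k g_k(y_i)) on the diagonal, -gamma elsewhere."""
    K = len(alpha)
    H = np.full((K, K), -gamma)
    np.fill_diagonal(H, -np.log(alpha * gaussian_density(y_i, mu, sigma)))
    return H

def quantum_weights(y_i, alpha, mu, sigma, gamma):
    """Diagonal of e^{-H(y_i)} / tr e^{-H(y_i)}; equals Psi_i(k) when gamma = 0."""
    rho = expm(-energy_matrix(y_i, alpha, mu, sigma, gamma))
    return np.diag(rho) / np.trace(rho)

if __name__ == "__main__":
    alpha = np.array([0.4, 0.35, 0.25])
    mu = np.array([50.0, 120.0, 200.0])
    sigma = np.array([20.0, 25.0, 20.0])
    print(quantum_weights(100.0, alpha, mu, sigma, gamma=0.0))  # classical responsibilities
    print(quantum_weights(100.0, alpha, mu, sigma, gamma=0.2))  # weights mixed by the off-diagonal term
```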

The coefficients B^{(i)}_{kk'} (k, k' = 0, 1, \ldots, K-1) are defined by

B^{(i)}_{kk} \equiv -\ln\bigl( \alpha_k g_k(y_i) \bigr), \qquad B^{(i)}_{kk'} \equiv -\gamma \quad (k \neq k').    (19)

The function P(y|\mu, \sigma, \alpha) defined by equations (14) and (17) does not always satisfy the normalization condition \int P(y|\mu, \sigma, \alpha)\, dy_0\, dy_1 \cdots dy_{N-1} = 1, so it cannot be regarded as a probability density function of the data set y. In the present paper we nevertheless treat P(y|\mu, \sigma, \alpha) as an approximate marginal likelihood and formulate the quantum-mechanical extension of the Gaussian mixture model on this basis. By using linear response theory, we have

\frac{\partial}{\partial B^{(i)}_{kk'}} \ln\bigl( \mathrm{tr}\, e^{-H(y_i)} \bigr) = -\frac{1}{\mathrm{tr}\, e^{-H(y_i)}}\, \mathrm{tr} \int_0^1 e^{-(1-\lambda) H(y_i)}\, X_{kk'}\, e^{-\lambda H(y_i)}\, d\lambda = -\frac{\mathrm{tr}\bigl[ X_{kk'} e^{-H(y_i)} \bigr]}{\mathrm{tr}\, e^{-H(y_i)}}.    (20)

The extremum conditions of the marginal likelihood P(y|\mu, \sigma, \alpha) with respect to \mu, \sigma and \alpha can then be derived as

\hat{\mu}_k = \frac{\sum_{i=0}^{N-1} y_i\, \mathrm{tr}\bigl[ X_{kk} e^{-\hat{H}(y_i)} \bigr] / \mathrm{tr}\, e^{-\hat{H}(y_i)}}{\sum_{i=0}^{N-1} \mathrm{tr}\bigl[ X_{kk} e^{-\hat{H}(y_i)} \bigr] / \mathrm{tr}\, e^{-\hat{H}(y_i)}},    (21)

\hat{\sigma}_k^2 = \frac{\sum_{i=0}^{N-1} (y_i - \hat{\mu}_k)^2\, \mathrm{tr}\bigl[ X_{kk} e^{-\hat{H}(y_i)} \bigr] / \mathrm{tr}\, e^{-\hat{H}(y_i)}}{\sum_{i=0}^{N-1} \mathrm{tr}\bigl[ X_{kk} e^{-\hat{H}(y_i)} \bigr] / \mathrm{tr}\, e^{-\hat{H}(y_i)}},    (22)

\hat{\alpha}_k = \exp\Biggl( \mathrm{tr}\Bigl[ X_{kk} \ln\Bigl( \frac{1}{N} \sum_{i=0}^{N-1} \frac{e^{-\hat{H}(y_i)}}{\mathrm{tr}\, e^{-\hat{H}(y_i)}} \Bigr) \Bigr] \Biggr).    (23)

For given data y, we obtain the estimates of \mu, \sigma and \alpha by solving these equations numerically by iteration. We remark that equations (21)-(23) reduce to equations (6)-(8) by setting \gamma = 0. Once the energy matrix H(y_i) has been introduced, the posterior distribution P(x|y, \hat{\mu}, \hat{\sigma}, \hat{\alpha}) can also be replaced by the density matrix

\hat{P}(y, \hat{\mu}, \hat{\sigma}, \hat{\alpha}) = \frac{e^{-E}}{\mathrm{tr}\, e^{-E}},    (24)

E \equiv \hat{H}(y_0) \otimes I \otimes \cdots \otimes I + I \otimes \hat{H}(y_1) \otimes \cdots \otimes I + \cdots + I \otimes I \otimes \cdots \otimes \hat{H}(y_{N-1}).    (25)

The estimate of the label of y_i is given by the K-dimensional eigenvector \hat{x}_i = (\hat{x}_i^{(0)}, \hat{x}_i^{(1)}, \ldots, \hat{x}_i^{(K-1)})^t corresponding to the minimal eigenvalue of the K x K local energy matrix \hat{H}(y_i); all eigenvectors are normalized to unit length. The eigenvector corresponding to the minimal eigenvalue of the global energy matrix E is then given by the tensor product

\hat{x} = \hat{x}_0 \otimes \hat{x}_1 \otimes \hat{x}_2 \otimes \cdots \otimes \hat{x}_{N-1}.    (26)
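
As in the classical case, equations (21)-(23) suggest a simple iterative scheme, and the labelling rule below equation (26) only requires diagonalizing each K x K matrix H(y_i). The sketch below is ours, not the authors' implementation; it relies on the same reconstructed sign convention as the previous sketch, and equation (23) is implemented as the exponential of the diagonal of the logarithm of the averaged normalized density matrix, following the reconstruction above.

```python
# Illustrative sketch (not from the paper): one sweep of the iteration
# implied by equations (21)-(23) and the labelling rule of equation (26).
import numpy as np
from scipy.linalg import expm, logm

def energy_matrix(y_i, alpha, mu, sigma, gamma):
    """H(y_i) of equation (17), same convention as in the previous sketch."""
    g = np.exp(-(y_i - mu) ** 2 / (2.0 * sigma ** 2)) / np.sqrt(2.0 * np.pi * sigma ** 2)
    H = np.full((len(alpha), len(alpha)), -gamma)
    np.fill_diagonal(H, -np.log(alpha * g))
    return H

def quantum_gmm_update(y, alpha, mu, sigma, gamma):
    """One sweep of equations (21)-(23)."""
    rhos = [expm(-energy_matrix(yi, alpha, mu, sigma, gamma)) for yi in y]
    rhos = [r / np.trace(r) for r in rhos]               # e^{-H(y_i)} / tr e^{-H(y_i)}
    W = np.array([np.diag(r) for r in rhos])             # tr[X_kk e^{-H(y_i)}] / tr e^{-H(y_i)}
    nk = W.sum(axis=0)
    mu_new = (W * y[:, None]).sum(axis=0) / nk                               # eq. (21)
    sigma_new = np.sqrt((W * (y[:, None] - mu_new) ** 2).sum(axis=0) / nk)   # eq. (22)
    alpha_new = np.exp(np.diag(logm(sum(rhos) / len(y)))).real               # eq. (23)
    return alpha_new, mu_new, sigma_new

def label(y_i, alpha, mu, sigma, gamma):
    """Label of y_i: unit eigenvector of H(y_i) for its minimal eigenvalue."""
    evals, evecs = np.linalg.eigh(energy_matrix(y_i, alpha, mu, sigma, gamma))
    return evecs[:, np.argmin(evals)]
```

Iterating quantum_gmm_update to convergence and then applying label to each y_i reproduces the overall structure of the procedure used in the experiments of section 4, up to initialization and stopping details that the paper does not spell out.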

4 Numerical Experiments

In this section we give some numerical experiments in which the hyperparameters \alpha, \mu and \sigma are estimated when the data y are the grey levels of a standard image. The obtained set of labels can then be regarded as a segmentation of the given standard image. By using equations (21)-(23), we carry out numerical experiments for the monochrome image shown in figure 1(a). Table 1 gives the estimates of the hyperparameters \hat{\alpha}, \hat{\mu} and \hat{\sigma} and the values of the logarithm of the marginal likelihood, L(y, \hat{\alpha}, \hat{\mu}, \hat{\sigma}) \equiv (1/N) \ln P(y|\hat{\alpha}, \hat{\mu}, \hat{\sigma}), for K = 3. For the estimates \alpha = \hat{\alpha}, \mu = \hat{\mu} and \sigma = \hat{\sigma}, the eigenvectors \hat{x}_i = (\hat{x}_i^{(0)}, \hat{x}_i^{(1)}, \hat{x}_i^{(2)})^t corresponding to the minimal eigenvalue of the 3 x 3 matrix \hat{H}(y_i) are drawn in figures 1(b)-(d), where \hat{x}_i = (1, 0, 0)^t, (0, 1, 0)^t and (0, 0, 1)^t correspond to red, green and blue, respectively.

Figure 1. Image segmentation based on the quantum Gaussian mixture model for K = 3. (a) Original image (256 grades). (b) Segmented image \hat{x} for \gamma = 0. (c) Segmented image \hat{x} for \gamma = 0.2. (d) Segmented image \hat{x} for \gamma = 0.4.

The probability density function P(y|\hat{\alpha}, \hat{\mu}, \hat{\sigma}) \equiv \mathrm{tr}\, e^{-H(y)} / \mathrm{tr}\, e^{-F} and the histogram of the given image y are shown in figure 2 for \gamma = 0, 0.2 and 0.4. For \gamma = 0.2 we find a very soft peak around y = 72. This soft peak includes some regions which are segmented as red areas for \gamma = 0.2 but as green areas for \gamma = 0.
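
The curves of figure 2 are values of the per-point density tr e^{-H(y)} / tr e^{-F} on the grey-level range [0, 255]. The sketch below is ours; the hyperparameter values in it are arbitrary placeholders rather than the estimates of table 1, and it only shows how such a curve can be evaluated for different values of gamma.

```python
# Illustrative sketch (not from the paper): evaluating the per-point density
# P(y | alpha, mu, sigma) = tr e^{-H(y)} / tr e^{-F} on a grey-level grid,
# as plotted in figure 2.  The hyperparameter values are placeholders, not
# the estimates of table 1.
import numpy as np
from scipy.linalg import expm

def energy_matrix(y, alpha, mu, sigma, gamma):
    g = np.exp(-(y - mu) ** 2 / (2.0 * sigma ** 2)) / np.sqrt(2.0 * np.pi * sigma ** 2)
    H = np.full((len(alpha), len(alpha)), -gamma)
    np.fill_diagonal(H, -np.log(alpha * g))
    return H

def density_curve(alpha, mu, sigma, gamma, grid):
    F = np.full((len(alpha), len(alpha)), -gamma)
    np.fill_diagonal(F, -np.log(alpha))
    z_f = np.trace(expm(-F))                      # tr e^{-F}
    return np.array([np.trace(expm(-energy_matrix(y, alpha, mu, sigma, gamma))) / z_f
                     for y in grid])

if __name__ == "__main__":
    alpha = np.array([0.3, 0.4, 0.3])             # placeholder hyperparameters
    mu = np.array([60.0, 130.0, 200.0])
    sigma = np.array([20.0, 25.0, 20.0])
    grid = np.arange(256.0)
    for gamma in (0.0, 0.2, 0.4):
        p = density_curve(alpha, mu, sigma, gamma, grid)
        print(gamma, p.sum())                     # the discrete sum need not equal 1 for gamma != 0
```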

Table 1. Estimates of the hyperparameters \hat{\alpha}_k, \hat{\mu}_k and \hat{\sigma}_k (k = 0, 1, 2) and values of L(y, \hat{\alpha}, \hat{\mu}, \hat{\sigma}) \equiv (1/N) \ln P(y|\hat{\alpha}, \hat{\mu}, \hat{\sigma}) for each value of \gamma.

5 Concluding Remarks

In this paper, we have extended the probability density function of the conventional Gaussian mixture model to a density matrix. The formulation is based on Bayesian statistics and the maximum likelihood method. In the estimation of the hyperparameters we have to derive the extremum conditions, and for this we employ the linear response formulas of quantum-statistical models. We have constructed the estimation algorithm as an iterative procedure and have presented some numerical experiments.

The function P(y|\mu, \sigma, \alpha) defined by equations (14) and (17) does not always satisfy the normalization condition, so it cannot be regarded as a probability density function of the data set y. In order to guarantee the normalization condition, we have to define

P(y|\mu, \sigma, \alpha) = \prod_{i=0}^{N-1} \frac{\mathrm{tr}\, e^{-H(y_i)}}{\int_{-\infty}^{+\infty} \mathrm{tr}\, e^{-H(z)}\, dz}    (27)

instead of equation (14). We should formulate our quantum-mechanical extension starting from equation (27); however, it may be difficult to calculate the integral \int_{-\infty}^{+\infty} \mathrm{tr}\, e^{-H(z)}\, dz analytically. This remains a future problem.

Density matrices include quantum effects and are based on states constructed by superposing classical states. For example, when the three possible classical states are denoted by the three-dimensional vectors (1, 0, 0)^t, (0, 1, 0)^t and (0, 0, 1)^t, all possible quantum states are expressed as linear combinations of these three vectors in the extension to a density matrix. In some respects, quantum statistical models are expected to go beyond conventional statistical models; for example, they may provide new optimal solutions in statistical inference. In fact, our numerical experiments yield some nice segmentation results when quantum states are introduced into the Gaussian mixture model: the quantum Gaussian mixture model succeeds in splitting some regions from the background which the conventional Gaussian mixture model cannot separate.

In the problems treated in this paper, we assume that there are no interactions between any pair of components y_i and y_j of the given data y = (y_0, y_1, ..., y_{N-1})^t, and we do not consider interactions between any pair of elements of the set of labels. Thus the density matrix in equation (14) factorizes with respect to each i. Though the global energy matrix E in equation (25) appears to be a very large matrix, its diagonalization reduces to that of the N local K x K matrices H(y_i); such problems can be regarded as one-body problems in statistical mechanics. If we consider interactions between some pairs of components, either in the given data or in the set of labels to be estimated, it becomes hard to express the density matrix in a factorizable form such as equation (14).
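
As a small numerical illustration of the normalization issue raised around equation (27), the following sketch (ours, not from the paper) evaluates the integral of tr e^{-H(z)} by quadrature for a two-component example; for gamma different from 0 the ratio to tr e^{-F} generally differs from 1, which is why P(y|\mu, \sigma, \alpha) is treated only as an approximate marginal likelihood.

```python
# Illustrative sketch (not from the paper): numerical check of the
# normalization integral appearing in equation (27).
import numpy as np
from scipy.integrate import quad
from scipy.linalg import expm

def trace_boltzmann(z, alpha, mu, sigma, gamma):
    """tr e^{-H(z)} with the same convention for H as in the earlier sketches."""
    g = np.exp(-(z - mu) ** 2 / (2.0 * sigma ** 2)) / np.sqrt(2.0 * np.pi * sigma ** 2)
    H = np.full((len(alpha), len(alpha)), -gamma)
    np.fill_diagonal(H, -np.log(alpha * g))
    return np.trace(expm(-H))

if __name__ == "__main__":
    alpha = np.array([0.5, 0.5])
    mu, sigma = np.array([0.0, 3.0]), np.array([1.0, 1.0])
    gamma = 0.2
    F = np.diag(-np.log(alpha)) - gamma * (1.0 - np.eye(2))
    z_f = np.trace(expm(-F))
    total, _ = quad(trace_boltzmann, -30.0, 30.0, args=(alpha, mu, sigma, gamma))
    print("integral of tr e^{-H(z)} / tr e^{-F}:", total / z_f)  # generally differs from 1 when gamma != 0
```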

Figure 2. Curves of the probability density function P(y|\hat{\alpha}, \hat{\mu}, \hat{\sigma}) \equiv \mathrm{tr}\, e^{-H(y)} / \mathrm{tr}\, e^{-F} together with the histogram of the given image y in figure 1(a). The red curves show the values of P(y|\hat{\alpha}, \hat{\mu}, \hat{\sigma}) for y in the interval [0, 255]; the hyperparameter \gamma is set to 0, 0.2 and 0.4. The histogram of the given image y in figure 1 is shown as the green area in each graph.

When the energy matrix is given in such an interacting form, the problem no longer reduces to the K x K local matrices and we have to diagonalize the full energy matrix. Many authors have investigated belief propagation and other advanced mean-field methods for statistical inference [15, 16, 17, 18, 19]. It is interesting to apply quantum belief propagation and other advanced quantum mean-field methods to such cases as approximate algorithms. Suzuki et al investigated quantum annealing algorithms by means of a quantum-statistical version of the Bethe approximation [20].

One of the quantum statistical-mechanical extensions of belief propagation corresponds to a quantum cluster variation method [21]. It is interesting to apply the quantum cluster variation method to statistical inference with quantum states and interactions in computer science. This is one of our future problems.

Acknowledgements

This work was partly supported by Grants-in-Aid for Scientific Research from the Ministry of Education, Culture, Sports, Science and Technology of Japan.

References
[1] Bishop C M 2006 Pattern Recognition and Machine Learning (New York: Springer)
[2] Ueda N and Nakano R 1998 Neural Networks
[3] Kadowaki T and Nishimori H 1998 Phys. Rev. E
[4] Brooke J, Bitko D, Rosenbaum T F and Aeppli G 1999 Science
[5] Santoro G E, Martoňák R, Tosatti E and Car R 2002 Science
[6] Santoro G E and Tosatti E 2006 J. Phys. A: Math. Gen. 39 R393
[7] Suzuki S and Okada M 2005 J. Phys. Soc. Jpn.
[8] Tanaka K and Horiguchi T 1997 IEICE Transactions (A) J80-A 2117 (in Japanese); translated in Electronics and Communications in Japan, Part 3: Fundamental Electronic Science 83 84
[9] Morita S and Nishimori H 2006 J. Phys. A: Math. Gen.
[10] Dennis E, Kitaev A, Landahl A and Preskill J 2002 J. Phys. A: Math. Gen.
[11] Nishimori H and Sollich P 2004 J. Phys. Soc. Jpn.
[12] Takeda K and Nishimori H 2004 Nucl. Phys. B
[13] Tanaka K 2002 J. Phys. A: Math. Gen. 35 R81
[14] Nishimori H 2001 Statistical Physics of Spin Glasses and Information Processing: An Introduction (New York: Oxford University Press)
[15] Opper M and Saad D (eds) 2001 Advanced Mean Field Methods: Theory and Practice (Cambridge: MIT Press)
[16] Tanaka K 2003 IEICE Transactions on Information and Systems E86-D 1228
[17] Ikeda S, Tanaka T and Amari S 2004 Neural Computation
[18] Yedidia J S, Freeman W T and Weiss Y 2005 IEEE Transactions on Information Theory
[19] Pelizzola A 2005 J. Phys. A: Math. Gen. 38 R309
[20] Suzuki S, Nishimori H and Suzuki M 2007 Phys. Rev. E
[21] Morita T 1957 J. Phys. Soc. Jpn.
