
First International School on Algebraic Statistics, STID, Menton (France), February 17-18, 2003

Toric Statistical Models: the Diaconis-Sturmfels algorithm for log-linear models. Tutorial.

Fabio Rapallo

1 Theoretical recalls

In this tutorial we show the practical applicability of the algorithm described in the paper by Diaconis & Sturmfels (1998), who apply the algebraic theory of toric ideals to define a Markov Chain Monte Carlo method for sampling from conditional distributions, through the notion of Markov basis. For some basic concepts from Computational Commutative Algebra, such as polynomial ideal, term-ordering and Gröbner basis, we refer to Kreuzer & Robbiano (2000). Fundamental references for the applications of Commutative Algebra to Probability and Statistics for finite sample spaces are Pistone et al. (2001a), Chapter 6, and Pistone et al. (2001b). For the theory of log-linear models, we refer to Fienberg (1980) and Agresti (2002).

Let $\mathcal{X}$ be a finite set and let

$$p_x = P[x] = \phi(T(x)), \qquad x \in \mathcal{X}, \tag{1}$$

be a probability model with sufficient statistic $T : \mathcal{X} \to \mathbb{N}^s$, and such that the distribution of a sample of independent and identically $P$-distributed random variables $X = (X_1, \ldots, X_N)$ is of the form

$$P^N = \psi(T_N) \quad \text{where} \quad T_N(X) = \sum_{k=1}^N T(X_k). \tag{2}$$

This means that the sufficient statistic of $X$ is the sum of the sufficient statistics of the one-dimensional random variables $X_k$, $k = 1, \ldots, N$. We denote

$$Y_t = \{(x_1, \ldots, x_N) : T_N(x_1, \ldots, x_N) = t\}, \tag{3}$$

i.e., the set of all samples with fixed value $t$ of the sufficient statistic $T_N$. It is known that the distribution of $X$ given $\{T_N = t\}$ is uniform on $Y_t$; in fact

$$P^N[(x_1, \ldots, x_N)] = \psi(t) = P^N[(x'_1, \ldots, x'_N)] \quad \text{for all } (x_1, \ldots, x_N), (x'_1, \ldots, x'_N) \in Y_t. \tag{4}$$

We introduce the space

$$F_t = \Big\{ f : \mathcal{X} \to \mathbb{N} : \sum_{x \in \mathcal{X}} f(x) T(x) = t \Big\}, \tag{5}$$

i.e., the set of all frequency tables obtained from samples with value $t$ of the sufficient statistic $T_N$. Denoting by $F$ the canonical mapping from $Y_t$ to $F_t$, the image probability of $P^N[\,\cdot \mid T_N = t]$ induced by $F$ is

$$H_t(f) = P^N[F^{-1}(f) \mid T_N = t] = \frac{\#\{(x_1, \ldots, x_N) : F(x_1, \ldots, x_N) = f\}}{\# Y_t}, \tag{6}$$

which is by definition the hypergeometric distribution on $F_t$. Simple computations show that

$$H_t(f) = \frac{N!}{\# Y_t} \prod_{x \in \mathcal{X}} (f(x)!)^{-1}. \tag{7}$$

A log-linear model defines restrictions on the parameter space, through constraints on the $p_x$'s. Such restrictions are the mathematical counterpart of statistical notions such as independence, conditional independence, symmetry and others. It is common practice to test whether the observed data match the statistical model or not. This can be done with a goodness-of-fit test, computing the maximum likelihood estimate of the cell counts and then using Pearson's statistic

$$C = \sum_{x \in \mathcal{X}} \frac{(f_{obs}(x) - \hat{f}(x))^2}{\hat{f}(x)}, \tag{8}$$

where $\hat{f}(x)$ is the maximum likelihood estimate of the count in $x$. Large values of the test statistic indicate a departure from the null hypothesis, so we have a one-tail test. The usual approach is the asymptotic one, which involves chi-squared distributions, but in many cases, especially when the table is sparse, the chi-squared approximation may not be adequate (for further details on this topic see, for example, Appendix IV of Fienberg (1980)). In the exact framework, the p-value is defined as the probability of the tables in $F_t$ having a value of the test statistic $C$ greater than or equal to that of the observed table. A first example of exact computation is Fisher's exact test for $2 \times 2$ tables, which computes the exact p-value using the move

$$m = \begin{pmatrix} +1 & -1 \\ -1 & +1 \end{pmatrix}$$

in order to obtain all the tables with fixed marginal totals. Note that in general it is difficult to find the number of tables in $F_t$ and to have a complete list of all tables in $F_t$. We can obtain approximations of test statistics via Monte Carlo methods, drawing an i.i.d. hypergeometric sample of contingency tables in $F_t$.
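To make the exact approach concrete, here is a minimal Python sketch (the tutorial's own programs are in Matlab and CoCoA; this is an illustrative re-implementation) of the exact one-tail test for a $2 \times 2$ table: it enumerates the whole fibre $F_t$, weights each table by the hypergeometric distribution (7), and accumulates the weight of the tables whose Pearson statistic (8) is at least the observed one. The table `[3, 7, 8, 2]` is a hypothetical example, not data from the tutorial.

```python
from math import factorial

def pearson_C(f_obs, f_hat):
    # Pearson's statistic (8): sum over cells of (observed - fitted)^2 / fitted
    return sum((o - e) ** 2 / e for o, e in zip(f_obs, f_hat))

def exact_p_value_2x2(table):
    # Exact one-tail p-value for independence in a 2x2 table: enumerate the
    # fibre F_t and weight each table by H_t(f), which by (7) is
    # proportional to prod_x 1/f(x)! (the factor N!/#Y_t cancels).
    a, b, c, d = table            # row-major cells (1,1), (1,2), (2,1), (2,2)
    r1, r2 = a + b, c + d         # row sums
    c1, c2 = a + c, b + d         # column sums
    N = r1 + r2
    f_hat = [r1 * c1 / N, r1 * c2 / N, r2 * c1 / N, r2 * c2 / N]  # MLE of counts
    C_obs = pearson_C(table, f_hat)
    num = den = 0.0
    # with fixed margins a 2x2 table has a single free cell x = f(1,1)
    for x in range(max(0, c1 - r2), min(r1, c1) + 1):
        f = [x, r1 - x, c1 - x, r2 - c1 + x]
        w = 1.0
        for cell in f:
            w /= factorial(cell)
        den += w
        if pearson_C(f, f_hat) >= C_obs - 1e-12:
            num += w
    return num / den

p = exact_p_value_2x2([3, 7, 8, 2])  # hypothetical observed table
```

Since every table in the fibre satisfies $C \ge 0$, a table with $C_{obs} = 0$ (perfect fit) always yields a p-value of 1.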
The problem is then reduced to sampling from the hypergeometric distribution on $F_t$, or equivalently from the uniform distribution on $Y_t$. The literature suggests avoiding the enumeration problem via the use of Markov Chain Monte Carlo (MCMC) methods. In particular we are interested in the Metropolis-Hastings algorithm, which rests on a set of moves for constructing the relevant Markov chain. A review of the Metropolis-Hastings algorithm can be found in Chib & Greenberg (1995).

Definition 1.1 A Markov basis of $F_t$ is a set of functions $m_1, \ldots, m_L : \mathcal{X} \to \mathbb{Z}$, called moves, such that for any $1 \le i \le L$

$$\sum_{x \in \mathcal{X}} m_i(x) T(x) = 0, \tag{9}$$

where $T$ is the sufficient statistic, and for any $f, f' \in F_t$ there exist a sequence of moves $(m_{i_1}, \ldots, m_{i_A})$ and a sequence $(\epsilon_j)_{j=1}^A$ with $\epsilon_j = \pm 1$ such that

$$f' = f + \sum_{j=1}^A \epsilon_j m_{i_j} \tag{10}$$

and

$$f + \sum_{j=1}^a \epsilon_j m_{i_j} \ge 0 \quad \text{for all } 1 \le a \le A. \tag{11}$$

Condition (9) implies that a move is a table with integer (even negative) entries and such that the value of the sufficient statistic is constant for every table obtained with moves in $\{m_1, \ldots, m_L\}$. Note once again the importance of the linearity condition for the sufficient statistic $T$. In particular, if the sufficient statistics are the margins, then every move is a table with null margins. For example, if we consider the marginal totals as sufficient statistic for the $3 \times 3$ tables, a move is

$$m = \begin{pmatrix} -1 & +1 & 0 \\ +1 & -2 & +1 \\ 0 & +1 & -1 \end{pmatrix}. \tag{12}$$

From this definition it is clear that the Markov basis is the main tool to define a random-walk-like Markov chain on $F_t$. It is well known that a connected, reversible and aperiodic Markov chain converges to its stationary distribution. In practice, to obtain a sample from the distribution of interest $\sigma(f)$ on $F_t$, the Markov chain is performed as follows:

(a) at time 0 the chain is in $f$;
(b) choose a move $m$ uniformly in the Markov basis and $\epsilon = \pm 1$ with probability $1/2$ each, independently of $m$;
(c) if $f + \epsilon m \ge 0$ then move the chain from $f$ to $f + \epsilon m$ with probability $\min\{\sigma(f + \epsilon m)/\sigma(f), 1\}$; in all other cases, stay at $f$.

Let us recall here the basic convergence theorem for the Metropolis-Hastings algorithm working in our framework.

Theorem 1.2 Let $\sigma(f)$ be a positive function on $F_t$. Given a Markov basis $\{m_1, \ldots, m_L\}$, the Markov chain generated following the described algorithm is connected, reversible and aperiodic on $F_t$, with stationary distribution proportional to $\sigma(f)$.
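Steps (a)-(c) can be sketched in a few lines of Python (the tutorial itself uses Matlab; this is an illustrative re-implementation with the hypergeometric distribution (7) as the target $\sigma$, so that $\sigma(f + \epsilon m)/\sigma(f)$ reduces to a ratio of factorials). The starting table and the single move are hypothetical.

```python
import random
from math import factorial

def mh_hypergeometric(f0, moves, n_steps, seed=0):
    # Steps (a)-(c) with sigma(f) proportional to prod_x 1/f(x)!, the
    # hypergeometric (7); the acceptance ratio is prod f(x)!/(f(x)+eps*m(x))!.
    rng = random.Random(seed)
    f = list(f0)
    samples = []
    for _ in range(n_steps):
        m = rng.choice(moves)            # (b) move chosen uniformly
        eps = rng.choice([1, -1])        # (b) sign, probability 1/2 each
        g = [fi + eps * mi for fi, mi in zip(f, m)]
        if all(gi >= 0 for gi in g):     # (c) proposal must stay non-negative
            ratio = 1.0
            for fi, gi in zip(f, g):
                ratio *= factorial(fi) / factorial(gi)
            if rng.random() < min(ratio, 1.0):
                f = g                    # (c) Metropolis-Hastings acceptance
        samples.append(tuple(f))
    return samples

# hypothetical 2x2 table as a vector (cells 11, 12, 21, 22) and the basic move
chain = mh_hypergeometric([3, 7, 8, 2], [[1, -1, -1, 1]], 1000)
```

Since every accepted move satisfies (9), every state of the chain has the same margins as the starting table, which is easy to check on the output.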

In order to navigate the set $F_t$ and to define an MCMC algorithm for the computation of the p-value of the goodness-of-fit tests, we use the theory of toric ideals; see Bigatti & Robbiano (2001) for details. We associate an indeterminate $\xi_i$ to every sample point and an indeterminate $y_j$ to every component of the sufficient statistic, and we define the polynomial rings over a field $K$ with such indeterminates. Consider a model with sufficient statistic $T$ and a Markov chain with moves in a set $M$. The correspondence between moves and polynomials is given by the following rule. Decompose a move $m$ into its positive and negative parts $m = m^+ - m^-$ and define the binomial $b_m = \xi^{m^+} - \xi^{m^-}$. For example, the previous move (12) corresponds to the binomial

$$b = \xi_{12} \xi_{21} \xi_{23} \xi_{32} - \xi_{11} \xi_{22}^2 \xi_{33} \tag{13}$$

in the polynomial ring $K[\xi_{11}, \ldots, \xi_{33}]$. Define the two ideals

$$I_T = \mathrm{Ideal}(\xi^a - \xi^b : T(a) = T(b)) \tag{14}$$

$$I_M = \mathrm{Ideal}(\xi^{m_i^+} - \xi^{m_i^-} : i = 1, \ldots, L). \tag{15}$$

The ideal $I_T$ is a toric ideal. The main result for finding Markov bases (i.e., for showing the connectedness of the Markov chain) using Commutative Algebra, presented in Diaconis & Sturmfels (1998), is the following.

Theorem 1.3 The Markov chain with moves in $M$ is connected if and only if $I_M = I_T$.

Thus, in order to find the relevant Markov bases, we compute $I_T$, the toric ideal associated to the sufficient statistic $T$, and we impose $I_M = I_T$.

Elimination-based algorithm

A simple algorithm to compute the Gröbner basis of the toric ideal $I = I(\tau_1, \ldots, \tau_r)$ is based on the elimination algorithm, as follows. We first consider the homomorphism

$$\pi : K[\xi_1, \ldots, \xi_r] \to K[y_1, \ldots, y_s] \tag{16}$$

defined by $\xi_i \mapsto \tau_i$ for all $i$; then we consider the ideal $J$ in the polynomial ring $K[\xi_1, \ldots, \xi_r, y_1, \ldots, y_s]$ generated by the set of polynomials

$$\{\xi_1 - \tau_1, \ldots, \xi_r - \tau_r\}. \tag{17}$$

Using elimination theory, we have

$$I = J \cap K[\xi_1, \ldots, \xi_r]. \tag{18}$$

A (reduced) Gröbner basis of $I$ is obtained by computing a (reduced) Gröbner basis of $J$ with respect to a term-ordering of elimination for $y_1, \ldots, y_s$ and taking only the polynomials not involving the $y$'s.
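Theorem 1.3 can also be checked by brute force on small examples: for a $2 \times 3$ table with fixed margins we can build the whole fibre $F_t$ and test by breadth-first search whether a candidate set of moves connects it. Below is a Python sketch (the Gröbner basis computation itself is left to CoCoA, as in the text; the margins $(4,5)$ and $(3,3,3)$ are hypothetical). The candidate moves are the $\pm 1$ moves on the $2 \times 2$ minors, which anticipate Exercise 1c.

```python
from itertools import combinations, product
from collections import deque

def minor_moves(I, J):
    # candidate Markov basis of the independence model: +1/-1 on the cells of
    # each 2x2 minor; tables are stored as row-major vectors
    moves = []
    for i1, i2 in combinations(range(I), 2):
        for j1, j2 in combinations(range(J), 2):
            m = [0] * (I * J)
            m[i1 * J + j1] = m[i2 * J + j2] = 1
            m[i1 * J + j2] = m[i2 * J + j1] = -1
            moves.append(m)
    return moves

def fibre_2x3(row_sums, col_sums):
    # enumerate F_t: all non-negative integer 2x3 tables with given margins
    r1, r2 = row_sums
    tables = []
    for a, b in product(range(r1 + 1), repeat=2):
        c = r1 - a - b
        if c < 0:
            continue
        d, e, f = col_sums[0] - a, col_sums[1] - b, col_sums[2] - c
        if min(d, e, f) >= 0:
            tables.append((a, b, c, d, e, f))
    return tables

def is_connected(tables, moves):
    # breadth-first search on the fibre graph whose edges are +/- the moves;
    # by Theorem 1.3 connectedness of every fibre is equivalent to I_M = I_T
    table_set = set(tables)
    seen = {tables[0]}
    queue = deque([tables[0]])
    while queue:
        f = queue.popleft()
        for m in moves:
            for eps in (1, -1):
                g = tuple(fi + eps * mi for fi, mi in zip(f, m))
                if g in table_set and g not in seen:
                    seen.add(g)
                    queue.append(g)
    return len(seen) == len(tables)

moves = minor_moves(2, 3)
connected = is_connected(fibre_2x3((4, 5), (3, 3, 3)), moves)
```

A single minor move is not enough: a move touching only two columns can never change the third column of the table, so the chain it generates is disconnected on this fibre.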

Saturation-based algorithm

Another method can be found in Bigatti et al. (1999). This method is based on the theory of saturation, and it is more efficient from the computational point of view. As the sufficient statistic is a linear map $T : \mathbb{N}^r \to \mathbb{N}^s$, we can write its matrix representation $A_T$. The saturation algorithm follows these steps.

- Compute a basis $v_1, \ldots, v_q$ of the kernel of $T$ as a vector space homomorphism. As the elements of $A_T$ are integer, such a basis can be chosen with integer elements.
- Write $v_i = v_i^+ - v_i^-$ for $i = 1, \ldots, q$ and define the binomials $b_i = \xi^{v_i^+} - \xi^{v_i^-}$ in the polynomial ring $K[\xi]$. Define the ideal $I' = \mathrm{Ideal}(b_1, \ldots, b_q)$.
- Saturate the ideal $I'$ with respect to the polynomial $\psi = \prod_{i \in \mathcal{X}} \xi_i$, the product of all the $\xi$ indeterminates, i.e., compute the ideal

$$I = \mathrm{Elim}(v, I' + \mathrm{Ideal}(\psi v - 1)). \tag{19}$$

The ideal $I$ is the toric ideal associated to $T$.

The function Toric of CoCoA (Capani et al. (2000)) uses this algorithm and computes a basis of the toric ideal starting from the matrix $A_T$.

Note that the theory of Markov bases does not need the notion of Gröbner basis: it is sufficient to consider the ideals and sets of generators. Gröbner bases will be used below as a computational tool. Moreover, the Markov basis does not depend on the observed value $t$ of the sufficient statistic $T$, but only on its functional form.

Remark 1.4 In the framework of log-linear models we have models of the form

$$\log m_i = \sum_{j=1}^s \lambda_j^{(i)} \tag{20}$$

where $m_i$ is the mean of the $i$-th cell and the $\lambda$'s are real parameters. For example, the independence model for two random variables $X$ and $Y$ with $I$ and $J$ levels respectively is

$$\log m_{ij} = \lambda + \lambda_i^{(X)} + \lambda_j^{(Y)}, \qquad i = 1, \ldots, I, \quad j = 1, \ldots, J. \tag{21}$$

In terms of cell probabilities,

$$p_i = N^{-1} m_i = N^{-1} \exp\Big(\sum_{j=1}^s \lambda_j^{(i)}\Big) = \prod_{j=1}^s \zeta_j^{(i)}. \tag{22}$$

In this way, a statistical model is defined through relations among the $p_i$'s given in implicit form via a set of power products.

2 Exercises

The MCMC step of the algorithm is a classical one, and you can use the Matlab programs simul and chisq.
Note that in these programs we use a vector representation for the contingency tables, not a matrix representation. This choice is very useful for multi-way tables.
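The same vector representation makes the linear algebra of the saturation algorithm easy to write down. Here is a Python sketch of its first step, computing an integer basis of $\ker A_T$ by exact rational elimination and clearing denominators, for the $A_T$ of the $2 \times 2$ independence model (this illustrates only the kernel step; the saturation itself is what CoCoA's Toric performs):

```python
from fractions import Fraction
from math import gcd
from functools import reduce

def integer_kernel_basis(A):
    # Exact Gauss-Jordan elimination over the rationals, then clearing of
    # denominators: a basis of ker(A) with integer entries, as required by
    # the first step of the saturation algorithm.
    rows = [[Fraction(x) for x in row] for row in A]
    ncols = len(A[0])
    pivots = []  # columns containing a pivot
    r = 0
    for col in range(ncols):
        piv = next((i for i in range(r, len(rows)) if rows[i][col] != 0), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        pivval = rows[r][col]
        rows[r] = [x / pivval for x in rows[r]]
        for i in range(len(rows)):
            if i != r and rows[i][col] != 0:
                factor = rows[i][col]
                rows[i] = [x - factor * y for x, y in zip(rows[i], rows[r])]
        pivots.append(col)
        r += 1
    basis = []
    for fcol in (c for c in range(ncols) if c not in pivots):
        v = [Fraction(0)] * ncols
        v[fcol] = Fraction(1)
        for i, pcol in enumerate(pivots):
            v[pcol] = -rows[i][fcol]
        lcm = reduce(lambda a, b: a * b // gcd(a, b), (x.denominator for x in v))
        basis.append([int(x * lcm) for x in v])
    return basis

# A_T of the 2x2 independence model: rows are the two row sums and the two
# column sums, columns are the cells (11, 12, 21, 22) in vector form
A_T = [[1, 1, 0, 0],
       [0, 0, 1, 1],
       [1, 0, 1, 0],
       [0, 1, 0, 1]]
kernel = integer_kernel_basis(A_T)
```

Here the kernel is one-dimensional and its generator is exactly the basic $2 \times 2$ move of Fisher's exact test, in vector form.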

Moreover, in order to transform a list of polynomials into a matrix, you can use the CoCoA function ToricToMat, possibly with some further ASCII manipulation to match the Matlab input requirements. The log-linear models presented here are extensively discussed in Rapallo (2002a) and Rapallo (2002b), both from the statistical and the algebraic point of view.

Exercise 1 The independence model for two-way tables has the form

$$\log m_{ij} = \lambda + \lambda_i^{(X)} + \lambda_j^{(Y)} \tag{23}$$

for $i = 1, \ldots, I$ and $j = 1, \ldots, J$. The components of the sufficient statistic are the row sums and the column sums of the table. Consider a $2 \times 3$ table.

a) Write the power products defining the model;
b) using the elimination algorithm, compute the Markov basis;
c) observe that the Markov basis is a well-known algebraic object, i.e., it is the set of moves with $+1$ and $-1$ on the cells of any $2 \times 2$ minor of the table and $0$ otherwise (possibly modulo the sign);
d) write the matrix representation of the sufficient statistic and compute the same Markov basis with the function Toric;
e) use the Markov basis in the numerical algorithm in order to test the independence model for the table

$$f_{obs} = \begin{pmatrix} 3 & 0 & \cdots \\ 4 & 1 & \cdots \end{pmatrix} \tag{24}$$

with maximum likelihood estimate

$$\hat{f} = \cdots \tag{25}$$

Exercise 2 Consider now the independence model, but for a $3 \times 3$ table with a structural zero. A structural zero is a cell with a priori zero probability. Suppose that the structural zero is the $(1,1)$ cell. The log-linear form of the model is again

$$\log m_{ij} = \lambda + \lambda_i^{(X)} + \lambda_j^{(Y)} \tag{26}$$

but for $(i,j) \ne (1,1)$. The components of the sufficient statistic are the row sums and the column sums of the table.

a) Write the power products defining the model;
b) using the elimination algorithm, compute the Markov basis;
c) write the matrix representation of the sufficient statistic and compute the same Markov basis with the function Toric;
d) repeat c) considering all the diagonal elements as structural zeros.

Exercise 3 The quasi-independence model is a log-linear model for square tables which assumes independence except for the diagonal cells, which are fitted exactly. The log-linear form of the model is

$$\log m_{ij} = \lambda + \lambda_i^{(X)} + \lambda_j^{(Y)} + \delta_i I_{\{i=j\}} \tag{27}$$

for $i = 1, \ldots, I$ and $j = 1, \ldots, I$. The components of the sufficient statistic are the row sums, the column sums and the diagonal counts.

a) Compute the Markov basis for the quasi-independence model for the $3 \times 3$ tables and for the $4 \times 4$ tables (use the saturation algorithm);
b) use the $4 \times 4$ Markov basis in the numerical algorithm in order to test the quasi-independence model for the table

$$f_{obs} = \cdots \tag{28}$$

with maximum likelihood estimate

$$\hat{f} = \cdots \tag{29}$$

c) use the $3 \times 3$ Markov basis to compute the cardinality of the reference set $F_t$ for the table

$$f_{obs} = \cdots \tag{30}$$
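For point c), when the table is small the cardinality of $F_t$ can also be computed by brute-force enumeration. A Python sketch for the $3 \times 3$ quasi-independence statistic (the margins in the example call are hypothetical; with unit margins and a zero diagonal the fibre consists exactly of the $3 \times 3$ derangement matrices):

```python
from itertools import product

def count_fibre_qi(row_sums, col_sums, diag):
    # brute-force cardinality of the reference set F_t for the 3x3
    # quasi-independence model: non-negative integer tables with the given
    # row sums, column sums and (exactly fitted) diagonal counts
    n = len(row_sums)
    count = 0
    for flat in product(range(max(row_sums) + 1), repeat=n * n):
        tab = [flat[i * n:(i + 1) * n] for i in range(n)]
        if any(tab[i][i] != diag[i] for i in range(n)):
            continue
        if any(sum(tab[i]) != row_sums[i] for i in range(n)):
            continue
        if any(sum(tab[i][j] for i in range(n)) != col_sums[j] for j in range(n)):
            continue
        count += 1
    return count

# with unit margins and a zero diagonal the only members of F_t are the
# two 3x3 derangement matrices
n_tables = count_fibre_qi([1, 1, 1], [1, 1, 1], [0, 0, 0])
```

This exhaustive count is feasible only for very small margins; for realistic tables the MCMC approach of Section 1 is the practical tool.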

Exercise 4 The quasi-symmetry model is a log-linear model for square tables which assumes the symmetry of the two variables, but does not require marginal homogeneity. The log-linear form of the model is

$$\log m_{ij} = \lambda + \lambda_i^{(X)} + \lambda_j^{(Y)} + \lambda_{ij}^{(XY)} \tag{31}$$

for $i = 1, \ldots, I$ and $j = 1, \ldots, I$, with the constraints $\lambda_{ij}^{(XY)} = \lambda_{ji}^{(XY)}$ for all $i, j = 1, \ldots, I$. The components of the sufficient statistic are the row sums, the column sums and the diagonally-opposite cell sums.

a) Compute the Markov basis for the quasi-symmetry model for the $3 \times 3$ tables and for the $4 \times 4$ tables (use the saturation algorithm);
b) use the $4 \times 4$ Markov basis in the numerical algorithm in order to test the quasi-symmetry model for the table

$$f_{obs} = \cdots \tag{32}$$

with maximum likelihood estimate

$$\hat{f} = \cdots \tag{33}$$

c) compare the Markov bases for the quasi-independence model and for the quasi-symmetry model in the $4 \times 4$ case and in the $3 \times 3$ case.

Exercise 5 Consider now a multidimensional example: the complete independence model for three variables with three levels each.

a) Write the log-linear form of the model;
b) compute the Markov basis for this model, using the saturation algorithm;
c) compute a Gröbner basis of the toric ideal and a minimal set of generators (with MinGens), and compare the results;
d) try to compute the Markov basis using the elimination algorithm.
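Returning to Exercise 4: condition (9) for a candidate quasi-symmetry move can be verified directly against the sufficient statistic. A Python sketch checking that a $3 \times 3$ cyclic move annihilates the row sums, the column sums and the symmetric-pair sums (the move shown is a hand-picked candidate written here for illustration, not the tutorial's computed basis):

```python
def qs_statistic(vec):
    # sufficient statistic of the 3x3 quasi-symmetry model for a table stored
    # as a row-major vector: row sums, column sums, symmetric-pair sums
    n = 3
    tab = [vec[i * n:(i + 1) * n] for i in range(n)]
    rows = [sum(tab[i]) for i in range(n)]
    cols = [sum(tab[i][j] for i in range(n)) for j in range(n)]
    pairs = [tab[i][j] + tab[j][i] for i in range(n) for j in range(i + 1, n)]
    return rows + cols + pairs

# candidate move: +1 on the cycle (1,2) -> (2,3) -> (3,1) and -1 on the
# transposed cells; condition (9) requires qs_statistic(m) to vanish
m = [0, 1, -1,
     -1, 0, 1,
     1, -1, 0]
stat = qs_statistic(m)
```

The same check, with the diagonal counts appended to the statistic, also covers the quasi-independence model, which is one way to start the comparison asked for in point c).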

MCMC function

function simul=simul(tab,mle,m,N,bis,step)
% tab  = the observed table (as a row vector);
% mle  = maximum likelihood estimate of the cell counts;
% m    = the matrix representation of the moves (one move per row);
% N    = number of MCMC replicates;
% bis  = number of burn-in steps;
% step = thinning step for the reduction of the correlation;
nc=length(tab);
p=0;
c=zeros(N,1);
chisqref=chisq(tab,mle);
mm=-m;
m=[m; mm];                          % each move is used with both signs
[nmoves,ncm]=size(m);
numit=N*step+bis;
for i=1:numit
    r=ceil(nmoves*rand(1));         % choose a move uniformly at random
    tabp=tab+m(r,:);
    if tabp>=zeros(1,nc)            % proposal must be non-negative
        mhr=1;                      % hypergeometric Metropolis-Hastings ratio
        for j=1:nc
            if m(r,j)~=0
                mhr=mhr*prod(1:tab(j))/prod(1:tabp(j));
            end
        end
        alpha=rand(1);
        if mhr>=alpha
            tab=tabp;               % accept the proposed table
        end
    end
    if (rem(i,step)==0) && (i>bis)  % record a thinned post-burn-in sample
        c((i-bis)/step)=chisq(tab,mle);
        if c((i-bis)/step)>=chisqref
            p=p+1;                  % table at least as extreme as the observed one
        end
    end
end
p=p/N;                              % Monte Carlo estimate of the exact p-value
simul=p;

References

Agresti, A. (2002). Categorical Data Analysis. New York: Wiley, 2nd ed.

Bigatti, A., La Scala, R. & Robbiano, L. (1999). Computing toric ideals. J. Symb. Comput. 27.

Bigatti, A. & Robbiano, L. (2001). Toric ideals. Mat. Contemp. 21.

Capani, A., Niesi, G. & Robbiano, L. (2000). CoCoA, a system for doing Computations in Commutative Algebra. Available via anonymous ftp from cocoa.dima.unige.it, 4th ed.

Chib, S. & Greenberg, E. (1995). Understanding the Metropolis-Hastings algorithm. Amer. Statist. 49.

Diaconis, P. & Sturmfels, B. (1998). Algebraic algorithms for sampling from conditional distributions. Ann. Statist. 26.

Fienberg, S. (1980). The Analysis of Cross-Classified Categorical Data. Cambridge: MIT Press.

Kreuzer, M. & Robbiano, L. (2000). Computational Commutative Algebra 1. New York: Springer.

Pistone, G., Riccomagno, E. & Wynn, H. P. (2001a). Algebraic Statistics: Computational Commutative Algebra in Statistics. Boca Raton: Chapman&Hall/CRC.

Pistone, G., Riccomagno, E. & Wynn, H. P. (2001b). Computational commutative algebra in discrete statistics. In Algebraic Methods in Statistics and Probability, M. A. G. Viana & D. S. P. Richards, eds., vol. 287 of Contemporary Mathematics. American Mathematical Society.

Rapallo, F. (2002a). Algebraic Markov bases and MCMC for two-way contingency tables. Scand. J. Statist. In press.

Rapallo, F. (2002b). Exact algebraic inference for rater agreement models. Preprint 467, Università di Genova, Genova. Submitted.


More information

j) = 1 (note sigma notation) ii. Continuous random variable (e.g. Normal distribution) 1. density function: f ( x) 0 and f ( x) dx = 1

j) = 1 (note sigma notation) ii. Continuous random variable (e.g. Normal distribution) 1. density function: f ( x) 0 and f ( x) dx = 1 Random varables Measure of central tendences and varablty (means and varances) Jont densty functons and ndependence Measures of assocaton (covarance and correlaton) Interestng result Condtonal dstrbutons

More information

Finding Dense Subgraphs in G(n, 1/2)

Finding Dense Subgraphs in G(n, 1/2) Fndng Dense Subgraphs n Gn, 1/ Atsh Das Sarma 1, Amt Deshpande, and Rav Kannan 1 Georga Insttute of Technology,atsh@cc.gatech.edu Mcrosoft Research-Bangalore,amtdesh,annan@mcrosoft.com Abstract. Fndng

More information

Min Cut, Fast Cut, Polynomial Identities

Min Cut, Fast Cut, Polynomial Identities Randomzed Algorthms, Summer 016 Mn Cut, Fast Cut, Polynomal Identtes Instructor: Thomas Kesselhem and Kurt Mehlhorn 1 Mn Cuts n Graphs Lecture (5 pages) Throughout ths secton, G = (V, E) s a mult-graph.

More information

ANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U)

ANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U) Econ 413 Exam 13 H ANSWERS Settet er nndelt 9 deloppgaver, A,B,C, som alle anbefales å telle lkt for å gøre det ltt lettere å stå. Svar er gtt . Unfortunately, there s a prntng error n the hnt of

More information

8/25/17. Data Modeling. Data Modeling. Data Modeling. Patrice Koehl Department of Biological Sciences National University of Singapore

8/25/17. Data Modeling. Data Modeling. Data Modeling. Patrice Koehl Department of Biological Sciences National University of Singapore 8/5/17 Data Modelng Patrce Koehl Department of Bologcal Scences atonal Unversty of Sngapore http://www.cs.ucdavs.edu/~koehl/teachng/bl59 koehl@cs.ucdavs.edu Data Modelng Ø Data Modelng: least squares Ø

More information

Probabilistic Graphical Models

Probabilistic Graphical Models School of Computer Scence robablstc Graphcal Models Appromate Inference: Markov Chan Monte Carlo 05 07 Erc Xng Lecture 7 March 9 04 X X 075 05 05 03 X 3 Erc Xng @ CMU 005-04 Recap of Monte Carlo Monte

More information

Module 3 LOSSY IMAGE COMPRESSION SYSTEMS. Version 2 ECE IIT, Kharagpur

Module 3 LOSSY IMAGE COMPRESSION SYSTEMS. Version 2 ECE IIT, Kharagpur Module 3 LOSSY IMAGE COMPRESSION SYSTEMS Verson ECE IIT, Kharagpur Lesson 6 Theory of Quantzaton Verson ECE IIT, Kharagpur Instructonal Objectves At the end of ths lesson, the students should be able to:

More information

Here is the rationale: If X and y have a strong positive relationship to one another, then ( x x) will tend to be positive when ( y y)

Here is the rationale: If X and y have a strong positive relationship to one another, then ( x x) will tend to be positive when ( y y) Secton 1.5 Correlaton In the prevous sectons, we looked at regresson and the value r was a measurement of how much of the varaton n y can be attrbuted to the lnear relatonshp between y and x. In ths secton,

More information

Kernel Methods and SVMs Extension

Kernel Methods and SVMs Extension Kernel Methods and SVMs Extenson The purpose of ths document s to revew materal covered n Machne Learnng 1 Supervsed Learnng regardng support vector machnes (SVMs). Ths document also provdes a general

More information

5 The Rational Canonical Form

5 The Rational Canonical Form 5 The Ratonal Canoncal Form Here p s a monc rreducble factor of the mnmum polynomal m T and s not necessarly of degree one Let F p denote the feld constructed earler n the course, consstng of all matrces

More information

Computing Correlated Equilibria in Multi-Player Games

Computing Correlated Equilibria in Multi-Player Games Computng Correlated Equlbra n Mult-Player Games Chrstos H. Papadmtrou Presented by Zhanxang Huang December 7th, 2005 1 The Author Dr. Chrstos H. Papadmtrou CS professor at UC Berkley (taught at Harvard,

More information

Ballot Paths Avoiding Depth Zero Patterns

Ballot Paths Avoiding Depth Zero Patterns Ballot Paths Avodng Depth Zero Patterns Henrch Nederhausen and Shaun Sullvan Florda Atlantc Unversty, Boca Raton, Florda nederha@fauedu, ssull21@fauedu 1 Introducton In a paper by Sapounaks, Tasoulas,

More information

Zeros and Zero Dynamics for Linear, Time-delay System

Zeros and Zero Dynamics for Linear, Time-delay System UNIVERSITA POLITECNICA DELLE MARCHE - FACOLTA DI INGEGNERIA Dpartmento d Ingegnerua Informatca, Gestonale e dell Automazone LabMACS Laboratory of Modelng, Analyss and Control of Dynamcal System Zeros and

More information

Probability Theory (revisited)

Probability Theory (revisited) Probablty Theory (revsted) Summary Probablty v.s. plausblty Random varables Smulaton of Random Experments Challenge The alarm of a shop rang. Soon afterwards, a man was seen runnng n the street, persecuted

More information

12 MATH 101A: ALGEBRA I, PART C: MULTILINEAR ALGEBRA. 4. Tensor product

12 MATH 101A: ALGEBRA I, PART C: MULTILINEAR ALGEBRA. 4. Tensor product 12 MATH 101A: ALGEBRA I, PART C: MULTILINEAR ALGEBRA Here s an outlne of what I dd: (1) categorcal defnton (2) constructon (3) lst of basc propertes (4) dstrbutve property (5) rght exactness (6) localzaton

More information

1 Matrix representations of canonical matrices

1 Matrix representations of canonical matrices 1 Matrx representatons of canoncal matrces 2-d rotaton around the orgn: ( ) cos θ sn θ R 0 = sn θ cos θ 3-d rotaton around the x-axs: R x = 1 0 0 0 cos θ sn θ 0 sn θ cos θ 3-d rotaton around the y-axs:

More information

Polynomial Regression Models

Polynomial Regression Models LINEAR REGRESSION ANALYSIS MODULE XII Lecture - 6 Polynomal Regresson Models Dr. Shalabh Department of Mathematcs and Statstcs Indan Insttute of Technology Kanpur Test of sgnfcance To test the sgnfcance

More information

The Multiple Classical Linear Regression Model (CLRM): Specification and Assumptions. 1. Introduction

The Multiple Classical Linear Regression Model (CLRM): Specification and Assumptions. 1. Introduction ECONOMICS 5* -- NOTE (Summary) ECON 5* -- NOTE The Multple Classcal Lnear Regresson Model (CLRM): Specfcaton and Assumptons. Introducton CLRM stands for the Classcal Lnear Regresson Model. The CLRM s also

More information

NOTES FOR QUANTUM GROUPS, CRYSTAL BASES AND REALIZATION OF ŝl(n)-modules

NOTES FOR QUANTUM GROUPS, CRYSTAL BASES AND REALIZATION OF ŝl(n)-modules NOTES FOR QUANTUM GROUPS, CRYSTAL BASES AND REALIZATION OF ŝl(n)-modules EVAN WILSON Quantum groups Consder the Le algebra sl(n), whch s the Le algebra over C of n n trace matrces together wth the commutator

More information

Lecture 10: May 6, 2013

Lecture 10: May 6, 2013 TTIC/CMSC 31150 Mathematcal Toolkt Sprng 013 Madhur Tulsan Lecture 10: May 6, 013 Scrbe: Wenje Luo In today s lecture, we manly talked about random walk on graphs and ntroduce the concept of graph expander,

More information

Estimation: Part 2. Chapter GREG estimation

Estimation: Part 2. Chapter GREG estimation Chapter 9 Estmaton: Part 2 9. GREG estmaton In Chapter 8, we have seen that the regresson estmator s an effcent estmator when there s a lnear relatonshp between y and x. In ths chapter, we generalzed the

More information

Generalized Linear Methods

Generalized Linear Methods Generalzed Lnear Methods 1 Introducton In the Ensemble Methods the general dea s that usng a combnaton of several weak learner one could make a better learner. More formally, assume that we have a set

More information

Tracking with Kalman Filter

Tracking with Kalman Filter Trackng wth Kalman Flter Scott T. Acton Vrgna Image and Vdeo Analyss (VIVA), Charles L. Brown Department of Electrcal and Computer Engneerng Department of Bomedcal Engneerng Unversty of Vrgna, Charlottesvlle,

More information

Google PageRank with Stochastic Matrix

Google PageRank with Stochastic Matrix Google PageRank wth Stochastc Matrx Md. Sharq, Puranjt Sanyal, Samk Mtra (M.Sc. Applcatons of Mathematcs) Dscrete Tme Markov Chan Let S be a countable set (usually S s a subset of Z or Z d or R or R d

More information

INVARIANT STABLY COMPLEX STRUCTURES ON TOPOLOGICAL TORIC MANIFOLDS

INVARIANT STABLY COMPLEX STRUCTURES ON TOPOLOGICAL TORIC MANIFOLDS INVARIANT STABLY COMPLEX STRUCTURES ON TOPOLOGICAL TORIC MANIFOLDS HIROAKI ISHIDA Abstract We show that any (C ) n -nvarant stably complex structure on a topologcal torc manfold of dmenson 2n s ntegrable

More information

MMA and GCMMA two methods for nonlinear optimization

MMA and GCMMA two methods for nonlinear optimization MMA and GCMMA two methods for nonlnear optmzaton Krster Svanberg Optmzaton and Systems Theory, KTH, Stockholm, Sweden. krlle@math.kth.se Ths note descrbes the algorthms used n the author s 2007 mplementatons

More information

Lecture 3. Ax x i a i. i i

Lecture 3. Ax x i a i. i i 18.409 The Behavor of Algorthms n Practce 2/14/2 Lecturer: Dan Spelman Lecture 3 Scrbe: Arvnd Sankar 1 Largest sngular value In order to bound the condton number, we need an upper bound on the largest

More information

A Hybrid Variational Iteration Method for Blasius Equation

A Hybrid Variational Iteration Method for Blasius Equation Avalable at http://pvamu.edu/aam Appl. Appl. Math. ISSN: 1932-9466 Vol. 10, Issue 1 (June 2015), pp. 223-229 Applcatons and Appled Mathematcs: An Internatonal Journal (AAM) A Hybrd Varatonal Iteraton Method

More information

First Year Examination Department of Statistics, University of Florida

First Year Examination Department of Statistics, University of Florida Frst Year Examnaton Department of Statstcs, Unversty of Florda May 7, 010, 8:00 am - 1:00 noon Instructons: 1. You have four hours to answer questons n ths examnaton.. You must show your work to receve

More information

Errata to Invariant Theory with Applications January 28, 2017

Errata to Invariant Theory with Applications January 28, 2017 Invarant Theory wth Applcatons Jan Drasma and Don Gjswjt http: //www.wn.tue.nl/~jdrasma/teachng/nvtheory0910/lecturenotes12.pdf verson of 7 December 2009 Errata and addenda by Darj Grnberg The followng

More information

MATH 829: Introduction to Data Mining and Analysis The EM algorithm (part 2)

MATH 829: Introduction to Data Mining and Analysis The EM algorithm (part 2) 1/16 MATH 829: Introducton to Data Mnng and Analyss The EM algorthm (part 2) Domnque Gullot Departments of Mathematcal Scences Unversty of Delaware Aprl 20, 2016 Recall 2/16 We are gven ndependent observatons

More information

QUASI-LIKELIHOOD APPROACH TO RATER AGREEMENT PLUS LINEAR BY LINEAR ASSOCIATION MODEL FOR ORDINAL CONTINGENCY TABLES

QUASI-LIKELIHOOD APPROACH TO RATER AGREEMENT PLUS LINEAR BY LINEAR ASSOCIATION MODEL FOR ORDINAL CONTINGENCY TABLES Journal of Statstcs: Advances n Theory and Applcatons Volume 6, Number, 26, Pages -5 Avalable at http://scentfcadvances.co.n DOI: http://dx.do.org/.8642/jsata_72683 QUASI-LIKELIHOOD APPROACH TO RATER AGREEMENT

More information

MATH 241B FUNCTIONAL ANALYSIS - NOTES EXAMPLES OF C ALGEBRAS

MATH 241B FUNCTIONAL ANALYSIS - NOTES EXAMPLES OF C ALGEBRAS MATH 241B FUNCTIONAL ANALYSIS - NOTES EXAMPLES OF C ALGEBRAS These are nformal notes whch cover some of the materal whch s not n the course book. The man purpose s to gve a number of nontrval examples

More information

CHAPTER 5 NUMERICAL EVALUATION OF DYNAMIC RESPONSE

CHAPTER 5 NUMERICAL EVALUATION OF DYNAMIC RESPONSE CHAPTER 5 NUMERICAL EVALUATION OF DYNAMIC RESPONSE Analytcal soluton s usually not possble when exctaton vares arbtrarly wth tme or f the system s nonlnear. Such problems can be solved by numercal tmesteppng

More information

10-701/ Machine Learning, Fall 2005 Homework 3

10-701/ Machine Learning, Fall 2005 Homework 3 10-701/15-781 Machne Learnng, Fall 2005 Homework 3 Out: 10/20/05 Due: begnnng of the class 11/01/05 Instructons Contact questons-10701@autonlaborg for queston Problem 1 Regresson and Cross-valdaton [40

More information

Goodness of fit and Wilks theorem

Goodness of fit and Wilks theorem DRAFT 0.0 Glen Cowan 3 June, 2013 Goodness of ft and Wlks theorem Suppose we model data y wth a lkelhood L(µ) that depends on a set of N parameters µ = (µ 1,...,µ N ). Defne the statstc t µ ln L(µ) L(ˆµ),

More information

Why Monte Carlo Integration? Introduction to Monte Carlo Method. Continuous Probability. Continuous Probability

Why Monte Carlo Integration? Introduction to Monte Carlo Method. Continuous Probability. Continuous Probability Introducton to Monte Carlo Method Kad Bouatouch IRISA Emal: kad@rsa.fr Wh Monte Carlo Integraton? To generate realstc lookng mages, we need to solve ntegrals of or hgher dmenson Pel flterng and lens smulaton

More information

Introduction to Algorithms

Introduction to Algorithms Introducton to Algorthms 6.046J/8.40J Lecture 7 Prof. Potr Indyk Data Structures Role of data structures: Encapsulate data Support certan operatons (e.g., INSERT, DELETE, SEARCH) Our focus: effcency of

More information

Maximum Likelihood Estimation of Binary Dependent Variables Models: Probit and Logit. 1. General Formulation of Binary Dependent Variables Models

Maximum Likelihood Estimation of Binary Dependent Variables Models: Probit and Logit. 1. General Formulation of Binary Dependent Variables Models ECO 452 -- OE 4: Probt and Logt Models ECO 452 -- OE 4 Mamum Lkelhood Estmaton of Bnary Dependent Varables Models: Probt and Logt hs note demonstrates how to formulate bnary dependent varables models for

More information

THE SUMMATION NOTATION Ʃ

THE SUMMATION NOTATION Ʃ Sngle Subscrpt otaton THE SUMMATIO OTATIO Ʃ Most of the calculatons we perform n statstcs are repettve operatons on lsts of numbers. For example, we compute the sum of a set of numbers, or the sum of the

More information

A be a probability space. A random vector

A be a probability space. A random vector Statstcs 1: Probablty Theory II 8 1 JOINT AND MARGINAL DISTRIBUTIONS In Probablty Theory I we formulate the concept of a (real) random varable and descrbe the probablstc behavor of ths random varable by

More information

Section 8.3 Polar Form of Complex Numbers

Section 8.3 Polar Form of Complex Numbers 80 Chapter 8 Secton 8 Polar Form of Complex Numbers From prevous classes, you may have encountered magnary numbers the square roots of negatve numbers and, more generally, complex numbers whch are the

More information

Lecture 4: Universal Hash Functions/Streaming Cont d

Lecture 4: Universal Hash Functions/Streaming Cont d CSE 5: Desgn and Analyss of Algorthms I Sprng 06 Lecture 4: Unversal Hash Functons/Streamng Cont d Lecturer: Shayan Oves Gharan Aprl 6th Scrbe: Jacob Schreber Dsclamer: These notes have not been subjected

More information

18.1 Introduction and Recap

18.1 Introduction and Recap CS787: Advanced Algorthms Scrbe: Pryananda Shenoy and Shjn Kong Lecturer: Shuch Chawla Topc: Streamng Algorthmscontnued) Date: 0/26/2007 We contnue talng about streamng algorthms n ths lecture, ncludng

More information

APPENDIX A Some Linear Algebra

APPENDIX A Some Linear Algebra APPENDIX A Some Lnear Algebra The collecton of m, n matrces A.1 Matrces a 1,1,..., a 1,n A = a m,1,..., a m,n wth real elements a,j s denoted by R m,n. If n = 1 then A s called a column vector. Smlarly,

More information

Modeling and Simulation NETW 707

Modeling and Simulation NETW 707 Modelng and Smulaton NETW 707 Lecture 5 Tests for Random Numbers Course Instructor: Dr.-Ing. Magge Mashaly magge.ezzat@guc.edu.eg C3.220 1 Propertes of Random Numbers Random Number Generators (RNGs) must

More information

BOOTSTRAP METHOD FOR TESTING OF EQUALITY OF SEVERAL MEANS. M. Krishna Reddy, B. Naveen Kumar and Y. Ramu

BOOTSTRAP METHOD FOR TESTING OF EQUALITY OF SEVERAL MEANS. M. Krishna Reddy, B. Naveen Kumar and Y. Ramu BOOTSTRAP METHOD FOR TESTING OF EQUALITY OF SEVERAL MEANS M. Krshna Reddy, B. Naveen Kumar and Y. Ramu Department of Statstcs, Osmana Unversty, Hyderabad -500 007, Inda. nanbyrozu@gmal.com, ramu0@gmal.com

More information

Lecture 7: Boltzmann distribution & Thermodynamics of mixing

Lecture 7: Boltzmann distribution & Thermodynamics of mixing Prof. Tbbtt Lecture 7 etworks & Gels Lecture 7: Boltzmann dstrbuton & Thermodynamcs of mxng 1 Suggested readng Prof. Mark W. Tbbtt ETH Zürch 13 März 018 Molecular Drvng Forces Dll and Bromberg: Chapters

More information

U.C. Berkeley CS294: Beyond Worst-Case Analysis Luca Trevisan September 5, 2017

U.C. Berkeley CS294: Beyond Worst-Case Analysis Luca Trevisan September 5, 2017 U.C. Berkeley CS94: Beyond Worst-Case Analyss Handout 4s Luca Trevsan September 5, 07 Summary of Lecture 4 In whch we ntroduce semdefnte programmng and apply t to Max Cut. Semdefnte Programmng Recall that

More information